debian/0000755000000000000000000000000011734462123007171 5ustar debian/radare-gtk.manpages0000644000000000000000000000011711734462123012726 0ustar debian/tmp/usr/share/man/man1/gradare.1 debian/tmp/usr/share/man/man1/radare.1 debian/cddl.txt0000644000000000000000000004324411734462123010647 0ustar COMMON DEVELOPMENT AND DISTRIBUTION LICENSE (CDDL) Version 1.0 1. Definitions. 1.1. "Contributor" means each individual or entity that creates or contributes to the creation of Modifications. 1.2. "Contributor Version" means the combination of the Original Software, prior Modifications used by a Contributor (if any), and the Modifications made by that particular Contributor. 1.3. "Covered Software" means (a) the Original Software, or (b) Modifications, or (c) the combination of files containing Original Software with files containing Modifications, in each case including portions thereof. 1.4. "Executable" means the Covered Software in any form other than Source Code. 1.5. "Initial Developer" means the individual or entity that first makes Original Software available under this License. 1.6. "Larger Work" means a work which combines Covered Software or portions thereof with code not governed by the terms of this License. 1.7. "License" means this document. 1.8. "Licensable" means having the right to grant, to the maximum extent possible, whether at the time of the initial grant or subsequently acquired, any and all of the rights conveyed herein. 1.9. "Modifications" means the Source Code and Executable form of any of the following: A. Any file that results from an addition to, deletion from or modification of the contents of a file containing Original Software or previous Modifications; B. Any new file that contains any part of the Original Software or previous Modification; or C. Any new file that is contributed or otherwise made available under the terms of this License. 1.10. "Original Software" means the Source Code and Executable form of computer software code that is originally released under this License. 1.11. "Patent Claims" means any patent claim(s), now owned or hereafter acquired, including without limitation, method, process, and apparatus claims, in any patent Licensable by grantor. 1.12. "Source Code" means (a) the common form of computer software code in which modifications are made and (b) associated documentation included in or with such code. 1.13. "You" (or "Your") means an individual or a legal entity exercising rights under, and complying with all of the terms of, this License. For legal entities, "You" includes any entity which controls, is controlled by, or is under common control with You. For purposes of this definition, "control" means (a) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (b) ownership of more than fifty percent (50%) of the outstanding shares or beneficial ownership of such entity. 2. License Grants. 2.1. The Initial Developer Grant. 
Conditioned upon Your compliance with Section 3.1 below and subject to third party intellectual property claims, the Initial Developer hereby grants You a world-wide, royalty-free, non-exclusive license: (a) under intellectual property rights (other than patent or trademark) Licensable by Initial Developer, to use, reproduce, modify, display, perform, sublicense and distribute the Original Software (or portions thereof), with or without Modifications, and/or as part of a Larger Work; and (b) under Patent Claims infringed by the making, using or selling of Original Software, to make, have made, use, practice, sell, and offer for sale, and/or otherwise dispose of the Original Software (or portions thereof). (c) The licenses granted in Sections 2.1(a) and (b) are effective on the date Initial Developer first distributes or otherwise makes the Original Software available to a third party under the terms of this License. (d) Notwithstanding Section 2.1(b) above, no patent license is granted: (1) for code that You delete from the Original Software, or (2) for infringements caused by: (i) the modification of the Original Software, or (ii) the combination of the Original Software with other software or devices. 2.2. Contributor Grant. Conditioned upon Your compliance with Section 3.1 below and subject to third party intellectual property claims, each Contributor hereby grants You a world-wide, royalty-free, non-exclusive license: (a) under intellectual property rights (other than patent or trademark) Licensable by Contributor to use, reproduce, modify, display, perform, sublicense and distribute the Modifications created by such Contributor (or portions thereof), either on an unmodified basis, with other Modifications, as Covered Software and/or as part of a Larger Work; and (b) under Patent Claims infringed by the making, using, or selling of Modifications made by that Contributor either alone and/or in combination with its Contributor Version (or portions of such combination), to make, use, sell, offer for sale, have made, and/or otherwise dispose of: (1) Modifications made by that Contributor (or portions thereof); and (2) the combination of Modifications made by that Contributor with its Contributor Version (or portions of such combination). (c) The licenses granted in Sections 2.2(a) and 2.2(b) are effective on the date Contributor first distributes or otherwise makes the Modifications available to a third party. (d) Notwithstanding Section 2.2(b) above, no patent license is granted: (1) for any code that Contributor has deleted from the Contributor Version; (2) for infringements caused by: (i) third party modifications of Contributor Version, or (ii) the combination of Modifications made by that Contributor with other software (except as part of the Contributor Version) or other devices; or (3) under Patent Claims infringed by Covered Software in the absence of Modifications made by that Contributor. 3. Distribution Obligations. 3.1. Availability of Source Code. Any Covered Software that You distribute or otherwise make available in Executable form must also be made available in Source Code form and that Source Code form must be distributed only under the terms of this License. You must include a copy of this License with every copy of the Source Code form of the Covered Software You distribute or otherwise make available. 
You must inform recipients of any such Covered Software in Executable form as to how they can obtain such Covered Software in Source Code form in a reasonable manner on or through a medium customarily used for software exchange. 3.2. Modifications. The Modifications that You create or to which You contribute are governed by the terms of this License. You represent that You believe Your Modifications are Your original creation(s) and/or You have sufficient rights to grant the rights conveyed by this License. 3.3. Required Notices. You must include a notice in each of Your Modifications that identifies You as the Contributor of the Modification. You may not remove or alter any copyright, patent or trademark notices contained within the Covered Software, or any notices of licensing or any descriptive text giving attribution to any Contributor or the Initial Developer. 3.4. Application of Additional Terms. You may not offer or impose any terms on any Covered Software in Source Code form that alters or restricts the applicable version of this License or the recipients' rights hereunder. You may choose to offer, and to charge a fee for, warranty, support, indemnity or liability obligations to one or more recipients of Covered Software. However, you may do so only on Your own behalf, and not on behalf of the Initial Developer or any Contributor. You must make it absolutely clear that any such warranty, support, indemnity or liability obligation is offered by You alone, and You hereby agree to indemnify the Initial Developer and every Contributor for any liability incurred by the Initial Developer or such Contributor as a result of warranty, support, indemnity or liability terms You offer. 3.5. Distribution of Executable Versions. You may distribute the Executable form of the Covered Software under the terms of this License or under the terms of a license of Your choice, which may contain terms different from this License, provided that You are in compliance with the terms of this License and that the license for the Executable form does not attempt to limit or alter the recipient's rights in the Source Code form from the rights set forth in this License. If You distribute the Covered Software in Executable form under a different license, You must make it absolutely clear that any terms which differ from this License are offered by You alone, not by the Initial Developer or Contributor. You hereby agree to indemnify the Initial Developer and every Contributor for any liability incurred by the Initial Developer or such Contributor as a result of any such terms You offer. 3.6. Larger Works. You may create a Larger Work by combining Covered Software with other code not governed by the terms of this License and distribute the Larger Work as a single product. In such a case, You must make sure the requirements of this License are fulfilled for the Covered Software. 4. Versions of the License. 4.1. New Versions. Sun Microsystems, Inc. is the initial license steward and may publish revised and/or new versions of this License from time to time. Each version will be given a distinguishing version number. Except as provided in Section 4.3, no one other than the license steward has the right to modify this License. 4.2. Effect of New Versions. You may always continue to use, distribute or otherwise make the Covered Software available under the terms of the version of the License under which You originally received the Covered Software. 
If the Initial Developer includes a notice in the Original Software prohibiting it from being distributed or otherwise made available under any subsequent version of the License, You must distribute and make the Covered Software available under the terms of the version of the License under which You originally received the Covered Software. Otherwise, You may also choose to use, distribute or otherwise make the Covered Software available under the terms of any subsequent version of the License published by the license steward. 4.3. Modified Versions. When You are an Initial Developer and You want to create a new license for Your Original Software, You may create and use a modified version of this License if You: (a) rename the license and remove any references to the name of the license steward (except to note that the license differs from this License); and (b) otherwise make it clear that the license contains terms which differ from this License. 5. DISCLAIMER OF WARRANTY. COVERED SOFTWARE IS PROVIDED UNDER THIS LICENSE ON AN "AS IS" BASIS, WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, WITHOUT LIMITATION, WARRANTIES THAT THE COVERED SOFTWARE IS FREE OF DEFECTS, MERCHANTABLE, FIT FOR A PARTICULAR PURPOSE OR NON-INFRINGING. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE COVERED SOFTWARE IS WITH YOU. SHOULD ANY COVERED SOFTWARE PROVE DEFECTIVE IN ANY RESPECT, YOU (NOT THE INITIAL DEVELOPER OR ANY OTHER CONTRIBUTOR) ASSUME THE COST OF ANY NECESSARY SERVICING, REPAIR OR CORRECTION. THIS DISCLAIMER OF WARRANTY CONSTITUTES AN ESSENTIAL PART OF THIS LICENSE. NO USE OF ANY COVERED SOFTWARE IS AUTHORIZED HEREUNDER EXCEPT UNDER THIS DISCLAIMER. 6. TERMINATION. 6.1. This License and the rights granted hereunder will terminate automatically if You fail to comply with terms herein and fail to cure such breach within 30 days of becoming aware of the breach. Provisions which, by their nature, must remain in effect beyond the termination of this License shall survive. 6.2. If You assert a patent infringement claim (excluding declaratory judgment actions) against Initial Developer or a Contributor (the Initial Developer or Contributor against whom You assert such claim is referred to as "Participant") alleging that the Participant Software (meaning the Contributor Version where the Participant is a Contributor or the Original Software where the Participant is the Initial Developer) directly or indirectly infringes any patent, then any and all rights granted directly or indirectly to You by such Participant, the Initial Developer (if the Initial Developer is not the Participant) and all Contributors under Sections 2.1 and/or 2.2 of this License shall, upon 60 days notice from Participant terminate prospectively and automatically at the expiration of such 60 day notice period, unless if within such 60 day period You withdraw Your claim with respect to the Participant Software against such Participant either unilaterally or pursuant to a written agreement with Participant. 6.3. If You assert a patent infringement claim against Participant alleging that the Participant Software directly or indirectly infringes any patent where such claim is resolved (such as by license or settlement) prior to the initiation of patent infringement litigation, then the reasonable value of the licenses granted by such Participant under Sections 2.1 or 2.2 shall be taken into account in determining the amount or value of any payment or license. 6.4. 
In the event of termination under Sections 6.1 or 6.2 above, all end user licenses that have been validly granted by You or any distributor hereunder prior to termination (excluding licenses granted to You by any distributor) shall survive termination. 7. LIMITATION OF LIABILITY. UNDER NO CIRCUMSTANCES AND UNDER NO LEGAL THEORY, WHETHER TORT (INCLUDING NEGLIGENCE), CONTRACT, OR OTHERWISE, SHALL YOU, THE INITIAL DEVELOPER, ANY OTHER CONTRIBUTOR, OR ANY DISTRIBUTOR OF COVERED SOFTWARE, OR ANY SUPPLIER OF ANY OF SUCH PARTIES, BE LIABLE TO ANY PERSON FOR ANY INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES OF ANY CHARACTER INCLUDING, WITHOUT LIMITATION, DAMAGES FOR LOSS OF GOODWILL, WORK STOPPAGE, COMPUTER FAILURE OR MALFUNCTION, OR ANY AND ALL OTHER COMMERCIAL DAMAGES OR LOSSES, EVEN IF SUCH PARTY SHALL HAVE BEEN INFORMED OF THE POSSIBILITY OF SUCH DAMAGES. THIS LIMITATION OF LIABILITY SHALL NOT APPLY TO LIABILITY FOR DEATH OR PERSONAL INJURY RESULTING FROM SUCH PARTY'S NEGLIGENCE TO THE EXTENT APPLICABLE LAW PROHIBITS SUCH LIMITATION. SOME JURISDICTIONS DO NOT ALLOW THE EXCLUSION OR LIMITATION OF INCIDENTAL OR CONSEQUENTIAL DAMAGES, SO THIS EXCLUSION AND LIMITATION MAY NOT APPLY TO YOU. 8. U.S. GOVERNMENT END USERS. The Covered Software is a "commercial item," as that term is defined in 48 C.F.R. 2.101 (Oct. 1995), consisting of "commercial computer software" (as that term is defined at 48 C.F.R. § 252.227-7014(a)(1)) and "commercial computer software documentation" as such terms are used in 48 C.F.R. 12.212 Sept. 1995). Consistent with 48 C.F.R. 12.212 and 48 C.F.R. 227.7202-1 through 227.7202-4 (June 1995), all U.S. Government End Users acquire Covered Software with only those rights set forth herein. This U.S. Government Rights clause is in lieu of, and supersedes, any other FAR, DFAR, or other clause or provision that addresses Government rights in computer software under this License. 9. MISCELLANEOUS. This License represents the complete agreement concerning subject matter hereof. If any provision of this License is held to be unenforceable, such provision shall be reformed only to the extent necessary to make it enforceable. This License shall be governed by the law of the jurisdiction specified in a notice contained within the Original Software (except to the extent applicable law, if any, provides otherwise), excluding such jurisdiction's conflict-of-law provisions. Any litigation relating to this License shall be subject to the jurisdiction of the courts located in the jurisdiction and venue specified in a notice contained within the Original Software, with the losing party responsible for costs, including, without limitation, court costs and reasonable attorneys' fees and expenses. The application of the United Nations Convention on Contracts for the International Sale of Goods is expressly excluded. Any law or regulation which provides that the language of a contract shall be construed against the drafter shall not apply to this License. You agree that You alone are responsible for compliance with the United States export administration regulations (and the export control laws and regulation of any other countries) when You use, distribute or otherwise make available any Covered Software. 10. RESPONSIBILITY FOR CLAIMS. 
As between Initial Developer and the Contributors, each party is responsible for claims and damages arising, directly or indirectly, out of its utilization of rights under this License and You agree to work with Initial Developer and Contributors to distribute such responsibility on an equitable basis. Nothing herein is intended or shall be deemed to constitute any admission of liability.

--------

NOTICE PURSUANT TO SECTION 9 OF THE COMMON DEVELOPMENT AND DISTRIBUTION LICENSE (CDDL)

The OpenSolaris code released under the CDDL shall be governed by the laws of the State of California (excluding conflict-of-law provisions). Any litigation relating to this License shall be subject to the jurisdiction of the Federal Courts of the Northern District of California and the state courts of the State of California, with venue lying in Santa Clara County, California.

debian/rules

#!/usr/bin/make -f
# This has to be exported to make some magic below work.
export DH_OPTIONS

build: build-stamp
build-stamp: build-arch build-indep
	touch build-stamp

build-arch: build-nongtk build-gtk
build-indep: build-doc

build-nongtk:
	dh_testdir
	[ ! -f Makefile ] || $(MAKE) clean
	# radare non-GTK edition
	./configure --prefix=/usr --includedir=/usr/include \
		--mandir=/usr/share/man --sysconfdir=/etc \
		--libexecdir=/usr/lib --without-gui --without-nonfree
	$(MAKE)
	# keep a copy of the non-GTK radare binary
	cp src/radare radare

build-gtk:
	dh_testdir
	[ ! -f Makefile ] || $(MAKE) clean
	# radare GTK edition
	./configure --prefix=/usr --includedir=/usr/include \
		--mandir=/usr/share/man --sysconfdir=/etc \
		--libexecdir=/usr/lib --without-nonfree
	$(MAKE)

build-doc:
	# build the documentation (radare book)
	cd doc && $(MAKE) && cd ..

clean:
	dh_testdir
	dh_testroot
	rm -f build-stamp
	# clean up the build tree
	[ ! -f Makefile ] || $(MAKE) clean
	rm -f radare
	dh_clean

install: build-arch
	dh_testdir
	dh_testroot
	dh_prep
	dh_installdirs
	$(MAKE) install DESTDIR=$(CURDIR)/debian/tmp

binary-common:
	dh_testdir
	dh_testroot
	dh_install
	dh_installdocs
	dh_installmenu
	dh_installman
	dh_installchangelogs
	dh_python2 -V $(shell pyversions -vd)
	dh_strip
	dh_compress
	dh_fixperms
	dh_installdeb
	dh_shlibdeps
	dh_gencontrol
	dh_md5sums
	dh_builddeb

binary-indep: build-indep
	$(MAKE) -f debian/rules DH_OPTIONS=-i binary-common

binary-arch: build-arch install
	$(MAKE) -f debian/rules DH_OPTIONS=-s binary-common

binary: binary-indep binary-arch

.PHONY: build clean binary-indep binary-arch binary install

debian/patches/

debian/patches/fix-buffer-overflow.dpatch

Description: Fix buffer overflow in gui/main.c
 This patch fixes a buffer overflow in the GUI; more precisely, in the
 snprintf() call in gui/main.c:gradare_save_project().
Author: SevenMachines
Last-Update: 2010-08-22

--- radare-1.5.2.orig/gui/main.c	2010-08-21 08:32:07.000000000 +0100
+++ radare-1.5.2/gui/main.c	2010-08-21 08:34:27.000000000 +0100
@@ -243,7 +243,7 @@
 void gradare_save_project() {
-	char buf[1024];
+	char buf[4096];
 	if (project_file) {
 		snprintf(buf, 4095, ":Ps %s\n\n", project_file);
 		vte_terminal_feed_child(VTE_TERMINAL(term), buf, strlen(buf));

debian/patches/python-versions.dpatch

Description: Allow building with only python2.6 available
 This patch allows the package to build and install successfully when only
 one of the Python 2.5 or 2.6 versions is installed.
Author: SevenMachines
Last-Update: 2010-08-22

--- radare-1.5.2.orig/src/plug/hack/Makefile	2010-08-21 08:42:36.000000000 +0100
+++ radare-1.5.2/src/plug/hack/Makefile	2010-08-21 08:42:36.000000000 +0100
@@ -65,7 +65,7 @@
 # Try with -llua and -llua5.1 (stupid ubuntu)
 lua.${SO}:
 ifeq ($(HAVE_LANG_LUA),1)
-ifeq ($(HAVE_LIB_PYTHON2_5),1)
+ifneq (,$(filter 1,$(HAVE_LIB_PYTHON2_5) $(HAVE_LIB_PYTHON2_6)))
 ifneq ($(LUA_LIBS),)
 	-${CC} lua.c ${SHARED_CFLAGS} ${CFLAGS} ${LUA_CFLAGS} ${LUA_LIBS} -o lua.so
 endif

debian/patches/compile-bindings.dpatch

#! /bin/sh /usr/share/dpatch/dpatch-run
## compile-bindings.dpatch by
##
## All lines beginning with `## DP:' are a description of the patch.
## DP: compile python and lua bindings correctly

@DPATCH@
diff -urNad radare-1.4~/src/plug/hack/Makefile radare-1.4/src/plug/hack/Makefile
--- radare-1.4~/src/plug/hack/Makefile	2009-06-03 22:37:05.000000000 +0200
+++ radare-1.4/src/plug/hack/Makefile	2009-06-03 22:44:16.000000000 +0200
@@ -4,12 +4,12 @@
 #-${CC} perl.c ${CFLAGS} ${SHARED_CFLAGS} ${PERL_CFLAGS} ${PERL_LDFLAGS} -o perl.${SHARED_EXT}
 PERL_CFLAGS=`perl -MExtUtils::Embed -e ccopts`
 PERL_LIBS=`perl -MExtUtils::Embed -e ldopts` -lncurses
-PY_CFLAGS=-I${PREFIX}/include/python2.5/ -I/usr/include/python2.5/
-PY_LIBS=-lpython2.5
-PY26_CFLAGS=-I${PREFIX}/include/python2.6/ -I/usr/include/python2.6/
-PY26_LIBS=-lpython2.6
-LUA_CFLAGS=-I${PREFIX}/include/lua5.1/ -I/usr/include/lua5.1/
-LUA_LIBS=
+PY_CFLAGS=`python2.5-config --cflags`
+PY_LIBS=`python2.5-config --libs`
+PY26_CFLAGS=`python2.6-config --cflags`
+PY26_LIBS=`python2.6-config --libs`
+LUA_CFLAGS=`pkg-config --cflags lua5.1`
+LUA_LIBS=`pkg-config --libs lua5.1`
 RUBY_CFLAGS=-I/usr/lib/ruby/1.8/i386-linux
 RUBY_LIBS=-lruby18
 SO=${SHARED_EXT}

debian/patches/fix-glib-includes.patch

Description: fix glib includes for glib 2.32+
 Including individual glib headers has been deprecated for several years.
 .
 Starting with glib 2.32 it is now mandatory to include glib.h instead of
 individual headers [1], or the compiler will generate an error.
Author: Sebastian Reichel
Bug-Debian: http://bugs.debian.org/665604
Last-Update: 2012-03-28

diff --git a/gui/dialog.c b/gui/dialog.c
index 6b3b681..d9b16a9 100644
--- a/gui/dialog.c
+++ b/gui/dialog.c
@@ -21,8 +21,7 @@
 #include "main.h"
 #include
 #include
-#include
-#include
+#include
 int dialog_yes_no(const char *question, int two) {
diff --git a/gui/toolbar.c b/gui/toolbar.c
index 3402a9b..f6af0b6 100644
--- a/gui/toolbar.c
+++ b/gui/toolbar.c
@@ -24,8 +24,7 @@
 #include
 #include
 #include
-#include
-#include
+#include
 #include
 #include
 #include

debian/patches/fix-linking.dpatch

# We need to link against libdl in Debian for kfreebsd and hurd
--- a/configure	2010-05-21 11:56:54.000000000 +0200
+++ b/configure	2010-07-18 01:07:12.000000000 +0200
@@ -353,20 +353,10 @@
 fi
 printf "checking for dynamic library... 
" -HAVE_LIB_DL=0 -for OS in linux syllable sunos darwin beos solaris ; do -if [ "${HOST_OS}" = "${OS}" ]; then - HAVE_LIB_DL=1 - break; -fi -done -if [ "${HAVE_LIB_DL}" = 1 ]; then - DL_LIBS="-ldl" - echo "required" -else - DL_LIBS="" - echo "libc" -fi +HAVE_LIB_DL=1 +DL_LIBS="-ldl" +echo "required" + SOLARIS="0" DARWIN="0" if [ "$HOST_OS" = "sunos" ]; then debian/patches/extract-waf0000644000000000000000000121042311734462123012773 0ustar Description: Unpack the embedded waf binary tarball Author: Reinhard Tartler Bug-Debian: http://bugs.debian.org/654497 --- /dev/null +++ radare-1.5.2/wafadmin/__init__.py @@ -0,0 +1,4 @@ +#! /usr/bin/env python +# encoding: utf-8 + + --- /dev/null +++ radare-1.5.2/wafadmin/Options.py @@ -0,0 +1,123 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,sys,imp,types,tempfile +from optparse import OptionParser +import Logs,Utils +from Constants import* +options={} +commands={} +launch_dir='' +tooldir='' +lockfile=os.environ.get('WAFLOCK','.lock-wscript') +try:cache_global=os.path.abspath(os.environ['WAFCACHE']) +except KeyError:cache_global='' +platform=sys.platform +conf_file='conf-runs-%s-%d.pickle'%(sys.platform,ABI) +is_install=False +default_prefix=os.environ.get('PREFIX') +if not default_prefix: + if sys.platform=='win32':default_prefix=tempfile.gettempdir() + else:default_prefix='/usr/local/' +default_jobs=os.environ.get('JOBS',-1) +if default_jobs<1: + try: + default_jobs=os.sysconf('SC_NPROCESSORS_ONLN') + except: + default_jobs=1 +default_destdir=os.environ.get('DESTDIR','') +def create_parser(): + Logs.debug('options: create_parser is called') + parser=OptionParser(conflict_handler="resolve",usage="""waf [options] [commands ...] + +* Main commands: configure build install clean dist distclean uninstall distcheck +* Example: ./waf build -j4""",version='waf %s'%WAFVERSION) + parser.formatter.width=Utils.get_term_cols() + p=parser.add_option + p('-j','--jobs',type='int',default=default_jobs,help="amount of parallel jobs [Default: %s]"%default_jobs,dest='jobs') + p('-f','--force',action='store_true',default=False,help='force file installation',dest='force') + p('-k','--keep',action='store_true',default=False,help='keep running happily on independant task groups',dest='keep') + p('-p','--progress',action='count',default=0,help='-p: progress bar; -pp: ide output',dest='progress_bar') + p('-v','--verbose',action='count',default=0,help='verbosity level -v -vv or -vvv [Default: 0]',dest='verbose') + p('--destdir',help="installation root [Default: '%s']"%default_destdir,default=default_destdir,dest='destdir') + p('--nocache',action='store_true',default=False,help='compile everything, even if WAFCACHE is set',dest='nocache') + if'configure'in sys.argv: + p('-b','--blddir',action='store',default='',help='build dir for the project (configuration)',dest='blddir') + p('-s','--srcdir',action='store',default='',help='src dir for the project (configuration)',dest='srcdir') + p('--prefix',help="installation prefix (configuration only) [Default: '%s']"%default_prefix,default=default_prefix,dest='prefix') + p('--zones',action='store',default='',help='debugging zones (task_gen, deps, tasks, etc)',dest='zones') + p('--targets',action='store',default='',help='compile the targets given only [targets in CSV format, e.g. 
"target1,target2"]',dest='compile_targets') + return parser +def parse_args_impl(parser,_args=None): + global options,commands + (options,args)=parser.parse_args(args=_args) + lst='dist configure clean distclean build install uninstall check distcheck'.split() + commands={} + for var in lst:commands[var]=0 + if len(args)==0:commands['build']=1 + for arg in args: + arg=arg.strip() + if arg in lst: + commands[arg]=True + else: + print'Error: Invalid command specified ',arg + parser.print_help() + sys.exit(1) + if commands['check']: + commands['build']=True + if commands['install']or commands['uninstall']: + global is_install + is_install=True + if options.keep:options.jobs=1 + Logs.verbose=options.verbose + Logs.init_log() + if options.zones: + Logs.zones=options.zones.split(',') + if not Logs.verbose:Logs.verbose=1 +class Handler(object): + parser=None + def __init__(self): + self.parser=create_parser() + self.cwd=os.getcwd() + Handler.parser=self + def add_option(self,*kw,**kwargs): + self.parser.add_option(*kw,**kwargs) + def add_option_group(self,*args,**kwargs): + return self.parser.add_option_group(*args,**kwargs) + def get_option_group(self,opt_str): + return self.parser.get_option_group(opt_str) + def sub_options(self,dir,option_group=None): + try: + current=self.cwd + self.cwd=os.path.join(self.cwd,dir) + cur=os.path.join(self.cwd,WSCRIPT_FILE) + mod=Utils.load_module(cur) + if not hasattr(mod,'set_options'): + msg="the module %s has no set_options function;\n* make sure such a function is defined\n* run configure from the root of the project"%cur + raise Utils.WscriptError(msg) + else: + fun=mod.set_options + fun(option_group or self) + finally: + self.cwd=current + def tool_options(self,tool,tdir=None,option_group=None): + + if type(tool)is types.ListType: + for i in tool:self.tool_options(i,tdir,option_group) + return + if not tdir:tdir=tooldir + tdir=Utils.to_list(tdir) + try: + file,name,desc=imp.find_module(tool,tdir) + except ImportError: + raise Utils.WscriptError("no tool named '%s' found"%tool) + module=imp.load_module(tool,file,name,desc) + try: + fun=module.set_options + except AttributeError: + pass + else: + fun(option_group or self) + def parse_args(self,args=None): + parse_args_impl(self.parser,args) + --- /dev/null +++ radare-1.5.2/wafadmin/Scripting.py @@ -0,0 +1,338 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,sys,shutil,traceback,time +import Utils,Configure,Build,Logs,Options,Environment +from Logs import error,warn,info +from Constants import* +g_gz='bz2' +g_dirwatch=None +g_excludes='.svn CVS .arch-ids {arch} SCCS BitKeeper .hg'.split() +g_dist_exts='~ .rej .orig .pyc .pyo .bak config.log .tar.bz2 .zip Makefile Makefile.in'.split() +g_distclean_exts='~ .pyc .wafpickle'.split() +def add_subdir(dir,bld): + try:bld.rescan(bld.path) + except OSError:raise Utils.WscriptError("No such directory "+bld.path.abspath()) + old=bld.path + new=bld.path.find_dir(dir) + if new is None: + raise Utils.WscriptError("subdir not found (%s), restore is %s"%(dir,bld.path)) + bld.path=new + try: + file_path=os.path.join(new.abspath(),WSCRIPT_BUILD_FILE) + file=open(file_path,'r') + exec file + if file:file.close() + except IOError: + file_path=os.path.join(new.abspath(),WSCRIPT_FILE) + module=Utils.load_module(file_path) + module.build(bld) + bld.path=old +def configure(): + jobs_save=Options.options.jobs + Options.options.jobs=1 + tree=Build.BuildContext() + err='The %s is not given in %s:\n * define a top level attribute named "%s"\n * run waf configure --%s=xxx' + src=getattr(Options.options,SRCDIR,None) + if not src:src=getattr(Utils.g_module,SRCDIR,None) + if not src:raise Utils.WscriptError(err%(SRCDIR,os.path.abspath('.'),SRCDIR,SRCDIR)) + src=os.path.abspath(src) + bld=getattr(Options.options,BLDDIR,None) + if not bld:bld=getattr(Utils.g_module,BLDDIR,None) + if not bld:raise Utils.WscriptError(err%(BLDDIR,os.path.abspath('.'),BLDDIR,BLDDIR)) + bld=os.path.abspath(bld) + tree.load_dirs(src,bld) + conf=Configure.ConfigurationContext(srcdir=src,blddir=bld) + conf.sub_config('') + conf.store(tree) + env=Environment.Environment() + env[BLDDIR]=bld + env[SRCDIR]=src + env['argv']=sys.argv + env['hash']=conf.hash + env['files']=conf.files + env.store(Options.lockfile) + Options.options.jobs=jobs_save +def read_cache_file(filename): + env=Environment.Environment() + env.load(filename) + return env +def prepare_impl(t,cwd,ver,wafdir): + Options.tooldir=[t] + Options.launch_dir=cwd + if'--version'in sys.argv: + opt_obj=Options.Handler() + opt_obj.parse_args() + sys.exit(0) + msg1='Waf: *** Nothing to do! 
Please run waf from a directory containing a file named "%s"'%WSCRIPT_FILE + build_dir_override=None + candidate=None + cwd=Options.launch_dir + lst=os.listdir(cwd) + search_for_candidate=True + if WSCRIPT_FILE in lst: + candidate=cwd + elif'configure'in sys.argv and not WSCRIPT_BUILD_FILE in lst: + calldir=os.path.abspath(os.path.dirname(sys.argv[0])) + if WSCRIPT_FILE in os.listdir(calldir): + candidate=calldir + search_for_candidate=False + else: + error("arg[0] directory does not contain a wscript file") + sys.exit(1) + build_dir_override=cwd + while search_for_candidate: + if len(cwd)<=3: + break + dirlst=os.listdir(cwd) + if WSCRIPT_FILE in dirlst: + candidate=cwd + if'configure'in sys.argv and candidate: + break + if Options.lockfile in dirlst: + break + cwd=cwd[:cwd.rfind(os.sep)] + if not candidate: + if'-h'in sys.argv or'--help'in sys.argv: + warn('No wscript file found: the help message may be incomplete') + opt_obj=Options.Handler() + opt_obj.parse_args() + else: + error(msg1) + sys.exit(0) + os.chdir(candidate) + Utils.set_main_module(os.path.join(candidate,WSCRIPT_FILE)) + if build_dir_override: + d=getattr(Utils.g_module,BLDDIR,None) + if d: + msg=' Overriding build directory %s with %s'%(d,build_dir_override) + warn(msg) + Utils.g_module.blddir=build_dir_override + opt_obj=Options.Handler() + opt_obj.sub_options('') + opt_obj.parse_args() + fun=getattr(Utils.g_module,'init',None) + if fun:fun() + for x in['dist','distclean','distcheck']: + if Options.commands[x]: + fun=getattr(Utils.g_module,x,None) + if fun: + fun() + else: + eval(x+'()') + sys.exit(0) + main() +def prepare(t,cwd,ver,wafdir): + if WAFVERSION!=ver: + msg='Version mismatch: waf %s <> wafadmin %s (wafdir %s)'%(ver,WAFVERSION,wafdir) + print'\033[91mError: %s\033[0m'%msg + sys.exit(1) + try: + prepare_impl(t,cwd,ver,wafdir) + except Utils.WafError,e: + error(e) + sys.exit(1) +def main(): + if Options.commands['configure']: + ini=time.time() + configure() + ela='' + if not Options.options.progress_bar:ela=time.strftime(' (%H:%M:%S)',time.gmtime(time.time()-ini)) + info('Configuration finished successfully%s; project is now ready to build.'%ela) + sys.exit(0) + bld=Build.BuildContext() + try: + proj=read_cache_file(Options.lockfile) + except IOError: + if Options.commands['clean']: + raise Utils.WafError("Nothing to clean (project not configured)") + else: + if Configure.autoconfig: + warn("Reconfiguring the project") + configure() + bld=Build.BuildContext() + proj=read_cache_file(Options.lockfile) + else: + raise Utils.WafError("Project not configured (run 'waf configure' first)") + if Configure.autoconfig: + if not Options.commands['clean']and not Options.commands['uninstall']: + reconf=0 + hash=0 + try: + for file in proj['files']: + mod=Utils.load_module(file) + hash=Utils.hash_function_with_globals(hash,mod.configure) + reconf=(hash!=proj['hash']) + except Exception,ex: + if Logs.verbose: + traceback.print_exc() + warn("Reconfiguring the project (an exception occurred: %s)"%(str(ex),)) + reconf=1 + if reconf: + warn("Reconfiguring the project (the configuration has changed)") + back=(Options.commands,Options.options,Logs.zones,Logs.verbose) + oldargs=sys.argv + sys.argv=proj['argv'] + opt_obj=Options.Handler() + opt_obj.sub_options('') + opt_obj.parse_args(args=sys.argv[1:]) + configure() + sys.argv=oldargs + (Options.commands,Options.options,Logs.zones,Logs.verbose)=back + bld=Build.BuildContext() + proj=read_cache_file(Options.lockfile) + bld.load_dirs(proj[SRCDIR],proj[BLDDIR]) + bld.load_envs() + 
f=getattr(Utils.g_module,'build',None) + if f: + f(bld) + else: + main_wscript=None + for(file_path,module)in Utils.g_loaded_modules.items(): + if module.__name__=='wscript_main': + main_wscript=file_path + break + raise Utils.WscriptError("Could not find the function 'def build(bld).'",main_wscript) + pre_build=getattr(Utils.g_module,'pre_build',None) + if pre_build:pre_build() + if Options.commands['build']or Options.is_install: + if Options.commands['uninstall']: + import Task + def runnable_status(self): + return SKIP_ME + setattr(Task.Task,'runnable_status',runnable_status) + ini=time.time() + bld.compile() + if Options.options.progress_bar:print'' + if Options.is_install: + bld.install() + ela='' + if not Options.options.progress_bar: + ela=time.strftime(' (%H:%M:%S)',time.gmtime(time.time()-ini)) + if Options.commands['install']:msg='Compilation and installation finished successfully%s'%ela + elif Options.commands['uninstall']:msg='Uninstallation finished successfully%s'%ela + else:msg='Compilation finished successfully%s'%ela + info(msg) + if Options.commands['clean']: + try: + bld.clean() + info('Cleaning finished successfully') + finally: + bld.save() + fun=getattr(Utils.g_module,'shutdown',None) + if fun:fun() +def copytree(src,dst,symlinks=False,excludes=(),build_dir=None): + names=os.listdir(src) + os.makedirs(dst) + errors=[] + for name in names: + srcname=os.path.join(src,name) + dstname=os.path.join(dst,name) + try: + if symlinks and os.path.islink(srcname): + linkto=os.readlink(srcname) + os.symlink(linkto,dstname) + elif os.path.isdir(srcname): + if name in excludes: + continue + elif name.startswith('.')or name.startswith(',,')or name.startswith('++'): + continue + elif name==build_dir: + continue + else: + copytree(srcname,dstname,symlinks,excludes) + else: + ends=name.endswith + to_remove=False + if name.startswith('.')or name.startswith('++'): + to_remove=True + else: + for x in g_dist_exts: + if ends(x): + to_remove=True + break + if not to_remove: + shutil.copy2(srcname,dstname) + except(IOError,os.error),why: + errors.append((srcname,dstname,str(why))) + except shutil.Error,err: + errors.extend(err.args[0]) + try: + shutil.copystat(src,dst) + except WindowsError: + pass + except OSError,why: + errors.extend((src,dst,str(why))) + if errors: + raise shutil.Error,errors +def distclean(): + for(root,dirs,filenames)in os.walk('.'): + for f in list(filenames): + to_remove=0 + if f==Options.lockfile: + to_remove=True + try: + proj=read_cache_file(os.path.join(root,f)) + shutil.rmtree(os.path.join(root,proj[BLDDIR])) + except(OSError,IOError): + pass + else: + ends=f.endswith + for x in g_distclean_exts: + if ends(x): + to_remove=1 + break + if to_remove: + os.remove(os.path.join(root,f)) + lst=os.listdir('.') + for f in lst: + if f.startswith('.waf-'): + shutil.rmtree(f,ignore_errors=True) + info('distclean finished successfully') +def dist(appname='',version=''): + import tarfile + if not appname:appname=getattr(Utils.g_module,APPNAME,'noname') + if not version:version=getattr(Utils.g_module,VERSION,'1.0') + TMPFOLDER=appname+'-'+version + if os.path.exists(TMPFOLDER):shutil.rmtree(TMPFOLDER) + global g_dist_exts,g_excludes + build_dir=getattr(Utils.g_module,BLDDIR,None) + copytree('.',TMPFOLDER,excludes=g_excludes,build_dir=build_dir) + dist_hook=getattr(Utils.g_module,'dist_hook',None) + if dist_hook: + os.chdir(TMPFOLDER) + try: + dist_hook() + finally: + os.chdir('..') + tar=tarfile.open(TMPFOLDER+'.tar.'+g_gz,'w:'+g_gz) + tar.add(TMPFOLDER) + tar.close() + 
info('Your archive is ready -> %s.tar.%s'%(TMPFOLDER,g_gz)) + if os.path.exists(TMPFOLDER):shutil.rmtree(TMPFOLDER) + return(TMPFOLDER,TMPFOLDER+'.tar.'+g_gz) +def distcheck(appname='',version=''): + import tempfile,tarfile + import pproc + if not appname:appname=getattr(Utils.g_module,APPNAME,'noname') + if not version:version=getattr(Utils.g_module,VERSION,'1.0') + waf=os.path.abspath(sys.argv[0]) + distdir,tarball=dist(appname,version) + t=tarfile.open(tarball) + for x in t:t.extract(x) + t.close() + instdir=tempfile.mkdtemp('.inst','%s-%s'%(appname,version)) + cwd_before=os.getcwd() + os.chdir(distdir) + try: + retval=pproc.Popen('%(waf)s configure && %(waf)s ''&& %(waf)s check && %(waf)s install --destdir=%(instdir)s'' && %(waf)s uninstall --destdir=%(instdir)s'%vars(),shell=True).wait() + if retval: + raise Utils.WafError('distcheck failed with code %i'%(retval)) + finally: + os.chdir(cwd_before) + shutil.rmtree(distdir) + if os.path.exists(instdir): + raise Utils.WafError("distcheck succeeded, but files were left in %s"%(instdir)) + else: + info('distcheck finished successfully') + --- /dev/null +++ radare-1.5.2/wafadmin/Task.py @@ -0,0 +1,525 @@ +#! /usr/bin/env python +# encoding: utf-8 +import sys +if sys.hexversion < 0x020400f0: from sets import Set as set +import os,types,shutil,sys,re,new,random,time +from Utils import md5 +import Build,Runner,Utils,Node,Logs,Options +from Logs import debug,error,warn +from Constants import* +algotype=NORMAL +algotype=JOBCONTROL +algotype=MAXPARALLEL +class TaskManager(object): + def __init__(self): + self.groups=[] + self.tasks_done=[] + self.current_group=0 + def get_next_set(self): + ret=None + while not ret and self.current_group0: + self.set_order(keys[i],keys[j]) + elif val<0: + self.set_order(keys[j],keys[i]) + def tasks_in_parallel(self): + if not self.ready:self.prepare() + keys=self.cstr_groups.keys() + unconnected=[] + remainder=[] + for u in keys: + for k in self.cstr_order.values(): + if u in k: + remainder.append(u) + break + else: + unconnected.append(u) + toreturn=[] + for y in unconnected: + toreturn.extend(self.cstr_groups[y]) + for y in unconnected: + try:self.cstr_order.__delitem__(y) + except KeyError:pass + self.cstr_groups.__delitem__(y) + if not toreturn and remainder: + raise Utils.WafError("circular order constraint detected %r"%remainder) + return toreturn + def tasks_by_max_jobs(self): + if not self.ready:self.prepare() + if not self.temp_tasks:self.temp_tasks=self.tasks_in_parallel() + if not self.temp_tasks:return(None,None) + maxjobs=sys.maxint + ret=[] + remaining=[] + for t in self.temp_tasks: + m=getattr(t,"maxjobs",getattr(self.__class__,"maxjobs",sys.maxint)) + if m>maxjobs: + remaining.append(t) + elif m task failed (err #%d): %r"%(self.err_code,self) + except AttributeError: + return" -> task failed: %r"%self + elif self.hasrun==EXCEPTION: + return" -> task got an exception %r"%self.err_msg + elif self.hasrun==MISSING: + return" -> missing files: %r"%self + else: + return'' + def install(self): + bld=Build.bld + d=self.attr('install') + if self.attr('install_path'): + lst=[a.relpath_gen(bld.srcnode)for a in self.outputs] + perm=self.attr('chmod',0644) + if self.attr('src'): + lst+=[a.relpath_gen(bld.srcnode)for a in self.inputs] + if self.attr('filename'): + dir=self.install_path+self.attr('filename') + bld.install_as(self.install_path,lst[0],self.env,perm) + else: + bld.install_files(self.install_path,lst,self.env,perm) +class Task(TaskBase): + vars=[] + def __init__(self,env,normal=1): + 
TaskBase.__init__(self,normal=normal) + self.env=env + self.inputs=[] + self.outputs=[] + self.deps_nodes=[] + self.run_after=[] + def __str__(self): + env=self.env + src_str=' '.join([a.nice_path(env)for a in self.inputs]) + tgt_str=' '.join([a.nice_path(env)for a in self.outputs]) + return'%s: %s -> %s\n'%(self.__class__.__name__,src_str,tgt_str) + def __repr__(self): + return"".join(['\n\t{task: ',self.__class__.__name__," ",",".join([x.name for x in self.inputs])," -> ",",".join([x.name for x in self.outputs]),'}']) + def unique_id(self): + x=getattr(self,'uid',None) + if x:return x + m=md5() + up=m.update + up(self.env.variant()) + for x in self.inputs+self.outputs: + up(x.abspath()) + up(self.__class__.__name__) + up(Utils.h_fun(self.run)) + x=self.uid=m.digest() + return x + def set_inputs(self,inp): + if type(inp)is types.ListType:self.inputs+=inp + else:self.inputs.append(inp) + def set_outputs(self,out): + if type(out)is types.ListType:self.outputs+=out + else:self.outputs.append(out) + def set_run_after(self,task): + assert isinstance(task,TaskBase) + self.run_after.append(task) + def add_file_dependency(self,filename): + node=Build.bld.current.find_resource(filename) + self.deps_nodes.append(node) + def signature(self): + try:return self.sign_all + except AttributeError:pass + m=md5() + exp_sig=self.sig_explicit_deps() + m.update(exp_sig) + imp_sig=self.scan and self.sig_implicit_deps()or SIG_NIL + m.update(imp_sig) + var_sig=self.sig_vars() + m.update(var_sig) + ret=m.digest() + self.cache_sig=(ret,exp_sig,imp_sig,var_sig) + self.sign_all=ret + return ret + def runnable_status(self): + if self.inputs and(not self.outputs): + if not getattr(self.__class__,'quiet',None): + error("task is invalid : no inputs or outputs (override in a Task subclass?) 
%r"%self) + for t in self.run_after: + if not t.hasrun: + return ASK_LATER + env=self.env + tree=Build.bld + time=None + for node in self.outputs: + variant=node.variant(env) + try: + time=tree.node_sigs[variant][node.id] + except KeyError: + debug("task: task %r must run as the first node does not exist"%self) + time=None + break + if time is None: + try: + new_sig=self.signature() + except KeyError: + debug("something is wrong, computing the task signature failed") + return RUN_ME + return RUN_ME + key=self.unique_id() + try: + prev_sig=tree.task_sigs[key][0] + except KeyError: + debug("task: task %r must run as it was never run before or the task code changed"%self) + return RUN_ME + new_sig=self.signature() + if Logs.verbose:self.debug_why(tree.task_sigs[key]) + if new_sig!=prev_sig: + return RUN_ME + return SKIP_ME + def post_run(self): + tree=Build.bld + env=self.env + sig=self.signature() + cnt=0 + for node in self.outputs: + variant=node.variant(env) + os.stat(node.abspath(env)) + tree.node_sigs[variant][node.id]=sig + if Options.cache_global: + ssig=sig.encode('hex') + dest=os.path.join(Options.cache_global,ssig+'-'+str(cnt)) + try:shutil.copy2(node.abspath(env),dest) + except IOError:warn('Could not write the file to the cache') + cnt+=1 + tree.task_sigs[self.unique_id()]=self.cache_sig + self.executed=1 + def can_retrieve_cache(self): + if not Options.cache_global:return None + if Options.options.nocache:return None + env=self.env + sig=self.signature() + cnt=0 + for node in self.outputs: + variant=node.variant(env) + ssig=sig.encode('hex') + orig=os.path.join(Options.cache_global,ssig+'-'+str(cnt)) + try: + shutil.copy2(orig,node.abspath(env)) + os.utime(orig,None) + except(OSError,IOError): + debug('task: failed retrieving file') + return None + else: + cnt+=1 + Build.bld.node_sigs[variant][node.id]=sig + Runner.printout('restoring from cache %r\n'%node.bldpath(env)) + return 1 + def debug_why(self,old_sigs): + new_sigs=self.cache_sig + def v(x): + return x.encode('hex') + debug("Task %r"%self) + msgs=['Task must run','* Source file or manual dependency','* Implicit dependency','* Environment variable'] + tmp='task: -> %s: %s %s' + for x in xrange(len(msgs)): + if(new_sigs[x]!=old_sigs[x]): + debug(tmp%(msgs[x],v(old_sigs[x]),v(new_sigs[x]))) + def sig_explicit_deps(self): + tree=Build.bld + m=md5() + for x in self.inputs: + variant=x.variant(self.env) + m.update(tree.node_sigs[variant][x.id]) + for x in getattr(self,'dep_nodes',[]): + variant=x.variant(self.env) + v=tree.node_sigs[variant][x.id] + m.update(v) + try: + additional_deps=tree.deps_man + except AttributeError: + pass + else: + for x in self.inputs+self.outputs: + try: + d=additional_deps[x] + except KeyError: + continue + if callable(d):d=d() + m.update(d) + return m.digest() + def sig_vars(self): + m=md5() + tree=Build.bld + env=self.env + fun=getattr(self.__class__,'signature_hook',None) + if fun:act_sig=self.__class__.signature_hook(self) + else:act_sig=tree.hash_env_vars(env,self.__class__.vars) + m.update(act_sig) + var_sig=SIG_NIL + dep_vars=getattr(self,'dep_vars',None) + if dep_vars: + var_sig=tree.hash_env_vars(env,dep_vars) + m.update(var_sig) + for x in getattr(self.__class__,"vars",()): + k=env[x] + if k: + m.update(str(k)) + vars_sig=hash((var_sig,str(k))) + return m.digest() + scan=None + def sig_implicit_deps(self): + tree=Build.bld + key=self.unique_id() + prev_sigs=tree.task_sigs.get(key,()) + if prev_sigs and prev_sigs[2]==self.compute_sig_implicit_deps(): + return prev_sigs[2] + 
(nodes,names)=self.scan() + if Logs.verbose and Logs.zones: + debug('deps: scanner for %s returned %s %s'%(str(self),str(nodes),str(names))) + tree=Build.bld + tree.node_deps[self.unique_id()]=nodes + tree.raw_deps[self.unique_id()]=names + sig=self.compute_sig_implicit_deps() + return sig + def compute_sig_implicit_deps(self): + m=md5() + upd=m.update + tree=Build.bld + tstamp=tree.node_sigs + env=self.env + for k in tree.node_deps.get(self.unique_id(),()): + try:tree.cache_scanned_folders[k.parent.id] + except KeyError:tree.rescan(k.parent) + if k.id&3==Node.FILE:upd(tstamp[0][k.id]) + else:upd(tstamp[env.variant()][k.id]) + return m.digest() +def funex(c): + exec(c) + return f +reg_act=re.compile(r"(?P\$\$)|(?P\$\{(?P\w+)(?P.*?)\})",re.M) +def compile_fun(name,line): + extr=[] + def repl(match): + g=match.group + if g('dollar'):return"$" + elif g('subst'):extr.append((g('var'),g('code')));return"%s" + return None + line=reg_act.sub(repl,line) + parm=[] + dvars=[] + app=parm.append + for(var,meth)in extr: + if var=='SRC': + if meth:app('task.inputs%s'%meth) + else:app('" ".join([a.srcpath(env) for a in task.inputs])') + elif var=='TGT': + if meth:app('task.outputs%s'%meth) + else:app('" ".join([a.bldpath(env) for a in task.outputs])') + else: + if not var in dvars:dvars.append(var) + app("p('%s')"%var) + if parm:parm="%% (%s) "%(',\n\t\t'.join(parm)) + else:parm='' + c=''' +def f(task): + env = task.env + p = env.get_flat + cmd = "%s" %s + return Runner.exec_command(cmd) +'''%(line,parm) + debug('action: %s'%c) + return(funex(c),dvars) +def simple_task_type(name,line,color='GREEN',vars=[],ext_in=[],ext_out=[],before=[],after=[]): + (fun,dvars)=compile_fun(name,line) + fun.code=line + return task_type_from_func(name,fun,vars or dvars,color,ext_in,ext_out,before,after) +def task_type_from_func(name,func,vars=[],color='GREEN',ext_in=[],ext_out=[],before=[],after=[]): + params={'run':func,'vars':vars,'color':color,'name':name,'ext_in':Utils.to_list(ext_in),'ext_out':Utils.to_list(ext_out),'before':Utils.to_list(before),'after':Utils.to_list(after),} + cls=new.classobj(name,(Task,),params) + TaskBase.classes[name]=cls + return cls + --- /dev/null +++ radare-1.5.2/wafadmin/pproc.py @@ -0,0 +1,496 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import sys +mswindows=(sys.platform=="win32") +import os +import types +import traceback +import gc +class CalledProcessError(Exception): + def __init__(self,returncode,cmd): + self.returncode=returncode + self.cmd=cmd + def __str__(self): + return"Command '%s' returned non-zero exit status %d"%(self.cmd,self.returncode) +if mswindows: + import threading + import msvcrt + if 0: + import pywintypes + from win32api import GetStdHandle,STD_INPUT_HANDLE,STD_OUTPUT_HANDLE,STD_ERROR_HANDLE + from win32api import GetCurrentProcess,DuplicateHandle,GetModuleFileName,GetVersion + from win32con import DUPLICATE_SAME_ACCESS,SW_HIDE + from win32pipe import CreatePipe + from win32process import CreateProcess,STARTUPINFO,GetExitCodeProcess,STARTF_USESTDHANDLES,STARTF_USESHOWWINDOW,CREATE_NEW_CONSOLE + from win32event import WaitForSingleObject,INFINITE,WAIT_OBJECT_0 + else: + from _subprocess import* + class STARTUPINFO: + dwFlags=0 + hStdInput=None + hStdOutput=None + hStdError=None + wShowWindow=0 + class pywintypes: + error=IOError +else: + import select + import errno + import fcntl + import pickle +__all__=["Popen","PIPE","STDOUT","call","check_call","CalledProcessError"] +try: + MAXFD=os.sysconf("SC_OPEN_MAX") +except: + MAXFD=256 +try: + False +except NameError: + False=0 + True=1 +_active=[] +def _cleanup(): + for inst in _active[:]: + if inst.poll(_deadstate=sys.maxint)>=0: + try: + _active.remove(inst) + except ValueError: + pass +PIPE=-1 +STDOUT=-2 +def call(*popenargs,**kwargs): + return Popen(*popenargs,**kwargs).wait() +def check_call(*popenargs,**kwargs): + retcode=call(*popenargs,**kwargs) + cmd=kwargs.get("args") + if cmd is None: + cmd=popenargs[0] + if retcode: + raise CalledProcessError(retcode,cmd) + return retcode +def list2cmdline(seq): + result=[] + needquote=False + for arg in seq: + bs_buf=[] + if result: + result.append(' ') + needquote=(" "in arg)or("\t"in arg)or arg=="" + if needquote: + result.append('"') + for c in arg: + if c=='\\': + bs_buf.append(c) + elif c=='"': + result.append('\\'*len(bs_buf)*2) + bs_buf=[] + result.append('\\"') + else: + if bs_buf: + result.extend(bs_buf) + bs_buf=[] + result.append(c) + if bs_buf: + result.extend(bs_buf) + if needquote: + result.extend(bs_buf) + result.append('"') + return''.join(result) +class Popen(object): + def __init__(self,args,bufsize=0,executable=None,stdin=None,stdout=None,stderr=None,preexec_fn=None,close_fds=False,shell=False,cwd=None,env=None,universal_newlines=False,startupinfo=None,creationflags=0): + _cleanup() + self._child_created=False + if not isinstance(bufsize,(int,long)): + raise TypeError("bufsize must be an integer") + if mswindows: + if preexec_fn is not None: + raise ValueError("preexec_fn is not supported on Windows platforms") + if close_fds: + raise ValueError("close_fds is not supported on Windows platforms") + else: + if startupinfo is not None: + raise ValueError("startupinfo is only supported on Windows platforms") + if creationflags!=0: + raise ValueError("creationflags is only supported on Windows platforms") + self.stdin=None + self.stdout=None + self.stderr=None + self.pid=None + self.returncode=None + self.universal_newlines=universal_newlines + (p2cread,p2cwrite,c2pread,c2pwrite,errread,errwrite)=self._get_handles(stdin,stdout,stderr) + self._execute_child(args,executable,preexec_fn,close_fds,cwd,env,universal_newlines,startupinfo,creationflags,shell,p2cread,p2cwrite,c2pread,c2pwrite,errread,errwrite) + if mswindows: + if stdin is None and p2cwrite is 
not None: + os.close(p2cwrite) + p2cwrite=None + if stdout is None and c2pread is not None: + os.close(c2pread) + c2pread=None + if stderr is None and errread is not None: + os.close(errread) + errread=None + if p2cwrite: + self.stdin=os.fdopen(p2cwrite,'wb',bufsize) + if c2pread: + if universal_newlines: + self.stdout=os.fdopen(c2pread,'rU',bufsize) + else: + self.stdout=os.fdopen(c2pread,'rb',bufsize) + if errread: + if universal_newlines: + self.stderr=os.fdopen(errread,'rU',bufsize) + else: + self.stderr=os.fdopen(errread,'rb',bufsize) + def _translate_newlines(self,data): + data=data.replace("\r\n","\n") + data=data.replace("\r","\n") + return data + def __del__(self,sys=sys): + if not self._child_created: + return + self.poll(_deadstate=sys.maxint) + if self.returncode is None and _active is not None: + _active.append(self) + def communicate(self,input=None): + if[self.stdin,self.stdout,self.stderr].count(None)>=2: + stdout=None + stderr=None + if self.stdin: + if input: + self.stdin.write(input) + self.stdin.close() + elif self.stdout: + stdout=self.stdout.read() + elif self.stderr: + stderr=self.stderr.read() + self.wait() + return(stdout,stderr) + return self._communicate(input) + if mswindows: + def _get_handles(self,stdin,stdout,stderr): + if stdin is None and stdout is None and stderr is None: + return(None,None,None,None,None,None) + p2cread,p2cwrite=None,None + c2pread,c2pwrite=None,None + errread,errwrite=None,None + if stdin is None: + p2cread=GetStdHandle(STD_INPUT_HANDLE) + if p2cread is not None: + pass + elif stdin is None or stdin==PIPE: + p2cread,p2cwrite=CreatePipe(None,0) + p2cwrite=p2cwrite.Detach() + p2cwrite=msvcrt.open_osfhandle(p2cwrite,0) + elif isinstance(stdin,int): + p2cread=msvcrt.get_osfhandle(stdin) + else: + p2cread=msvcrt.get_osfhandle(stdin.fileno()) + p2cread=self._make_inheritable(p2cread) + if stdout is None: + c2pwrite=GetStdHandle(STD_OUTPUT_HANDLE) + if c2pwrite is not None: + pass + elif stdout is None or stdout==PIPE: + c2pread,c2pwrite=CreatePipe(None,0) + c2pread=c2pread.Detach() + c2pread=msvcrt.open_osfhandle(c2pread,0) + elif isinstance(stdout,int): + c2pwrite=msvcrt.get_osfhandle(stdout) + else: + c2pwrite=msvcrt.get_osfhandle(stdout.fileno()) + c2pwrite=self._make_inheritable(c2pwrite) + if stderr is None: + errwrite=GetStdHandle(STD_ERROR_HANDLE) + if errwrite is not None: + pass + elif stderr is None or stderr==PIPE: + errread,errwrite=CreatePipe(None,0) + errread=errread.Detach() + errread=msvcrt.open_osfhandle(errread,0) + elif stderr==STDOUT: + errwrite=c2pwrite + elif isinstance(stderr,int): + errwrite=msvcrt.get_osfhandle(stderr) + else: + errwrite=msvcrt.get_osfhandle(stderr.fileno()) + errwrite=self._make_inheritable(errwrite) + return(p2cread,p2cwrite,c2pread,c2pwrite,errread,errwrite) + def _make_inheritable(self,handle): + return DuplicateHandle(GetCurrentProcess(),handle,GetCurrentProcess(),0,1,DUPLICATE_SAME_ACCESS) + def _find_w9xpopen(self): + w9xpopen=os.path.join(os.path.dirname(GetModuleFileName(0)),"w9xpopen.exe") + if not os.path.exists(w9xpopen): + w9xpopen=os.path.join(os.path.dirname(sys.exec_prefix),"w9xpopen.exe") + if not os.path.exists(w9xpopen): + raise RuntimeError("Cannot locate w9xpopen.exe, which is needed for Popen to work with your shell or platform.") + return w9xpopen + def _execute_child(self,args,executable,preexec_fn,close_fds,cwd,env,universal_newlines,startupinfo,creationflags,shell,p2cread,p2cwrite,c2pread,c2pwrite,errread,errwrite): + if not isinstance(args,types.StringTypes): + 
args=list2cmdline(args) + if startupinfo is None: + startupinfo=STARTUPINFO() + if None not in(p2cread,c2pwrite,errwrite): + startupinfo.dwFlags|=STARTF_USESTDHANDLES + startupinfo.hStdInput=p2cread + startupinfo.hStdOutput=c2pwrite + startupinfo.hStdError=errwrite + if shell: + startupinfo.dwFlags|=STARTF_USESHOWWINDOW + startupinfo.wShowWindow=SW_HIDE + comspec=os.environ.get("COMSPEC","cmd.exe") + args=comspec+" /c "+args + if(GetVersion()>=0x80000000L or os.path.basename(comspec).lower()=="command.com"): + w9xpopen=self._find_w9xpopen() + args='"%s" %s'%(w9xpopen,args) + creationflags|=CREATE_NEW_CONSOLE + try: + hp,ht,pid,tid=CreateProcess(executable,args,None,None,1,creationflags,env,cwd,startupinfo) + except pywintypes.error,e: + raise WindowsError(*e.args) + self._child_created=True + self._handle=hp + self.pid=pid + ht.Close() + if p2cread is not None: + p2cread.Close() + if c2pwrite is not None: + c2pwrite.Close() + if errwrite is not None: + errwrite.Close() + def poll(self,_deadstate=None): + if self.returncode is None: + if WaitForSingleObject(self._handle,0)==WAIT_OBJECT_0: + self.returncode=GetExitCodeProcess(self._handle) + return self.returncode + def wait(self): + if self.returncode is None: + obj=WaitForSingleObject(self._handle,INFINITE) + self.returncode=GetExitCodeProcess(self._handle) + return self.returncode + def _readerthread(self,fh,buffer): + buffer.append(fh.read()) + def _communicate(self,input): + stdout=None + stderr=None + if self.stdout: + stdout=[] + stdout_thread=threading.Thread(target=self._readerthread,args=(self.stdout,stdout)) + stdout_thread.setDaemon(True) + stdout_thread.start() + if self.stderr: + stderr=[] + stderr_thread=threading.Thread(target=self._readerthread,args=(self.stderr,stderr)) + stderr_thread.setDaemon(True) + stderr_thread.start() + if self.stdin: + if input is not None: + self.stdin.write(input) + self.stdin.close() + if self.stdout: + stdout_thread.join() + if self.stderr: + stderr_thread.join() + if stdout is not None: + stdout=stdout[0] + if stderr is not None: + stderr=stderr[0] + if self.universal_newlines and hasattr(file,'newlines'): + if stdout: + stdout=self._translate_newlines(stdout) + if stderr: + stderr=self._translate_newlines(stderr) + self.wait() + return(stdout,stderr) + else: + def _get_handles(self,stdin,stdout,stderr): + p2cread,p2cwrite=None,None + c2pread,c2pwrite=None,None + errread,errwrite=None,None + if stdin is None: + pass + elif stdin==PIPE: + p2cread,p2cwrite=os.pipe() + elif isinstance(stdin,int): + p2cread=stdin + else: + p2cread=stdin.fileno() + if stdout is None: + pass + elif stdout==PIPE: + c2pread,c2pwrite=os.pipe() + elif isinstance(stdout,int): + c2pwrite=stdout + else: + c2pwrite=stdout.fileno() + if stderr is None: + pass + elif stderr==PIPE: + errread,errwrite=os.pipe() + elif stderr==STDOUT: + errwrite=c2pwrite + elif isinstance(stderr,int): + errwrite=stderr + else: + errwrite=stderr.fileno() + return(p2cread,p2cwrite,c2pread,c2pwrite,errread,errwrite) + def _set_cloexec_flag(self,fd): + try: + cloexec_flag=fcntl.FD_CLOEXEC + except AttributeError: + cloexec_flag=1 + old=fcntl.fcntl(fd,fcntl.F_GETFD) + fcntl.fcntl(fd,fcntl.F_SETFD,old|cloexec_flag) + def _close_fds(self,but): + for i in xrange(3,MAXFD): + if i==but: + continue + try: + os.close(i) + except: + pass + def _execute_child(self,args,executable,preexec_fn,close_fds,cwd,env,universal_newlines,startupinfo,creationflags,shell,p2cread,p2cwrite,c2pread,c2pwrite,errread,errwrite): + if isinstance(args,types.StringTypes): + 
args=[args] + else: + args=list(args) + if shell: + args=["/bin/sh","-c"]+args + if executable is None: + executable=args[0] + errpipe_read,errpipe_write=os.pipe() + self._set_cloexec_flag(errpipe_write) + gc_was_enabled=gc.isenabled() + gc.disable() + try: + self.pid=os.fork() + except: + if gc_was_enabled: + gc.enable() + raise + self._child_created=True + if self.pid==0: + try: + if p2cwrite: + os.close(p2cwrite) + if c2pread: + os.close(c2pread) + if errread: + os.close(errread) + os.close(errpipe_read) + if p2cread: + os.dup2(p2cread,0) + if c2pwrite: + os.dup2(c2pwrite,1) + if errwrite: + os.dup2(errwrite,2) + if p2cread and p2cread not in(0,): + os.close(p2cread) + if c2pwrite and c2pwrite not in(p2cread,1): + os.close(c2pwrite) + if errwrite and errwrite not in(p2cread,c2pwrite,2): + os.close(errwrite) + if close_fds: + self._close_fds(but=errpipe_write) + if cwd is not None: + os.chdir(cwd) + if preexec_fn: + apply(preexec_fn) + if env is None: + os.execvp(executable,args) + else: + os.execvpe(executable,args,env) + except: + exc_type,exc_value,tb=sys.exc_info() + exc_lines=traceback.format_exception(exc_type,exc_value,tb) + exc_value.child_traceback=''.join(exc_lines) + os.write(errpipe_write,pickle.dumps(exc_value)) + os._exit(255) + if gc_was_enabled: + gc.enable() + os.close(errpipe_write) + if p2cread and p2cwrite: + os.close(p2cread) + if c2pwrite and c2pread: + os.close(c2pwrite) + if errwrite and errread: + os.close(errwrite) + data=os.read(errpipe_read,1048576) + os.close(errpipe_read) + if data!="": + os.waitpid(self.pid,0) + child_exception=pickle.loads(data) + raise child_exception + def _handle_exitstatus(self,sts): + if os.WIFSIGNALED(sts): + self.returncode=-os.WTERMSIG(sts) + elif os.WIFEXITED(sts): + self.returncode=os.WEXITSTATUS(sts) + else: + raise RuntimeError("Unknown child exit status!") + def poll(self,_deadstate=None): + if self.returncode is None: + try: + pid,sts=os.waitpid(self.pid,os.WNOHANG) + if pid==self.pid: + self._handle_exitstatus(sts) + except os.error: + if _deadstate is not None: + self.returncode=_deadstate + return self.returncode + def wait(self): + if self.returncode is None: + pid,sts=os.waitpid(self.pid,0) + self._handle_exitstatus(sts) + return self.returncode + def _communicate(self,input): + read_set=[] + write_set=[] + stdout=None + stderr=None + if self.stdin: + self.stdin.flush() + if input: + write_set.append(self.stdin) + else: + self.stdin.close() + if self.stdout: + read_set.append(self.stdout) + stdout=[] + if self.stderr: + read_set.append(self.stderr) + stderr=[] + input_offset=0 + while read_set or write_set: + rlist,wlist,xlist=select.select(read_set,write_set,[]) + if self.stdin in wlist: + bytes_written=os.write(self.stdin.fileno(),buffer(input,input_offset,512)) + input_offset+=bytes_written + if input_offset>=len(input): + self.stdin.close() + write_set.remove(self.stdin) + if self.stdout in rlist: + data=os.read(self.stdout.fileno(),1024) + if data=="": + self.stdout.close() + read_set.remove(self.stdout) + stdout.append(data) + if self.stderr in rlist: + data=os.read(self.stderr.fileno(),1024) + if data=="": + self.stderr.close() + read_set.remove(self.stderr) + stderr.append(data) + if stdout is not None: + stdout=''.join(stdout) + if stderr is not None: + stderr=''.join(stderr) + if self.universal_newlines and hasattr(file,'newlines'): + if stdout: + stdout=self._translate_newlines(stdout) + if stderr: + stderr=self._translate_newlines(stderr) + self.wait() + return(stdout,stderr) + --- /dev/null +++ 
radare-1.5.2/wafadmin/Configure.py @@ -0,0 +1,177 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,types,imp,cPickle +import Environment,Utils,Options +from Logs import warn +from Constants import* +class ConfigurationError(Utils.WscriptError): + pass +autoconfig=False +def find_file(filename,path_list): + if type(path_list)is types.StringType: + lst=path_list.split() + else: + lst=path_list + for directory in lst: + if os.path.exists(os.path.join(directory,filename)): + return directory + return'' +def find_program_impl(env,filename,path_list=[],var=None): + try:path_list=path_list.split() + except AttributeError:pass + if var: + if var in os.environ:env[var]=os.environ[var] + if env[var]:return env[var] + if not path_list:path_list=os.environ['PATH'].split(os.pathsep) + ext=(Options.platform=='win32')and'.exe,.com,.bat,.cmd'or'' + for y in[filename+x for x in ext.split(',')]: + for directory in path_list: + x=os.path.join(directory,y) + if os.path.isfile(x): + if var:env[var]=x + return x + return'' +class ConfigurationContext(object): + tests={} + error_handlers=[] + def __init__(self,env=None,blddir='',srcdir=''): + self.env=None + self.envname='' + self.line_just=40 + self.blddir=blddir + self.srcdir=srcdir + self.cachedir=os.path.join(blddir,CACHE_DIR) + self.all_envs={} + self.defines={} + self.cwd=os.getcwd() + self.tools=[] + self.setenv(DEFAULT) + self.lastprog='' + self.hash=0 + self.files=[] + path=os.path.join(self.blddir,WAF_CONFIG_LOG) + try:os.unlink(path) + except(OSError,IOError):pass + self.log=open(path,'wb') + def __del__(self): + if hasattr(self,'log')and self.log: + self.log.close() + def fatal(self,msg): + raise ConfigurationError(msg) + def check_tool(self,input,tooldir=None,funs=None): + tools=Utils.to_list(input) + if tooldir:tooldir=Utils.to_list(tooldir) + for tool in tools: + try: + file,name,desc=imp.find_module(tool,tooldir) + except ImportError: + self.fatal("no tool named '%s' found."%tool) + module=imp.load_module(tool,file,name,desc) + func=getattr(module,'detect',None) + if func: + if type(func)is types.FunctionType:func(self) + else:self.eval_rules(funs or func) + self.tools.append({'tool':tool,'tooldir':tooldir,'funs':funs}) + def sub_config(self,dir): + current=self.cwd + self.cwd=os.path.join(self.cwd,dir) + cur=os.path.join(self.cwd,WSCRIPT_FILE) + mod=Utils.load_module(cur) + if not hasattr(mod,'configure'): + self.fatal('the module %s has no configure function; make sure such a function is defined'%cur) + ret=mod.configure(self) + global autoconfig + if autoconfig: + self.hash=Utils.hash_function_with_globals(self.hash,mod.configure) + self.files.append(os.path.abspath(cur)) + self.cwd=current + return ret + def store(self,file=''): + if not os.path.isdir(self.cachedir): + os.makedirs(self.cachedir) + file=open(os.path.join(self.cachedir,'build.config.py'),'w') + file.write('version = 0x%x\n'%HEXVERSION) + file.write('tools = %r\n'%self.tools) + file.close() + if not self.all_envs: + self.fatal('nothing to store in the configuration context!') + for key in self.all_envs: + tmpenv=self.all_envs[key] + tmpenv.store(os.path.join(self.cachedir,key+CACHE_SUFFIX)) + def set_env_name(self,name,env): + self.all_envs[name]=env + return env + def retrieve(self,name,fromenv=None): + try: + env=self.all_envs[name] + except KeyError: + env=Environment.Environment() + self.all_envs[name]=env + else: + if fromenv:warn("The environment %s may have been configured already"%name) + return env + def setenv(self,name): + self.env=self.retrieve(name) + 
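+# setenv() selects the active Environment by name, creating it on first use via
+# retrieve(); a typical wscript configure function then calls e.g.
+# conf.check_tool('gcc') and assigns flags on conf.env (tool name illustrative).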
self.envname=name + def add_os_flags(self,var,dest=None): + if not dest:dest=var + try:self.env[dest]=os.environ[var] + except KeyError:pass + def check_message(self,th,msg,state,option=''): + sr='Checking for %s %s'%(th,msg) + self.line_just=max(self.line_just,len(sr)) + print"%s :"%sr.ljust(self.line_just), + p=Utils.pprint + if state:p('GREEN','ok '+option) + else:p('YELLOW','not found') + self.log.write(sr+'\n\n') + def check_message_custom(self,th,msg,custom,option='',color='PINK'): + sr='Checking for %s %s'%(th,msg) + self.line_just=max(self.line_just,len(sr)) + print"%s :"%sr.ljust(self.line_just), + Utils.pprint(color,custom) + self.log.write(sr+'\n\n') + def find_program(self,filename,path_list=[],var=None): + ret=find_program_impl(self.env,filename,path_list,var) + self.check_message('program',filename,ret,ret) + return ret + def __getattr__(self,name): + if name and name.startswith('require_'): + r=self.__class__.__dict__.get(name,None) + if r:return r + for k in['check_','find_']: + n=name.replace('require_',k) + ret=self.__class__.__dict__.get(n,None) + if ret: + def run(*k,**kw): + r=ret(self,*k,**kw) + if not r: + self.fatal('requirement failure') + return r + return run + raise AttributeError,'No such method %r'%name + return getattr(self,name) + def eval_rules(self,rules): + self.rules=Utils.to_list(rules) + for x in self.rules: + f=getattr(self,x) + try: + f() + except Exception,e: + ret=self.err_handler(x,e) + if ret==BREAK: + break + elif ret==CONTINUE: + continue + else: + raise + def err_handler(self,fun,error): + pass +def conf(f): + setattr(ConfigurationContext,f.__name__,f) + return f +def conftest(f): + ConfigurationContext.tests[f.__name__]=f + return conf(f) + --- /dev/null +++ radare-1.5.2/wafadmin/Environment.py @@ -0,0 +1,126 @@ +#! 
/usr/bin/env python +# encoding: utf-8 +import sys +if sys.hexversion < 0x020400f0: from sets import Set as set +import os,copy,re +import Logs,Options +from Constants import* +re_imp=re.compile('^(#)*?([^#=]*?)\ =\ (.*?)$',re.M) +class Environment(object): + __slots__=("table","parent") + def __init__(self): + self.table={} + if Options.commands['configure']: + self.table['PREFIX']=os.path.abspath(Options.options.prefix) + def __contains__(self,key): + if key in self.table:return True + try:return self.parent.__contains__(key) + except AttributeError:return False + def __str__(self): + keys=set() + cur=self + while cur: + keys.update(cur.table.keys()) + cur=getattr(cur,'parent',None) + keys=list(keys) + keys.sort() + return"\n".join(["%r %r"%(x,self.__getitem__(x))for x in keys]) + def set_variant(self,name): + self.table[VARIANT]=name + def variant(self): + env=self + while 1: + try: + return env.table[VARIANT] + except KeyError: + try:env=env.parent + except AttributeError:return DEFAULT + def copy(self): + newenv=Environment() + if Options.commands['configure']: + if self['PREFIX']:del newenv.table['PREFIX'] + newenv.parent=self + return newenv + def __getitem__(self,key): + x=self.table.get(key,None) + if not x is None:return x + try: + u=self.parent + except AttributeError: + return[] + else: + return u[key] + def __setitem__(self,key,value): + self.table[key]=value + def get_flat(self,key): + s=self[key] + if not s:return'' + elif isinstance(s,list):return' '.join(s) + else:return s + def _get_list_value_for_modification(self,key): + try: + value=self.table[key] + except KeyError: + try:value=self.parent[key] + except AttributeError:value=[] + if isinstance(value,list): + value=copy.copy(value) + else: + value=[value] + self.table[key]=value + return value + else: + if not isinstance(value,list): + value=[value] + self.table[key]=value + return value + def append_value(self,var,value): + current_value=self._get_list_value_for_modification(var) + if isinstance(value,list): + current_value.extend(value) + else: + current_value.append(value) + def prepend_value(self,var,value): + current_value=self._get_list_value_for_modification(var) + if isinstance(value,list): + current_value=value+current_value + self.table[var]=current_value + else: + current_value.insert(0,value) + def append_unique(self,var,value): + current_value=self._get_list_value_for_modification(var) + if isinstance(value,list): + for value_item in value: + if value_item not in current_value: + current_value.append(value_item) + else: + if value not in current_value: + current_value.append(value) + def store(self,filename): + file=open(filename,'w') + table_list=[] + env=self + while 1: + table_list.insert(0,env.table) + try:env=env.parent + except AttributeError:break + merged_table={} + for table in table_list: + merged_table.update(table) + keys=merged_table.keys() + keys.sort() + for k in keys:file.write('%s = %r\n'%(k,merged_table[k])) + file.close() + def load(self,filename): + tbl=self.table + file=open(filename,'r') + code=file.read() + file.close() + for m in re_imp.finditer(code): + g=m.group + tbl[g(2)]=eval(g(3)) + Logs.debug('env: %s'%str(self.table)) + def get_destdir(self): + if self.__getitem__('NOINSTALL'):return'' + return Options.options.destdir + --- /dev/null +++ radare-1.5.2/wafadmin/TaskGen.py @@ -0,0 +1,279 @@ +#! 
/usr/bin/env python +# encoding: utf-8 +import sys +if sys.hexversion < 0x020400f0: from sets import Set as set +import os,types,traceback,copy +import Build,Task,Utils,Logs,Options +from Logs import debug,error,warn +from Constants import* +typos={'sources':'source','targets':'target','include':'includes','define':'defines','importpath':'importpaths','install_var':'install_path','install_subdir':'install_path','inst_var':'install_path','inst_dir':'install_path',} +class register_obj(type): + def __init__(cls,name,bases,dict): + super(register_obj,cls).__init__(name,bases,dict) + name=cls.__name__ + suffix='_taskgen' + if name.endswith(suffix): + task_gen.classes[name.replace(suffix,'')]=cls +class task_gen(object): + __metaclass__=register_obj + mappings={} + mapped={} + prec={} + traits={} + classes={} + idx={} + def __init__(self,*kw,**kwargs): + self.prec={} + self.source='' + self.target='' + self.meths=[] + self.mappings={} + self.features=list(kw) + self.tasks=[] + self.chmod=0644 + self._install_path=UNDEFINED + if Options.is_install: + self.inst_files=[] + self.allnodes=[] + self.env=Build.bld.env.copy() + self.path=Build.bld.path + self.name='' + Build.bld.all_task_gen.append(self) + self.idx=task_gen.idx[self.path.id]=task_gen.idx.get(self.path.id,0)+1 + for key,val in kwargs.iteritems(): + setattr(self,key,val) + def __str__(self): + return(""%(self.name or self.target,self.__class__.__name__,str(self.path))) + def __setattr__(self,name,attr): + real=typos.get(name,name) + if real!=name: + warn('typo %s -> %s'%(name,real)) + if Logs.verbose>0: + traceback.print_stack() + object.__setattr__(self,real,attr) + def to_list(self,value): + if type(value)is types.StringType:return value.split() + else:return value + def apply_core(self): + find_resource=self.path.find_resource + for filename in self.to_list(self.source): + x=self.get_hook(filename) + if x: + x(self,filename) + else: + node=find_resource(filename) + if not node:raise Utils.WafError("source not found: '%s' in '%s'"%(filename,str(self.path))) + self.allnodes.append(node) + for node in self.allnodes: + filename=node.name + k=max(0,filename.rfind('.')) + x=self.get_hook(filename[k:]) + if not x: + raise Utils.WafError("Do not know how to process %s in %s, mappings are %s"%(str(node),str(self.__class__),str(self.__class__.mappings))) + x(self,node) + def apply(self): + keys=set(self.meths) + for x in self.features+['*']: + keys.update(task_gen.traits.get(x,())) + prec={} + prec_tbl=self.prec or task_gen.prec + for x in prec_tbl: + if x in keys: + prec[x]=prec_tbl[x] + tmp=[] + for a in keys: + for x in prec.values(): + if a in x:break + else: + tmp.append(a) + out=[] + while tmp: + e=tmp.pop() + if e in keys:out.append(e) + try: + nlst=prec[e] + except KeyError: + pass + else: + del prec[e] + for x in nlst: + for y in prec: + if x in prec[y]: + break + else: + tmp.append(x) + if prec:raise Utils.WafError("graph has a cycle %s"%str(prec)) + out.reverse() + self.meths=out + if not out:out.append(self.apply_core.__name__) + debug('task_gen: posting %s %d'%(self,id(self))) + for x in out: + try: + v=getattr(self,x) + except AttributeError: + raise Utils.WafError("tried to retrieve %s which is not a valid method"%x) + debug('task_gen: -> %s (%d)'%(x,id(self))) + v() + def post(self): + if not self.name:self.name=self.target + if getattr(self,'posted',None): + return + self.apply() + debug('task_gen: posted %s'%self.name) + self.posted=True + def get_hook(self,ext): + try:return self.mappings[ext] + except KeyError: + try:return 
task_gen.mappings[ext] + except KeyError:return None + def create_task(self,name,env=None): + task=Task.TaskBase.classes[name](env or self.env) + self.tasks.append(task) + return task + def find_sources_in_dirs(self,dirnames,excludes=[],exts=[]): + lst=[] + err_msg="'%s' attribute must be a list.\n""Directories should be given either as a string separated by spaces, or as a list." + not_a_list=lambda x:x and type(x)is not types.ListType + if not_a_list(excludes): + raise Utils.WscriptError(err_msg%'excludes') + if not_a_list(exts): + raise Utils.WscriptError(err_msg%'exts') + dirnames=self.to_list(dirnames) + ext_lst=exts or self.mappings.keys()+task_gen.mappings.keys() + for name in dirnames: + anode=self.path.find_dir(name) + if not anode or not anode.is_child_of(Build.bld.srcnode): + raise Utils.WscriptError("Unable to use '%s' - either because it's not a relative path"", or it's not child of '%s'."%(name,Build.bld.srcnode)) + Build.bld.rescan(anode) + for name in Build.bld.cache_dir_contents[anode.id]: + (base,ext)=os.path.splitext(name) + if ext in ext_lst and not name in lst and not name in excludes: + lst.append((anode.relpath_gen(self.path)or'.')+os.path.sep+name) + lst.sort() + self.source=self.to_list(self.source) + if not self.source:self.source=lst + else:self.source+=lst + def clone(self,env): + newobj=task_gen() + for x in self.__dict__: + if x in["env"]: + continue + elif x in["path","features"]: + setattr(newobj,x,getattr(self,x)) + else: + setattr(newobj,x,copy.copy(getattr(self,x))) + newobj.__class__=self.__class__ + if type(env)is types.StringType: + newobj.env=Build.bld.all_envs[env].copy() + else: + newobj.env=env.copy() + Build.bld.all_task_gen.append(newobj) + return newobj + def get_inst_path(self): + k=self._install_path + if k==UNDEFINED: + return getattr(self,'default_install_path',k) + return k + def set_inst_path(self,val): + self._install_path=val + install_path=property(get_inst_path,set_inst_path) +def declare_extension(var,func): + try: + for x in var: + task_gen.mappings[x]=func + except: + raise Utils.WscriptError('declare extension takes either a list or a string %s'%str(var)) + task_gen.mapped[func.__name__]=func +def declare_order(*k): + assert(len(k)>1) + n=len(k)-1 + for i in xrange(n): + f1=k[i] + f2=k[i+1] + try: + if not f1 in task_gen.prec[f2]:task_gen.prec[f2].append(f1) + except: + task_gen.prec[f2]=[f1] +def declare_chain(name='',action='',ext_in='',ext_out='',reentrant=1,color='BLUE',install=0,before=[],after=[],decider=None): + if type(action)is types.StringType: + act=Task.simple_task_type(name,action,color=color) + else: + act=Task.task_type_from_func(name,action,color=color) + name=action.name + act.ext_in=tuple(Utils.to_list(ext_in)) + act.ext_out=tuple(Utils.to_list(ext_out)) + act.before=Utils.to_list(before) + act.after=Utils.to_list(after) + def x_file(self,node): + if decider: + ext=decider(self,node) + elif type(ext_out)is types.StringType: + ext=ext_out + if type(ext)is types.StringType: + out_source=node.change_ext(ext) + if reentrant: + self.allnodes.append(out_source) + elif type(ext)==types.ListType: + out_source=[node.change_ext(x)for x in ext] + if reentrant: + for i in xrange(reentrant): + self.allnodes.append(out_source[i]) + else: + raise Utils.WafError("do not know how to process %s"%str(ext)) + tsk=self.create_task(name) + tsk.set_inputs(node) + tsk.set_outputs(out_source) + if Options.is_install and install: + tsk.install=install + declare_extension(act.ext_in,x_file) +def bind_feature(name,methods): + 
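+# bind_feature(), like the feature/before/after/extension decorators that follow,
+# fills task_gen.traits and task_gen.prec, the tables consulted by task_gen.apply()
+# when ordering the generator methods to run.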
lst=Utils.to_list(methods) + try: + l=task_gen.traits[name] + except KeyError: + l=set() + task_gen.traits[name]=l + l.update(lst) +def taskgen(func): + setattr(task_gen,func.__name__,func) +def feature(*k): + def deco(func): + for name in k: + try: + l=task_gen.traits[name] + except KeyError: + l=set() + task_gen.traits[name]=l + l.update([func.__name__]) + return func + return deco +def before(fun_name): + def deco(func): + try: + if not func.__name__ in task_gen.prec[fun_name]:task_gen.prec[fun_name].append(func.__name__) + except KeyError: + task_gen.prec[fun_name]=[func.__name__] + return func + return deco +def after(fun_name): + def deco(func): + try: + if not fun_name in task_gen.prec[func.__name__]:task_gen.prec[func.__name__].append(fun_name) + except KeyError: + task_gen.prec[func.__name__]=[fun_name] + return func + return deco +def extension(var): + if type(var)is types.ListType: + pass + elif type(var)is types.StringType: + var=[var] + else: + raise Utils.WafError('declare extension takes either a list or a string %s'%str(var)) + def deco(func): + for x in var: + task_gen.mappings[x]=func + task_gen.mapped[func.__name__]=func + return func + return deco + --- /dev/null +++ radare-1.5.2/wafadmin/Runner.py @@ -0,0 +1,143 @@ +#! /usr/bin/env python +# encoding: utf-8 +import sys +if sys.hexversion < 0x020400f0: from sets import Set as set +import sys,random,time,threading,Queue,traceback +import Build,Utils,Logs,Options +import pproc +from Logs import debug,error +from Constants import* +def print_log(msg,nl='\n'): + f=Build.bld.log + if f: + f.write(msg) + f.write(nl) + f.flush() +def printout(s): + if not Build.bld.log: + sys.stdout.write(s) + sys.stdout.flush() + print_log(s,nl='') +def exec_command(s,shell=1): + debug('runner: system command -> %s'%s) + log=Build.bld.log + if log or Logs.verbose:printout(s+'\n') + proc=pproc.Popen(s,shell=shell,stdout=log,stderr=log) + return proc.wait() +if sys.platform=="win32": + old_log=exec_command + def exec_command(s,shell=1): + if len(s)<2000:return old_log(s,shell=shell) + log=Build.bld.log + if log or Logs.verbose:printout(s+'\n') + startupinfo=pproc.STARTUPINFO() + startupinfo.dwFlags|=pproc.STARTF_USESHOWWINDOW + proc=pproc.Popen(s,shell=False,startupinfo=startupinfo) + return proc.wait() +class TaskConsumer(threading.Thread): + def __init__(self,m): + threading.Thread.__init__(self) + self.setDaemon(1) + self.master=m + self.start() + def run(self): + m=self.master + while 1: + tsk=m.ready.get() + if m.stop: + m.out.put(tsk) + continue + try: + printout(tsk.display()) + if tsk.__class__.stat:ret=tsk.__class__.stat(tsk) + else:ret=tsk.call_run() + except Exception,e: + tsk.err_msg=e.message + tsk.hasrun=EXCEPTION + m.error_handler(tsk) + m.out.put(tsk) + continue + if ret: + tsk.err_code=ret + tsk.hasrun=CRASHED + else: + try: + tsk.post_run() + except OSError: + tsk.hasrun=MISSING + else: + tsk.hasrun=SUCCESS + if tsk.hasrun!=SUCCESS: + m.error_handler(tsk) + m.out.put(tsk) +class Parallel(object): + def __init__(self,bld,j=2): + self.numjobs=j + self.manager=bld.task_manager + self.total=self.manager.total() + self.outstanding=[] + self.maxjobs=sys.maxint + self.frozen=[] + self.ready=Queue.Queue(0) + self.out=Queue.Queue(0) + self.count=0 + self.processed=0 + self.consumers=None + self.stop=False + self.error=False + def get_next(self): + return self.outstanding.pop(0) + def postpone(self,tsk): + if random.randint(0,1): + self.frozen.insert(0,tsk) + else: + self.frozen.append(tsk) + def refill_task_list(self): + while 
self.count>self.numjobs+5: + self.get_out() + self.outstanding=self.frozen + self.frozen=[] + if not self.outstanding: + while self.count>0:self.get_out() + (self.maxjobs,self.outstanding)=self.manager.get_next_set() + def get_out(self): + self.manager.add_finished(self.out.get()) + self.count-=1 + def error_handler(self,tsk): + if not Options.options.keep: + self.stop=True + self.error=True + def start(self): + while not self.stop: + while self.count>=self.maxjobs: + self.get_out() + if not self.outstanding: + self.refill_task_list() + if self.maxjobs is None: + break + tsk=self.get_next() + if tsk.hasrun: + self.processed+=1 + self.manager.add_finished(tsk) + try: + st=tsk.runnable_status() + except Exception,e: + tsk.err_msg="TODO print the exception here" + tsk.hasrun=EXCEPTION + self.error_handler(tsk) + if st==ASK_LATER: + self.postpone(tsk) + elif st==SKIP_ME: + self.processed+=1 + tsk.hasrun=SKIPPED + self.manager.add_finished(tsk) + else: + tsk.position=(self.processed,self.total) + self.count+=1 + self.ready.put(tsk) + self.processed+=1 + if not self.consumers: + self.consumers=[TaskConsumer(self)for i in xrange(self.numjobs)] + while self.count: + self.get_out() + --- /dev/null +++ radare-1.5.2/wafadmin/Utils.py @@ -0,0 +1,245 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,sys,imp,types,string,errno,traceback,inspect,re +from UserDict import UserDict +import Logs,pproc +from Constants import* +class WafError(Exception): + def __init__(self,message): + self.message=message + self.stack=traceback.extract_stack() + Exception.__init__(self,self.message) + def __str__(self): + return self.message +class WscriptError(WafError): + def __init__(self,message,wscript_file=None): + self.message='' + if wscript_file: + self.wscript_file=wscript_file + self.wscript_line=None + else: + (self.wscript_file,self.wscript_line)=self.locate_error() + if self.wscript_file: + self.message+="%s:"%self.wscript_file + if self.wscript_line: + self.message+="%s:"%self.wscript_line + self.message+=" error: %s"%message + WafError.__init__(self,self.message) + def locate_error(self): + stack=traceback.extract_stack() + stack.reverse() + for frame in stack: + file_name=os.path.basename(frame[0]) + is_wscript=(file_name==WSCRIPT_FILE or file_name==WSCRIPT_BUILD_FILE) + if is_wscript: + return(frame[0],frame[1]) + return(None,None) +indicator=sys.platform=='win32'and'\x1b[A\x1b[K%s%s%s\r'or'\x1b[K%s%s%s\r' +try: + from fnv import new as md5 + import Constants + Constants.SIG_NIL='signofnv' + def h_file(filename): + m=md5() + try: + m.hfile(filename) + x=m.digest() + if x is None:raise OSError,"not a file" + return x + except SystemError: + raise OSError,"not a file"+filename +except ImportError: + try: + from hashlib import md5 + except ImportError: + from md5 import md5 + def h_file(filename): + f=file(filename,'rb') + m=md5() + readBytes=100000 + while(readBytes): + readString=f.read(readBytes) + m.update(readString) + readBytes=len(readString) + f.close() + return m.digest() +class ordered_dict(UserDict): + def __init__(self,dict=None): + self.allkeys=[] + UserDict.__init__(self,dict) + def __delitem__(self,key): + self.allkeys.remove(key) + UserDict.__delitem__(self,key) + def __setitem__(self,key,item): + if key not in self.allkeys:self.allkeys.append(key) + UserDict.__setitem__(self,key,item) +listdir=os.listdir +if sys.platform=="win32": + def listdir_win32(s): + if not os.path.isdir(s): + e=OSError() + e.errno=errno.ENOENT + raise e + return os.listdir(s) + listdir=listdir_win32 +def 
waf_version(mini=0x010000,maxi=0x100000): + ver=HEXVERSION + try:min_val=mini+0 + except TypeError:min_val=int(mini.replace('.','0'),16) + if min_val>ver: + Logs.error("waf version should be at least %s (%s found)"%(mini,ver)) + sys.exit(0) + try:max_val=maxi+0 + except TypeError:max_val=int(maxi.replace('.','0'),16) + if max_val<ver: + Logs.error("waf version should be at most %s (%s found)"%(maxi,ver)) + sys.exit(0) +def python_24_guard(): + if sys.hexversion<0x020400f0: + raise ImportError,"Waf requires Python >= 2.3 but the raw source requires Python 2.4" +def to_list(sth): + if type(sth)is types.ListType: + return sth + else: + return sth.split() +g_loaded_modules={} +g_module=None +def load_module(file_path,name=WSCRIPT_FILE): + try: + return g_loaded_modules[file_path] + except KeyError: + pass + module=imp.new_module(name) + try: + file=open(file_path,'r') + except(IOError,OSError): + raise WscriptError('The file %s could not be opened!'%file_path) + module_dir=os.path.dirname(file_path) + sys.path.insert(0,module_dir) + exec file in module.__dict__ + sys.path.remove(module_dir) + if file:file.close() + g_loaded_modules[file_path]=module + return module +def set_main_module(file_path): + global g_module + g_module=load_module(file_path,'wscript_main') +def to_hashtable(s): + tbl={} + lst=s.split('\n') + for line in lst: + if not line:continue + mems=line.split('=') + tbl[mems[0]]=mems[1] + return tbl +def get_term_cols(): + return 80 +try: + import struct,fcntl,termios +except ImportError: + pass +else: + if sys.stdout.isatty(): + def myfun(): + dummy_lines,cols=struct.unpack("HHHH",fcntl.ioctl(sys.stdout.fileno(),termios.TIOCGWINSZ,struct.pack("HHHH",0,0,0,0)))[:2] + return cols + try: + myfun() + except IOError: + pass + else: + get_term_cols=myfun +rot_idx=0 +rot_chr=['\\','|','/','-'] +def split_path(path): + if not path:return[''] + return path.split('/') +if sys.platform=='win32': + def split_path(path): + h,t=os.path.splitunc(path) + if not h:return __split_dirs(t) + return[h]+__split_dirs(t)[1:] + def __split_dirs(path): + h,t=os.path.split(path) + if not h:return[t] + if h==path:return[h.replace('\\','')] + if not t:return __split_dirs(h) + else:return __split_dirs(h)+[t] +def copy_attrs(orig,dest,names,only_if_set=False): + for a in to_list(names): + u=getattr(orig,a,()) + if u or not only_if_set: + setattr(dest,a,u) +_quote_define_name_translation=None +def quote_define_name(path): + global _quote_define_name_translation + if _quote_define_name_translation is None: + invalid_chars=[chr(x)for x in xrange(256)] + for valid in string.digits+string.uppercase:invalid_chars.remove(valid) + _quote_define_name_translation=string.maketrans(''.join(invalid_chars),'_'*len(invalid_chars)) + return string.translate(string.upper(path),_quote_define_name_translation) +def quote_whitespace(path): + return(path.strip().find(' ')>0 and'"%s"'%path or path).replace('""','"') +def trimquotes(s): + if not s:return'' + s=s.rstrip() + if s[0]=="'"and s[-1]=="'":return s[1:-1] + return s +def h_list(lst): + m=md5() + m.update(str(lst)) + return m.digest() +def h_fun(fun): + try: + return fun.code + except AttributeError: + try: + hh=inspect.getsource(fun) + except IOError: + hh="nocode" + try: + fun.code=hh + except AttributeError: + pass + return hh +_hash_blacklist_types=(types.BuiltinFunctionType,types.ModuleType,types.FunctionType,types.ClassType,types.TypeType,types.NoneType,) +def hash_function_with_globals(prevhash,func): + assert type(func)is types.FunctionType + for name,value in func.func_globals.iteritems(): + if type(value)in _hash_blacklist_types: + continue + if isinstance(value,type): + continue + try: + prevhash=hash((prevhash,name,value)) + except TypeError: +
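+# hash_function_with_globals(): module-level values that cannot be hashed are
+# simply skipped, so only hashable globals contribute to the signature.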
pass + return hash((prevhash,inspect.getsource(func))) +def pprint(col,str,label=''): + print"%s%s%s %s"%(Logs.colors(col),str,Logs.colors.NORMAL,label) +def check_dir(dir): + try: + os.stat(dir) + except OSError: + try: + os.makedirs(dir) + except OSError,e: + raise WafError("Cannot create folder '%s' (original error: %s)"%(dir,e)) +def cmd_output(cmd,e=None): + p=pproc.Popen(cmd,stdout=pproc.PIPE,shell=True,env=e) + output=p.communicate()[0] + if p.returncode: + msg="command execution failed: %s -> %r"%(cmd,str(output)) + raise ValueError,msg + return output +reg_subst=re.compile(r"(\\\\)|(\\\$)|\$\{([^}]+)\}") +def subst_vars(expr,params): + def repl_var(m): + if m.group(1): + return'\\' + if m.group(2): + return'$' + return params[m.group(3)] + return reg_subst.sub(repl_var,expr) + --- /dev/null +++ radare-1.5.2/wafadmin/Node.py @@ -0,0 +1,341 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,sys,types +import Utils,Build +UNDEFINED=0 +DIR=1 +FILE=2 +BUILD=3 +type_to_string={UNDEFINED:"unk",DIR:"dir",FILE:"src",BUILD:"bld"} +class Node(object): + __slots__=("name","parent","id","childs") + def __init__(self,name,parent,node_type=UNDEFINED): + self.name=name + self.parent=parent + Build.bld.id_nodes+=4 + self.id=Build.bld.id_nodes+node_type + if node_type==DIR:self.childs={} + if Utils.split_path(name)[0]!=name: + raise Utils.WafError('name forbidden '+name) + if parent and name in parent.childs: + raise Utils.WafError('node %s exists in the parent files %s already'%(name,str(parent))) + if parent:parent.childs[name]=self + def __str__(self): + if not self.parent:return'' + return"%s://%s"%(type_to_string[self.id&3],self.abspath()) + def __repr__(self): + return self.__str__() + def __hash__(self): + raise Utils.WafError('nodes, you are doing it wrong') + def get_type(self): + return self.id&3 + def set_type(self,t): + self.id=self.id+t-self.id&3 + def dirs(self): + return[x for x in self.childs.values()if x.id&3==DIR] + def files(self): + return[x for x in self.childs.values()if x.id&3==FILE] + def get_dir(self,name,default=None): + node=self.childs.get(name,None) + if not node or node.id&3!=DIR:return default + return node + def get_file(self,name,default=None): + node=self.childs.get(name,None) + if not node or node.id&3!=FILE:return default + return node + def get_build(self,name,default=None): + node=self.childs.get(name,None) + if not node or node.id&3!=BUILD:return default + return node + def find_resource(self,lst): + if type(lst)is types.StringType: + lst=Utils.split_path(lst) + if not lst[:-1]: + parent=self + else: + parent=self.find_dir(lst[:-1]) + if not parent:return None + Build.bld.rescan(parent) + name=lst[-1] + node=parent.childs.get(name,None) + if node: + tp=node.id&3 + if tp==FILE or tp==BUILD: + return node + tree=Build.bld + if not name in tree.cache_dir_contents[parent.id]: + return None + path=parent.abspath()+os.sep+name + try: + st=Utils.h_file(path) + except IOError: + return None + child=Node(name,parent,FILE) + tree.node_sigs[0][child.id]=st + return child + def find_or_declare(self,lst): + if type(lst)is types.StringType: + lst=Utils.split_path(lst) + if not lst[:-1]: + parent=self + else: + parent=self.find_dir(lst[:-1]) + if not parent:return None + Build.bld.rescan(parent) + name=lst[-1] + node=parent.childs.get(name,None) + if node: + tp=node.id&3 + if tp!=BUILD: + raise Utils.WafError("find_or_declare returns a build node, not a source nor a directory"+str(lst)) + return node + node=Node(name,parent,BUILD) + return node + def 
find_dir(self,lst): + if type(lst)is types.StringType: + lst=Utils.split_path(lst) + current=self + for name in lst: + Build.bld.rescan(current) + prev=current + if not current.parent and name==current.name: + continue + elif not name: + continue + elif name=='.': + continue + elif name=='..': + current=current.parent or current + else: + current=prev.childs.get(name,None) + if current is None: + dir_cont=Build.bld.cache_dir_contents + if prev.id in dir_cont and name in dir_cont[prev.id]: + current=Node(name,prev,DIR) + else: + return None + return current + def ensure_dir_node_from_path(self,lst): + if type(lst)is types.StringType: + lst=Utils.split_path(lst) + current=self + for name in lst: + if not name: + continue + elif name=='.': + continue + elif name=='..': + current=current.parent or current + else: + prev=current + current=prev.childs.get(name,None) + if current is None: + current=Node(name,prev,DIR) + return current + def exclusive_build_node(self,path): + lst=Utils.split_path(path) + name=lst[-1] + if len(lst)>1: + parent=None + try: + parent=self.find_dir(lst[:-1]) + except OSError: + pass + if not parent: + parent=self.ensure_dir_node_from_path(lst[:-1]) + Build.bld.cache_scanned_folders[parent.id]=1 + else: + try: + Build.bld.rescan(parent) + except OSError: + pass + else: + parent=self + node=parent.childs.get(name,None) + if not node: + node=Node(name,parent,BUILD) + return node + def path_to_parent(self,parent): + lst=[] + p=self + h1=parent.height() + h2=p.height() + while h2>h1: + h2-=1 + lst.append(p.name) + p=p.parent + if lst: + lst.reverse() + ret=os.path.join(*lst) + else: + ret='' + return ret + def find_ancestor(self,node): + dist=self.height()-node.height() + if dist<0:return node.find_ancestor(self) + cand=self + while dist>0: + cand=cand.parent + dist-=1 + if cand==node:return cand + cursor=node + while cand.parent: + cand=cand.parent + cursor=cursor.parent + if cand==cursor:return cand + def relpath_gen(self,going_to): + if self==going_to:return'.' + if going_to.parent==self:return'..' 
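+# relpath_gen(): climb from both nodes to their common ancestor, emitting '..'
+# for every level between going_to and the ancestor, then append the node names
+# on the way back down to self.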
+ ancestor=self.find_ancestor(going_to) + lst=[] + cand=self + while not cand.id==ancestor.id: + lst.append(cand.name) + cand=cand.parent + cand=going_to + while not cand.id==ancestor.id: + lst.append('..') + cand=cand.parent + lst.reverse() + return os.sep.join(lst) + def nice_path(self,env=None): + tree=Build.bld + ln=tree.launch_node() + name=self.name + if self.id&3==FILE:return self.relpath_gen(ln) + else:return os.path.join(tree.bldnode.relpath_gen(ln),env.variant(),self.relpath_gen(tree.srcnode)) + def debug(self): + print"========= debug node =============" + print"dirs are ",self.dirs() + print"files are",self.files() + print"======= end debug node ===========" + def is_child_of(self,node): + p=self + diff=self.height()-node.height() + while diff>0: + diff-=1 + p=p.parent + return p.id==node.id + def variant(self,env): + if not env:return 0 + elif self.id&3==FILE:return 0 + else:return env.variant() + def height(self): + d=self + val=0 + while d.parent: + d=d.parent + val+=1 + return val + def abspath(self,env=None): + if not self.name: + return'/' + variant=self.variant(env) + ret=Build.bld.cache_node_abspath[variant].get(self.id,None) + if ret:return ret + if not variant: + if not self.parent: + val=os.sep + elif not self.parent.name: + val=os.sep+self.name + else: + val=self.parent.abspath()+os.sep+self.name + else: + val=os.sep.join((Build.bld.bldnode.abspath(),env.variant(),self.path_to_parent(Build.bld.srcnode))) + Build.bld.cache_node_abspath[variant][self.id]=val + return val + def change_ext(self,ext): + name=self.name + k=name.rfind('.') + if k>=0: + name=name[:k]+ext + else: + name=name+ext + node=self.parent.childs.get(name,None) + if not node: + node=Node(name,self.parent,BUILD) + return node + def src_dir(self,env): + return self.parent.srcpath(env) + def bld_dir(self,env): + return self.parent.bldpath(env) + def bld_base(self,env): + s=self.name + s=s[:s.rfind('.')] + return os.path.join(self.bld_dir(env),s) + def bldpath(self,env=None): + if self.id&3==FILE: + return self.relpath_gen(Build.bld.bldnode) + if self.path_to_parent(Build.bld.srcnode)is not'': + return os.path.join(env.variant(),self.path_to_parent(Build.bld.srcnode)) + return env.variant() + def srcpath(self,env=None): + if self.id&3==BUILD: + return self.bldpath(env) + return self.relpath_gen(Build.bld.bldnode) + def read(self,env): + try: + file=open(self.abspath(env),'rb') + return file.read() + finally: + if file:file.close() + def dir(self,env): + return self.parent.abspath(env) + def file(self): + return self.name + def file_base(self): + s=self.name + if s.rfind('.')<0: + return s + return s[:s.rfind('.')] + def suffix(self): + s=self.name + if s.rfind('.')<0: + return s + return s[s.rfind('.'):] +if sys.platform=="win32": + def find_dir_win32(self,lst): + if type(lst)is types.StringType: + lst=Utils.split_path(lst) + current=self + for name in lst: + Build.bld.rescan(current) + prev=current + if not current.parent and name==current.name: + continue + if not name: + continue + elif name=='.': + continue + elif name=='..': + current=current.parent or current + else: + current=prev.childs.get(name,None) + if current is None: + if(name in Build.bld.cache_dir_contents[prev.id]or(not prev.parent and name[1]==":")): + current=Node(name,prev,DIR) + else: + return None + return current + Node.find_dir=find_dir_win32 + def abspath_win32(self,env=None): + variant=self.variant(env) + ret=Build.bld.cache_node_abspath[variant].get(self.id,None) + if ret:return ret + if not variant: + cur=self + lst=[] + while 
cur: + lst.append(cur.name) + cur=cur.parent + lst.reverse() + val=os.sep.join(lst) + else: + val=os.sep.join((Build.bld.bldnode.abspath(),env.variant(),self.path_to_parent(Build.bld.srcnode))) + if val.startswith("\\"):val=val[1:] + if val.startswith("\\"):val=val[1:] + Build.bld.cache_node_abspath[variant][self.id]=val + return val + Node.abspath=abspath_win32 + --- /dev/null +++ radare-1.5.2/wafadmin/Logs.py @@ -0,0 +1,80 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,re,logging,traceback,sys,Utils +from Constants import* +zones='' +verbose=0 +colors_lst={'USE':True,'BOLD':'\x1b[01;1m','RED':'\x1b[01;91m','GREEN':'\x1b[32m','YELLOW':'\x1b[33m','PINK':'\x1b[35m','BLUE':'\x1b[01;34m','CYAN':'\x1b[36m','NORMAL':'\x1b[0m','cursor_on':'\x1b[?25h','cursor_off':'\x1b[?25l',} +if(sys.platform=='win32')or('NOCOLOR'in os.environ)or(os.environ.get('TERM','dumb')in['dumb','emacs'])or(not sys.stdout.isatty()): + colors_lst['USE']=False +def get_color(cl): + if not colors_lst['USE']:return'' + return colors_lst.get(cl,'') +class foo(object): + def __getattr__(self,a): + return get_color(a) + def __call__(self,a): + return get_color(a) +colors=foo() +re_log=re.compile(r'(\w+): (.*)',re.M) +class log_filter(logging.Filter): + def __init__(self,name=None): + pass + def filter(self,rec): + rec.c1=colors.PINK + rec.c2=colors.NORMAL + rec.zone=rec.module + if rec.levelno>=logging.INFO: + if rec.levelno>=logging.ERROR: + rec.c1=colors.RED + else: + rec.c1=colors.GREEN + return True + zone='' + m=re_log.match(rec.msg) + if m: + zone=rec.zone=m.group(1) + rec.msg=m.group(2) + if zones: + return getattr(rec,'zone','')in zones or'*'in zones + elif not verbose>2: + return False + return True +class formatter(logging.Formatter): + def __init__(self): + logging.Formatter.__init__(self,LOG_FORMAT,HOUR_FORMAT) + def format(self,rec): + if rec.levelno>=logging.WARNING or rec.levelno==logging.INFO: + return'%s%s%s'%(rec.c1,rec.msg,rec.c2) + return logging.Formatter.format(self,rec) +def debug(msg): + msg=msg.replace('\n',' ') + if verbose: + logging.debug(msg) +def error(msg): + logging.error(msg) + if verbose: + if isinstance(msg,Utils.WafError): + st=msg.stack + else: + st=traceback.extract_stack() + if st: + st=st[:-1] + buf=[] + for filename,lineno,name,line in st: + buf.append(' File "%s", line %d, in %s'%(filename,lineno,name)) + if line: + buf.append(' %s'%line.strip()) + if buf:logging.error("\n".join(buf)) +warn=logging.warn +info=logging.info +def init_log(): + log=logging.getLogger() + log.handlers=[] + hdlr=logging.StreamHandler() + hdlr.setFormatter(formatter()) + log.addHandler(hdlr) + log.addFilter(log_filter()) + log.setLevel(logging.DEBUG) + --- /dev/null +++ radare-1.5.2/wafadmin/Build.py @@ -0,0 +1,495 @@ +#! 
/usr/bin/env python +# encoding: utf-8 +import sys +if sys.hexversion < 0x020400f0: from sets import Set as set +import os,sys,cPickle,types,imp,errno,re,glob,gc,time,shutil +import Runner,TaskGen,Node,Scripting,Utils,Environment,Task,Logs,Options +from Logs import debug,error,info +from Constants import* +SAVED_ATTRS='root srcnode bldnode node_sigs node_deps raw_deps task_sigs id_nodes'.split() +g_modcache={} +bld=None +class BuildError(Utils.WafError): + def __init__(self,b=None,t=[]): + self.bld=b + self.tasks=t + self.ret=1 + Utils.WafError.__init__(self,self.format_error()) + def format_error(self): + lst=['Build failed'] + for tsk in self.tasks: + txt=tsk.format_error() + if txt:lst.append(txt) + return'\n'.join(lst) +class BuildContext(object): + def __init__(self): + global bld + bld=self + self.task_manager=Task.TaskManager() + self.id_nodes=0 + self.all_envs={} + self.bdir='' + self.path=None + self.cache_node_abspath={} + self.cache_scanned_folders={} + self.uninstall=[] + for v in'cache_node_abspath task_sigs node_deps raw_deps node_sigs'.split(): + var={} + setattr(self,v,var) + self.cache_dir_contents={} + self.all_task_gen=[] + self.task_gen_cache_names={} + self.cache_sig_vars={} + self.log=None + self.root=None + self.srcnode=None + self.bldnode=None + def load(self): + code='' + try: + file=open(os.path.join(self.cachedir,'build.config.py'),'r') + code=file.read() + file.close() + except(IOError,OSError): + pass + else: + re_imp=re.compile('^(#)*?([^#=]*?)\ =\ (.*?)$',re.M) + for m in re_imp.finditer(code): + g=m.group + if g(2)=='version': + if eval(g(3))1:raise + else:sys.exit(68) + except Exception: + dw() + raise + else: + dw() + self.save() + if self.generator.error: + os.chdir(self.srcnode.abspath()) + raise BuildError(self,self.task_manager.tasks_done) + os.chdir(self.srcnode.abspath()) + def install(self): + debug('build: install called') + self.flush() + if Options.commands['uninstall']: + lst=[] + for x in self.uninstall: + dir=os.path.dirname(x) + if not dir in lst:lst.append(dir) + lst.sort() + lst.reverse() + nlst=[] + for y in lst: + x=y + while len(x)>4: + if not x in nlst:nlst.append(x) + x=os.path.dirname(x) + nlst.sort() + nlst.reverse() + for x in nlst: + try:os.rmdir(x) + except OSError:pass + def add_subdirs(self,dirs): + for dir in Utils.to_list(dirs): + if dir:Scripting.add_subdir(dir,self) + def new_task_gen(self,*k,**kw): + if len(k)==0:return TaskGen.task_gen() + cls_name=k[0] + try:cls=TaskGen.task_gen.classes[cls_name] + except KeyError:raise Utils.WscriptError('%s is not a valid task generator -> %s'%(cls_name,[x for x in TaskGen.task_gen.classes])) + else:return cls(*k,**kw) + def load_envs(self): + try: + lst=Utils.listdir(self.cachedir) + except OSError,e: + if e.errno==errno.ENOENT: + raise Utils.WafError('The project was not configured: run "waf configure" first!') + else: + raise + if not lst: + raise Utils.WafError('The cache directory is empty: reconfigure the project') + for file in lst: + if file.endswith(CACHE_SUFFIX): + env=Environment.Environment() + env.load(os.path.join(self.cachedir,file)) + name=file.split('.')[0] + self.all_envs[name]=env + self.init_variants() + for env in self.all_envs.values(): + for f in env['dep_files']: + newnode=self.srcnode.find_or_declare(f) + try: + hash=Utils.h_file(newnode.abspath(env)) + except(IOError,AttributeError): + error("cannot find "+f) + hash=SIG_NIL + self.node_sigs[env.variant()][newnode.id]=hash + def setup(self,tool,tooldir=None,funs=None): + if type(tool)is types.ListType: + for i 
in tool:self.setup(i,tooldir) + return + if not tooldir:tooldir=Options.tooldir + file=None + key=str((tool,tooldir)) + module=g_modcache.get(key,None) + if not module: + file,name,desc=imp.find_module(tool,tooldir) + module=imp.load_module(tool,file,name,desc) + g_modcache[key]=module + if hasattr(module,"setup"):module.setup(self) + if file:file.close() + def init_variants(self): + debug('build: init variants') + lstvariants=[] + for env in self.all_envs.values(): + if not env.variant()in lstvariants: + lstvariants.append(env.variant()) + self._variants=lstvariants + debug('build: list of variants is %s'%str(lstvariants)) + for name in lstvariants+[0]: + for v in'node_sigs cache_node_abspath'.split(): + var=getattr(self,v) + if not name in var: + var[name]={} + def load_dirs(self,srcdir,blddir): + assert(os.path.isabs(srcdir)) + assert(os.path.isabs(blddir)) + self.cachedir=os.path.join(blddir,CACHE_DIR) + if srcdir==blddir: + raise Utils.WafError("build dir must be different from srcdir: %s <-> %s "%(srcdir,blddir)) + self.bdir=blddir + self.load() + if not self.root: + self.root=Node.Node('',None,Node.DIR) + if not self.srcnode: + self.srcnode=self.root.ensure_dir_node_from_path(srcdir) + debug('build: srcnode is %s and srcdir %s'%(str(self.srcnode.name),srcdir)) + self.path=self.srcnode + try:os.makedirs(blddir) + except OSError:pass + if not self.bldnode: + self.bldnode=self.root.ensure_dir_node_from_path(blddir) + self.init_variants() + def rescan(self,src_dir_node): + if self.cache_scanned_folders.get(src_dir_node.id,None):return + self.cache_scanned_folders[src_dir_node.id]=1 + if hasattr(self,'repository'):self.repository(src_dir_node) + if sys.platform=="win32"and not src_dir_node.name: + return + self.listdir_src(src_dir_node,src_dir_node.abspath()) + h1=self.srcnode.height() + h2=src_dir_node.height() + lst=[] + child=src_dir_node + while h2>h1: + lst.append(child.name) + child=child.parent + h2-=1 + lst.reverse() + for variant in self._variants: + sub_path=os.path.join(self.bldnode.abspath(),variant,*lst) + try: + self.listdir_bld(src_dir_node,sub_path,variant) + except OSError: + dict=self.node_sigs[variant] + for node in src_dir_node.childs.values(): + if node.id in dict: + dict.__delitem__(node.id) + if node.id!=self.bldnode.id: + src_dir_node.childs.__delitem__(node.name) + os.makedirs(sub_path) + def listdir_src(self,parent_node,path): + listed_files=set(Utils.listdir(path)) + self.cache_dir_contents[parent_node.id]=listed_files + debug('build: folder contents '+str(listed_files)) + node_names=set([x.name for x in parent_node.childs.values()if x.id&3==Node.FILE]) + cache=self.node_sigs[0] + to_keep=listed_files&node_names + for x in to_keep: + node=parent_node.childs[x] + try: + cache[node.id]=Utils.h_file(path+os.sep+node.name) + except IOError: + raise Utils.WafError("The file %s is not readable or has become a dir"%node.abspath()) + to_remove=node_names-listed_files + if to_remove: + cache=self.node_sigs[0] + for name in to_remove: + nd=parent_node.childs[name] + if nd.id in cache: + cache.__delitem__(nd.id) + parent_node.childs.__delitem__(name) + def listdir_bld(self,parent_node,path,variant): + i_existing_nodes=[x for x in parent_node.childs.values()if x.id&3==Node.BUILD] + listed_files=set(Utils.listdir(path)) + node_names=set([x.name for x in i_existing_nodes]) + remove_names=node_names-listed_files + ids_to_remove=[x.id for x in i_existing_nodes if x.name in remove_names] + cache=self.node_sigs[variant] + for nid in ids_to_remove: + if nid in cache: + 
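+# build nodes whose files vanished from the build directory are dropped from the
+# signature cache of this variant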
cache.__delitem__(nid) + def get_env(self): + return self.env_of_name('default') + def set_env(self,name,val): + self.all_envs[name]=val + env=property(get_env,set_env) + def add_manual_dependency(self,path,value): + h=getattr(self,'deps_man',{}) + if os.path.isabs(path): + node=self.root.find_resource(path) + else: + node=self.path.find_resource(path) + h[node]=value + self.deps_man=h + def launch_node(self): + try: + return self.p_ln + except AttributeError: + self.p_ln=self.root.find_dir(Options.launch_dir) + return self.p_ln + def glob(self,pattern,relative=True): + path=self.path.abspath() + files=[self.root.find_resource(x)for x in glob.glob(path+os.sep+pattern)] + if relative: + files=[x.path_to_parent(self.path)for x in files if x] + else: + files=[x.abspath()for x in files if x] + return files + def add_group(self): + self.flush(all=0) + self.task_manager.add_group() + def hash_env_vars(self,env,vars_lst): + idx=str(id(env))+str(vars_lst) + try:return self.cache_sig_vars[idx] + except KeyError:pass + lst=[env.get_flat(a)for a in vars_lst] + ret=Utils.h_list(lst) + debug("envhash: %s %s"%(ret.encode('hex'),str(lst))) + self.cache_sig_vars[idx]=ret + return ret + def name_to_obj(self,name): + cache=self.task_gen_cache_names + if not cache: + for x in self.all_task_gen: + if x.name: + cache[x.name]=x + elif not self.task_gen_cache_names.get(x,''): + cache[x.target]=x + return cache.get(name,None) + def flush(self,all=1): + self.ini=time.time() + self.task_gen_cache_names={} + self.name_to_obj('') + debug('build: delayed operation TaskGen.flush() called') + if Options.options.compile_targets: + debug('task_gen: posting objects listed in compile_targets') + targets_objects={} + for target_name in Options.options.compile_targets.split(','): + target_name=target_name.strip() + targets_objects[target_name]=self.name_to_obj(target_name) + if all and not targets_objects[target_name]:raise Utils.WafError("target '%s' does not exist"%target_name) + for target_obj in targets_objects.values(): + if target_obj: + target_obj.post() + else: + debug('task_gen: posting objects (normal)') + ln=self.launch_node() + if ln.is_child_of(self.bldnode)or not ln.is_child_of(self.srcnode): + ln=self.srcnode + for obj in self.all_task_gen: + if not obj.path.is_child_of(ln):continue + obj.post() + def env_of_name(self,name): + if not name: + error('env_of_name called with no name!') + return None + try: + return self.all_envs[name] + except KeyError: + error('no such environment: '+name) + return None + def progress_line(self,state,total,col1,col2): + n=len(str(total)) + Utils.rot_idx+=1 + ind=Utils.rot_chr[Utils.rot_idx%4] + ini=self.ini + pc=(100.*state)/total + eta=time.strftime('%H:%M:%S',time.gmtime(time.time()-ini)) + fs="[%%%dd/%%%dd][%%s%%2d%%%%%%s][%s]["%(n,n,ind) + left=fs%(state,total,col1,pc,col2) + right='][%s%s%s]'%(col1,eta,col2) + cols=Utils.get_term_cols()-len(left)-len(right)+2*len(col1)+2*len(col2) + if cols<7:cols=7 + ratio=int((cols*state)/total)-1 + bar=('='*ratio+'>').ljust(cols) + msg=Utils.indicator%(left,bar,right) + return msg + def do_install(self,src,tgt,chmod=0644): + if Options.commands['install']: + if not Options.options.force: + try: + t1=os.stat(tgt).st_mtime + t2=os.stat(src).st_mtime + except OSError: + pass + else: + if t1>=t2: + return False + srclbl=src.replace(self.srcnode.abspath(None)+os.sep,'') + info("* installing %s as %s"%(srclbl,tgt)) + try:os.remove(tgt) + except OSError:pass + try: + shutil.copy2(src,tgt) + os.chmod(tgt,chmod) + except IOError: + try: + 
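+# if the copy failed, stat the source first so a missing source file is reported
+# separately from a failed installation of tgt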
os.stat(src) + except IOError: + error('File %r does not exist'%src) + raise Utils.WafError('Could not install the file %r'%tgt) + return True + elif Options.commands['uninstall']: + info("* uninstalling %s"%tgt) + self.uninstall.append(tgt) + try:os.remove(tgt) + except OSError:pass + return True + def get_install_path(self,path,env=None): + if not env:env=self.env + destdir=env.get_destdir() + destpath=Utils.subst_vars(path,env) + if destdir: + destpath=os.path.join(destdir,destpath.lstrip(os.sep)) + return destpath + def install_files(self,path,files,env=None,chmod=0644,relative_trick=False): + if not Options.is_install:return[] + if not path:return[] + node=self.path + if type(files)is types.StringType and'*'in files: + gl=node.abspath()+os.sep+files + lst=glob.glob(gl) + else: + lst=Utils.to_list(files) + env=env or self.env + destpath=self.get_install_path(path,env) + Utils.check_dir(destpath) + installed_files=[] + for filename in lst: + if not os.path.isabs(filename): + nd=node.find_resource(filename) + if not nd: + raise Utils.WafError("Unable to install the file `%s': not found in %s"%(filename,node)) + if relative_trick: + destfile=os.path.join(destpath,filename) + Utils.check_dir(os.path.dirname(destfile)) + else: + destfile=os.path.join(destpath,nd.name) + filename=nd.abspath(env) + else: + alst=Utils.split_path(filename) + destfile=os.path.join(destpath,alst[-1]) + if self.do_install(filename,destfile,chmod): + installed_files.append(destfile) + return installed_files + def install_as(self,path,srcfile,env=None,chmod=0644): + if not Options.is_install:return False + if not path:return False + if not env:env=self.env + node=self.path + destpath=self.get_install_path(path,env) + dir,name=os.path.split(destpath) + Utils.check_dir(dir) + if not os.path.isabs(srcfile): + filenode=node.find_resource(srcfile) + src=filenode.abspath(env) + else: + src=srcfile + return self.do_install(src,destpath,chmod) + def symlink_as(self,path,src,env=None): + if not Options.is_install:return + if not path:return + tgt=self.get_install_path(path,env) + dir,name=os.path.split(tgt) + Utils.check_dir(dir) + if Options.commands['install']: + try: + if not os.path.islink(tgt)or os.readlink(tgt)!=src: + info("* symlink %s (-> %s)"%(tgt,src)) + os.symlink(src,tgt) + return 0 + except OSError: + return 1 + elif Options.commands['uninstall']: + try: + info("* removing %s"%(tgt)) + os.remove(tgt) + return 0 + except OSError: + return 1 + --- /dev/null +++ radare-1.5.2/wafadmin/Constants.py @@ -0,0 +1,41 @@ +#! /usr/bin/env python +# encoding: utf-8 + +HEXVERSION=0x10409 +WAFVERSION="1.4.9" +ABI=6 +CACHE_DIR='c4che' +CACHE_SUFFIX='.cache.py' +DBFILE='.wafpickle-%d'%ABI +WSCRIPT_FILE='wscript' +WSCRIPT_BUILD_FILE='wscript_build' +WAF_CONFIG_LOG='config.log' +WAF_CONFIG_H='config.h' +COMMON_INCLUDES='COMMON_INCLUDES' +SIG_NIL='iluvcuteoverload' +VARIANT='_VARIANT_' +DEFAULT='default' +SRCDIR='srcdir' +BLDDIR='blddir' +APPNAME='APPNAME' +VERSION='VERSION' +DEFINES='defines' +UNDEFINED='#undefined#variable#for#defines#' +BREAK="break" +CONTINUE="continue" +JOBCONTROL="JOBCONTROL" +MAXPARALLEL="MAXPARALLEL" +NORMAL="NORMAL" +NOT_RUN=0 +MISSING=1 +CRASHED=2 +EXCEPTION=3 +SKIPPED=8 +SUCCESS=9 +ASK_LATER=-1 +SKIP_ME=-2 +RUN_ME=-3 +LOG_FORMAT="%(asctime)s %(c1)s%(zone)s%(c2)s %(message)s" +HOUR_FORMAT="%H:%M:%S" +TEST_OK=True + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/__init__.py @@ -0,0 +1,4 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/libtool.py @@ -0,0 +1,244 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import sys,re,os,optparse +import TaskGen,Task,Utils,preproc +from Logs import error,debug,warn +from TaskGen import taskgen,after,before,feature +REVISION="0.1.3" +fakelibtool_vardeps=['CXX','PREFIX'] +def fakelibtool_build(task): + env=task.env + dest=open(task.outputs[0].abspath(env),'w') + sname=task.inputs[0].name + fu=dest.write + fu("# Generated by ltmain.sh - GNU libtool 1.5.18 - (pwn3d by BKsys II code name WAF)\n") + if env['vnum']: + nums=env['vnum'].split('.') + libname=task.inputs[0].name + name3=libname+'.'+env['vnum'] + name2=libname+'.'+nums[0] + name1=libname + fu("dlname='%s'\n"%name2) + strn=" ".join([name3,name2,name1]) + fu("library_names='%s'\n"%(strn)) + else: + fu("dlname='%s'\n"%sname) + fu("library_names='%s %s %s'\n"%(sname,sname,sname)) + fu("old_library=''\n") + vars=' '.join(env['libtoolvars']+env['LINKFLAGS']) + fu("dependency_libs='%s'\n"%vars) + fu("current=0\n") + fu("age=0\nrevision=0\ninstalled=yes\nshouldnotlink=no\n") + fu("dlopen=''\ndlpreopen=''\n") + fu("libdir='%s/lib'\n"%env['PREFIX']) + dest.close() + return 0 +def read_la_file(path): + sp=re.compile(r'^([^=]+)=\'(.*)\'$') + dc={} + file=open(path,"r") + for line in file.readlines(): + try: + _,left,right,_=sp.split(line.strip()) + dc[left]=right + except ValueError: + pass + file.close() + return dc +def apply_link_libtool(self): + if self.type!='program': + linktask=self.link_task + latask=self.create_task('fakelibtool') + latask.set_inputs(linktask.outputs) + latask.set_outputs(linktask.outputs[0].change_ext('.la')) + self.latask=latask + if Options.commands['install']or Options.commands['uninstall']: + Build.bld.install_files('PREFIX','lib',linktask.outputs[0].abspath(self.env),self.env) +def apply_libtool(self): + self.env['vnum']=self.vnum + paths=[] + libs=[] + libtool_files=[] + libtool_vars=[] + for l in self.env['LINKFLAGS']: + if l[:2]=='-L': + paths.append(l[2:]) + elif l[:2]=='-l': + libs.append(l[2:]) + for l in libs: + for p in paths: + dict=read_la_file(p+'/lib'+l+'.la') + linkflags2=dict.get('dependency_libs','') + for v in linkflags2.split(): + if v.endswith('.la'): + libtool_files.append(v) + libtool_vars.append(v) + continue + self.env.append_unique('LINKFLAGS',v) + break + self.env['libtoolvars']=libtool_vars + while libtool_files: + file=libtool_files.pop() + dict=read_la_file(file) + for v in dict['dependency_libs'].split(): + if v[-3:]=='.la': + libtool_files.append(v) + continue + self.env.append_unique('LINKFLAGS',v) +Task.task_type_from_func('fakelibtool',vars=fakelibtool_vardeps,func=fakelibtool_build,color='BLUE',after="cc_link cxx_link ar_link_static") +class libtool_la_file: + def __init__(self,la_filename): + self.__la_filename=la_filename + self.linkname=str(os.path.split(la_filename)[-1])[:-3] + if self.linkname.startswith("lib"): + self.linkname=self.linkname[3:] + self.dlname=None + self.library_names=None + self.old_library=None + self.dependency_libs=None + self.current=None + self.age=None + self.revision=None + self.installed=None + self.shouldnotlink=None + self.dlopen=None + self.dlpreopen=None + self.libdir='/usr/lib' + if not self.__parse(): + raise"file %s not found!!"%(la_filename) + def __parse(self): + if not os.path.isfile(self.__la_filename):return 0 + la_file=open(self.__la_filename,'r') + for line in la_file: + ln=line.strip() + if not ln:continue + if ln[0]=='#':continue + 
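+# every remaining line of the .la file is a key='value' assignment; values are
+# coerced to booleans/integers where possible and stored as attributes of this object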
(key,value)=str(ln).split('=',1) + key=key.strip() + value=value.strip() + if value=="no":value=False + elif value=="yes":value=True + else: + try:value=int(value) + except ValueError:value=value.strip("'") + setattr(self,key,value) + la_file.close() + return 1 + def get_libs(self): + libs=[] + if self.dependency_libs: + libs=str(self.dependency_libs).strip().split() + if libs==None: + libs=[] + libs.insert(0,"-l%s"%self.linkname.strip()) + libs.insert(0,"-L%s"%self.libdir.strip()) + return libs + def __str__(self): + return'''\ +dlname = "%(dlname)s" +library_names = "%(library_names)s" +old_library = "%(old_library)s" +dependency_libs = "%(dependency_libs)s" +version = %(current)s.%(age)s.%(revision)s +installed = "%(installed)s" +shouldnotlink = "%(shouldnotlink)s" +dlopen = "%(dlopen)s" +dlpreopen = "%(dlpreopen)s" +libdir = "%(libdir)s"'''%self.__dict__ +class libtool_config: + def __init__(self,la_filename): + self.__libtool_la_file=libtool_la_file(la_filename) + tmp=self.__libtool_la_file + self.__version=[int(tmp.current),int(tmp.age),int(tmp.revision)] + self.__sub_la_files=[] + self.__sub_la_files.append(la_filename) + self.__libs=None + def __cmp__(self,other): + if not other: + return 1 + othervers=[int(s)for s in str(other).split(".")] + selfvers=self.__version + return cmp(selfvers,othervers) + def __str__(self): + return"\n".join([str(self.__libtool_la_file),' '.join(self.__libtool_la_file.get_libs()),'* New getlibs:',' '.join(self.get_libs())]) + def __get_la_libs(self,la_filename): + return libtool_la_file(la_filename).get_libs() + def get_libs(self): + libs_list=list(self.__libtool_la_file.get_libs()) + libs_map={} + while len(libs_list)>0: + entry=libs_list.pop(0) + if entry: + if str(entry).endswith(".la"): + if entry not in self.__sub_la_files: + self.__sub_la_files.append(entry) + libs_list.extend(self.__get_la_libs(entry)) + else: + libs_map[entry]=1 + self.__libs=libs_map.keys() + return self.__libs + def get_libs_only_L(self): + if not self.__libs:self.get_libs() + libs=self.__libs + libs=filter(lambda s:str(s).startswith('-L'),libs) + return libs + def get_libs_only_l(self): + if not self.__libs:self.get_libs() + libs=self.__libs + libs=filter(lambda s:str(s).startswith('-l'),libs) + return libs + def get_libs_only_other(self): + if not self.__libs:self.get_libs() + libs=self.__libs + libs=filter(lambda s:not(str(s).startswith('-L')or str(s).startswith('-l')),libs) + return libs +def useCmdLine(): + usage='''Usage: %prog [options] PathToFile.la +example: %prog --atleast-version=2.0.0 /usr/lib/libIlmImf.la +nor: %prog --libs /usr/lib/libamarok.la''' + parser=optparse.OptionParser(usage) + a=parser.add_option + a("--version",dest="versionNumber",action="store_true",default=False,help="output version of libtool-config") + a("--debug",dest="debug",action="store_true",default=False,help="enable debug") + a("--libs",dest="libs",action="store_true",default=False,help="output all linker flags") + a("--libs-only-l",dest="libs_only_l",action="store_true",default=False,help="output -l flags") + a("--libs-only-L",dest="libs_only_L",action="store_true",default=False,help="output -L flags") + a("--libs-only-other",dest="libs_only_other",action="store_true",default=False,help="output other libs (e.g. 
-pthread)") + a("--atleast-version",dest="atleast_version",default=None,help="return 0 if the module is at least version ATLEAST_VERSION") + a("--exact-version",dest="exact_version",default=None,help="return 0 if the module is exactly version EXACT_VERSION") + a("--max-version",dest="max_version",default=None,help="return 0 if the module is no newer than version MAX_VERSION") + (options,args)=parser.parse_args() + if len(args)!=1 and not options.versionNumber: + parser.error("incorrect number of arguments") + if options.versionNumber: + print"libtool-config version %s"%REVISION + return 0 + ltf=libtool_config(args[0]) + if options.debug: + print(ltf) + if options.atleast_version: + if ltf>=options.atleast_version:return 0 + sys.exit(1) + if options.exact_version: + if ltf==options.exact_version:return 0 + sys.exit(1) + if options.max_version: + if ltf<=options.max_version:return 0 + sys.exit(1) + def p(x): + print" ".join(x) + if options.libs:p(ltf.get_libs()) + elif options.libs_only_l:p(ltf.get_libs_only_l()) + elif options.libs_only_L:p(ltf.get_libs_only_L()) + elif options.libs_only_other:p(ltf.get_libs_only_other()) + return 0 +if __name__=='__main__': + useCmdLine() + +taskgen(apply_link_libtool) +feature("libtool")(apply_link_libtool) +after('apply_link')(apply_link_libtool) +taskgen(apply_libtool) +feature("libtool")(apply_libtool) +before('apply_core')(apply_libtool) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/checks.py @@ -0,0 +1,214 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import Utils,Configure,config_c +from Configure import conf +endian_str=''' +#include <stdio.h> +int is_big_endian() +{ + long one = 1; + return !(*((char *)(&one))); +} +int main() +{ + if (is_big_endian()) printf("bigendian=1\\n"); + else printf("bigendian=0\\n"); + return 0; +} +''' +class compile_configurator(config_c.configurator_base): + def __init__(self,conf): + config_c.configurator_base.__init__(self,conf) + self.name='' + self.code='' + self.flags='' + self.define='' + self.uselib='' + self.want_message=0 + self.msg='' + self.force_compiler=None + def error(self): + raise Configure.ConfigurationError('test program would not run') + def run_test(self): + obj=config_c.check_data() + obj.code=self.code + obj.env=self.env + obj.uselib=self.uselib + obj.flags=self.flags + if self.force_compiler:obj.force_compiler=self.force_compiler + ret=self.conf.run_check(obj) + if self.want_message: + self.conf.check_message('compile code','',not(ret is False),option=self.msg) + return ret +def create_compile_configurator(self): + return compile_configurator(self) +def checkEndian(self,define='',pathlst=[]): + if define=='':define='IS_BIGENDIAN' + if self.is_defined(define):return self.get_define(define) + global endian + test=self.create_test_configurator() + test.code=endian_str + code=test.run()['result'] + t=Utils.to_hashtable(code) + try: + is_big=int(t['bigendian']) + except KeyError: + raise Configure.ConfigurationError('endian test failed '+code) + if is_big:strbig='big endian' + else:strbig='little endian' + self.check_message_custom('endianness','',strbig) + self.define_cond(define,is_big) + return is_big +features_str=''' +#include <stdio.h> +int is_big_endian() +{ + long one = 1; + return !(*((char *)(&one))); +} +int main() +{ + if (is_big_endian()) printf("bigendian=1\\n"); + else printf("bigendian=0\\n"); + printf("int_size=%d\\n", sizeof(int)); + printf("long_int_size=%d\\n", sizeof(long int)); + printf("long_long_int_size=%d\\n", sizeof(long long int)); + printf("double_size=%d\\n", sizeof(double)); + 
return 0; +} +''' +def checkFeatures(self,lst=[],pathlst=[]): + global endian + test=self.create_test_configurator() + test.code=features_str + code=test.run()['result'] + t=Utils.to_hashtable(code) + try: + is_big=int(t['bigendian']) + except KeyError: + raise Configure.ConfigurationError('endian test failed '+code) + if is_big:strbig='big endian' + else:strbig='little endian' + self.check_message_custom('endianness','',strbig) + self.check_message_custom('int size','',t['int_size']) + self.check_message_custom('long int size','',t['long_int_size']) + self.check_message_custom('long long int size','',t['long_long_int_size']) + self.check_message_custom('double size','',t['double_size']) + self.define_cond('IS_BIGENDIAN',is_big) + self.define_cond('INT_SIZE',int(t['int_size'])) + self.define_cond('LONG_INT_SIZE',int(t['long_int_size'])) + self.define_cond('LONG_LONG_INT_SIZE',int(t['long_long_int_size'])) + self.define_cond('DOUBLE_SIZE',int(t['double_size'])) + return is_big +def detect_platform(self): + import os,sys + if os.name=='posix': + if sys.platform=='cygwin': + return'cygwin' + if str.find(sys.platform,'linux')!=-1: + return'linux' + if str.find(sys.platform,'irix')!=-1: + return'irix' + if str.find(sys.platform,'sunos')!=-1: + return'sunos' + if str.find(sys.platform,'hp-ux')!=-1: + return'hpux' + if str.find(sys.platform,'aix')!=-1: + return'aix' + if str.find(sys.platform,'darwin')!=-1: + return'darwin' + return'posix' + elif os.name=='os2': + return'os2' + elif os.name=='java': + return'java' + else: + return sys.platform +def find_header(self,header,define='',paths=''): + if not define: + define=self.have_define(header) + test=self.create_header_enumerator() + test.mandatory=1 + test.name=header + test.path=paths + test.define=define + return test.run() +def check_header(self,header,define='',mandatory=0): + if not define: + define=self.have_define(header) + test=self.create_header_configurator() + test.name=header + test.define=define + test.mandatory=mandatory + return test.run() +def try_build_and_exec(self,code,uselib=''): + test=self.create_test_configurator() + test.uselib=uselib + test.code=code + ret=test.run() + if ret:return ret['result'] + return None +def try_build(self,code,uselib='',msg='',force_compiler=''): + test=self.create_compile_configurator() + test.uselib=uselib + test.code=code + if force_compiler: + test.force_compiler=force_compiler + if msg: + test.want_message=1 + test.msg=msg + ret=test.run() + return ret +def check_flags(self,flags,kind='cc',show_msg=1): + test=self.create_test_configurator() + test.code='int main() {return 0;}\n' + test.force_compiler=kind + test.env=self.env.copy() + test.env['CPPFLAGS']=flags + ret=test.run() + if show_msg:self.check_message('flags',flags,not(ret is False)) + if ret:return 1 + return None +def check_header2(self,name,mandatory=1,define=''): + import os + ck_hdr=self.create_header_configurator() + if define:ck_hdr.define=define + else:ck_hdr.define=self.have_define(os.path.basename(name)) + ck_hdr.mandatory=mandatory + ck_hdr.name=name + return ck_hdr.run() +def check_library2(self,name,mandatory=1,uselib=''): + ck_lib=self.create_library_configurator() + if uselib:ck_lib.uselib=uselib + ck_lib.mandatory=mandatory + ck_lib.name=name + return ck_lib.run() +def check_pkg2(self,name,version,mandatory=1,uselib=''): + ck_pkg=self.create_pkgconfig_configurator() + if uselib:ck_pkg.uselib=uselib + ck_pkg.mandatory=mandatory + ck_pkg.version=version + ck_pkg.name=name + return ck_pkg.run() +def 
check_cfg2(self,name,mandatory=1,define='',uselib=''): + ck_cfg=self.create_cfgtool_configurator() + if uselib:ck_cfg.uselib=uselib + else:ck_cfg.uselib=name.upper() + ck_cfg.mandatory=mandatory + ck_cfg.binary=name+'-config' + return ck_cfg.run() + +conf(create_compile_configurator) +conf(checkEndian) +conf(checkFeatures) +conf(detect_platform) +conf(find_header) +conf(check_header) +conf(try_build_and_exec) +conf(try_build) +conf(check_flags) +conf(check_header2) +conf(check_library2) +conf(check_pkg2) +conf(check_cfg2) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/perl.py @@ -0,0 +1,78 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os +import pproc +import Task,Options +from Configure import conf +from TaskGen import extension,taskgen,feature,before +xsubpp_str='${PERL} ${XSUBPP} -noprototypes -typemap ${EXTUTILS_TYPEMAP} ${SRC} > ${TGT}' +EXT_XS=['.xs'] +def init_pyext(self): + self.uselib=self.to_list(self.uselib) + if not'PERL'in self.uselib:self.uselib.append('PERL') + if not'PERLEXT'in self.uselib:self.uselib.append('PERLEXT') + self.env['shlib_PATTERN']=self.env['perlext_PATTERN'] +def xsubpp_file(self,node): + gentask=self.create_task('xsubpp') + gentask.set_inputs(node) + outnode=node.change_ext('.c') + gentask.set_outputs(outnode) + self.allnodes.append(outnode) +Task.simple_task_type('xsubpp',xsubpp_str,color='BLUE',before="cc cxx") +def check_perl_version(conf,minver=None): + res=True + if not getattr(Options.options,'perlbinary',None): + perl=conf.find_program("perl",var="PERL") + if not perl: + return False + else: + perl=Options.options.perlbinary + conf.env['PERL']=perl + version=Utils.cmd_output(perl+" -e'printf \"%vd\", $^V'") + if not version: + res=False + version="Unknown" + elif not minver is None: + ver=tuple(map(int,version.split("."))) + if ver1:self.type=k[1] + else:self.type='' + if self.type: + self.features.append('c'+self.type) + self.includes='' + self.defines='' + self.rpaths='' + self.uselib='' + self.uselib_local='' + self.add_objects='' + self.p_flag_vars=[] + self.p_type_vars=[] + self.scanner_defines={} + self.compiled_tasks=[] + self.link_task=None + def clone(self,env): + new_obj=TaskGen.task_gen.clone(self,env) + variant='_'+self.env.variant() + if self.name:new_obj.name=self.name+variant + else:new_obj.name=self.target+variant + new_obj.uselib_local=[x+variant for x in Utils.to_list(self.uselib_local)] + return new_obj +def get_target_name(self): + name=self.target + pattern=self.env[self.type+'_PATTERN'] + if not pattern:pattern='%s' + k=name.rfind('/') + return name[0:k+1]+pattern%name[k+1:] +def apply_verif(self): + if not'objects'in self.features: + if not self.source: + raise Utils.WafError('no source files specified for %s'%self) + if not self.target: + raise Utils.WafError('no target for %s'%self) +def install_shlib(self): + nums=self.vnum.split('.') + path=self.install_path + libname=self.outputs[0].name + name3=libname+'.'+self.vnum + name2=libname+'.'+nums[0] + name1=libname + filename=self.outputs[0].abspath(self.env) + bld=Build.bld + bld.install_as(os.path.join(path,name3),filename,env=self.env) + bld.symlink_as(os.path.join(path,name2),name3) + bld.symlink_as(os.path.join(path,name1),name3) +def vars_target_cprogram(self): + self.default_install_path='${PREFIX}/bin' +def vars_target_cstaticlib(self): + self.default_install_path='${PREFIX}/lib' +def vars_target_cshlib(self): + self.default_install_path='${PREFIX}/lib' +def install_target_cprogram(self): + if not Options.is_install:return + try:mode=self.program_chmod + except 
AttributeError:mode=0755 + self.link_task.install_path=self.install_path + self.link_task.chmod=mode +def install_target_cstaticlib(self): + if not Options.is_install:return + self.link_task.install_path=self.install_path +def install_target_cshlib(self): + if not Options.is_install:return + tsk=self.link_task + self.link_task.install_path=self.install_path + if getattr(self,'vnum','')and sys.platform!='win32': + tsk.vnum=self.vnum + tsk.install=install_shlib +def apply_incpaths(self): + lst=[] + for lib in self.to_list(self.uselib): + for path in self.env['CPPPATH_'+lib]: + if not path in lst: + lst.append(path) + if preproc.go_absolute: + for path in preproc.standard_includes: + if not path in lst: + lst.append(path) + for path in self.to_list(self.includes): + if not path in lst: + lst.append(path) + if(not preproc.go_absolute)and os.path.isabs(path): + self.env.prepend_value('CPPPATH',path) + tree=Build.bld + inc_lst=[] + for path in lst: + node=None + if os.path.isabs(path): + if preproc.go_absolute: + node=Build.bld.root.find_dir(path) + else: + node=self.path.find_dir(path) + if node: + inc_lst.append(node) + self.env['INC_PATHS']=self.env['INC_PATHS']+inc_lst +def apply_type_vars(self): + st=self.env[self.type+'_USELIB'] + if st:self.uselib=self.uselib+' '+st + for var in self.p_type_vars: + compvar='_'.join([self.type,var]) + value=self.env[compvar] + if value:self.env.append_value(var,value) +def apply_link(self): + link=getattr(self,'link',None) + if not link: + if'cstaticlib'in self.features:link='ar_link_static' + elif'cxx'in self.features:link='cxx_link' + else:link='cc_link' + linktask=self.create_task(link) + outputs=[t.outputs[0]for t in self.compiled_tasks] + linktask.set_inputs(outputs) + linktask.set_outputs(self.path.find_or_declare(get_target_name(self))) + self.link_task=linktask +def apply_lib_vars(self): + env=self.env + uselib=self.to_list(self.uselib) + seen=[] + names=[]+self.to_list(self.uselib_local) + while names: + x=names.pop(0) + if x in seen: + continue + y=Build.bld.name_to_obj(x) + if not y: + raise Utils.WafError("object '%s' was not found in uselib_local (required by '%s')"%(x,self.name)) + if y.uselib_local: + lst=y.to_list(y.uselib_local) + for u in lst: + if not u in seen: + names.append(u) + y.post() + seen.append(x) + if'cshlib'in y.features: + env.append_value('LIB',y.target) + elif'cstaticlib'in y.features: + env.append_value('STATICLIB',y.target) + tmp_path=y.path.bldpath(self.env) + if not tmp_path in env['LIBPATH']:env.prepend_value('LIBPATH',tmp_path) + if y.link_task is not None: + self.link_task.set_run_after(y.link_task) + dep_nodes=getattr(self.link_task,'dep_nodes',[]) + self.link_task.dep_nodes=dep_nodes+y.link_task.outputs + morelibs=y.to_list(y.uselib) + for v in morelibs: + if v in uselib:continue + uselib=[v]+uselib + if getattr(y,'export_incdirs',None): + cpppath_st=self.env['CPPPATH_ST'] + app=self.env.append_unique + for x in self.to_list(y.export_incdirs): + node=y.path.find_dir(x) + if not node:raise Utils.WafError('object %s: invalid folder %s in export_incdirs'%(y.target,x)) + if not node in self.env['INC_PATHS']:self.env['INC_PATHS'].append(node) + for x in uselib: + for v in self.p_flag_vars: + val=self.env[v+'_'+x] + if val:self.env.append_value(v,val) +def apply_objdeps(self): + seen=[] + names=self.to_list(self.add_objects) + while names: + x=names[0] + if x in seen: + names=names[1:] + continue + y=Build.bld.name_to_obj(x) + if not y: + error('object not found in add_objects: obj %s add_objects %s'%(self.name,x)) + 
names=names[1:] + continue + if y.add_objects: + added=0 + lst=y.to_list(y.add_objects) + lst.reverse() + for u in lst: + if u in seen:continue + added=1 + names=[u]+names + if added:continue + y.post() + seen.append(x) + self.link_task.inputs+=y.out_nodes +def apply_obj_vars(self): + lib_st=self.env['LIB_ST'] + staticlib_st=self.env['STATICLIB_ST'] + libpath_st=self.env['LIBPATH_ST'] + staticlibpath_st=self.env['STATICLIBPATH_ST'] + app=self.env.append_unique + if self.env['FULLSTATIC']: + self.env.append_value('LINKFLAGS',self.env['FULLSTATIC_MARKER']) + for i in self.env['RPATH']: + app('LINKFLAGS',i) + for i in self.env['LIBPATH']: + app('LINKFLAGS',libpath_st%i) + for i in self.env['LIBPATH']: + app('LINKFLAGS',staticlibpath_st%i) + if self.env['STATICLIB']: + self.env.append_value('LINKFLAGS',self.env['STATICLIB_MARKER']) + k=[(staticlib_st%i)for i in self.env['STATICLIB']] + app('LINKFLAGS',k) + if not self.env['FULLSTATIC']: + if self.env['STATICLIB']or self.env['LIB']: + self.env.append_value('LINKFLAGS',self.env['SHLIB_MARKER']) + app('LINKFLAGS',[lib_st%i for i in self.env['LIB']]) +def apply_vnum(self): + try:vnum=self.vnum + except AttributeError:return + if sys.platform!='darwin'and sys.platform!='win32': + nums=self.vnum.split('.') + try:name3=self.soname + except AttributeError:name3=self.link_task.outputs[0].name+'.'+self.vnum.split('.')[0] + self.env.append_value('LINKFLAGS','-Wl,-h,'+name3) +def process_obj_files(self): + if not hasattr(self,'obj_files'):return + for x in self.obj_files: + node=self.path.find_resource(x) + self.link_task.inputs.append(node) +def add_obj_file(self,file): + if not hasattr(self,'obj_files'):self.obj_files=[] + if not'process_obj_files'in self.meths:self.meths.append('process_obj_files') + self.obj_files.append(file) +def make_objects_available(self): + self.out_nodes=[] + app=self.out_nodes.append + for t in self.compiled_tasks:app(t.outputs[0]) +c_attrs={'cxxflag':'CXXFLAGS','cflag':'CCFLAGS','ccflag':'CCFLAGS','linkflag':'LINKFLAGS','ldflag':'LINKFLAGS','rpath':'RPATH',} +def add_extra_flags(self): + for x in self.__dict__.keys(): + y=x.lower() + if y[-1]=='s': + y=y[:-1] + if c_attrs.get(y,None): + self.env.append_unique(c_attrs[y],getattr(self,x)) + +taskgen(apply_verif) +taskgen(vars_target_cprogram) +feature('cprogram','dprogram')(vars_target_cprogram) +before('apply_core')(vars_target_cprogram) +taskgen(vars_target_cstaticlib) +feature('cstaticlib','dstaticlib')(vars_target_cstaticlib) +before('apply_core')(vars_target_cstaticlib) +taskgen(vars_target_cshlib) +feature('cshlib','dshlib')(vars_target_cshlib) +before('apply_core')(vars_target_cshlib) +taskgen(install_target_cprogram) +feature('cprogram','dprogram')(install_target_cprogram) +after('apply_objdeps')(install_target_cprogram) +taskgen(install_target_cstaticlib) +feature('cstaticlib','dstaticlib')(install_target_cstaticlib) +after('apply_objdeps')(install_target_cstaticlib) +taskgen(install_target_cshlib) +feature('cshlib','dshlib')(install_target_cshlib) +after('apply_objdeps')(install_target_cshlib) +taskgen(apply_incpaths) +after('apply_type_vars')(apply_incpaths) +taskgen(apply_type_vars) +taskgen(apply_link) +feature('cprogram','cshlib','cstaticlib')(apply_link) +after('apply_core')(apply_link) +taskgen(apply_lib_vars) +after('apply_vnum')(apply_lib_vars) +taskgen(apply_objdeps) +feature('cprogram','cshlib','cstaticlib')(apply_objdeps) +after('apply_obj_vars')(apply_objdeps) +after('apply_vnum')(apply_objdeps) +taskgen(apply_obj_vars) 
+feature('cprogram','cshlib','cstaticlib')(apply_obj_vars) +after('apply_lib_vars')(apply_obj_vars) +taskgen(apply_vnum) +feature('cprogram','cshlib','cstaticlib')(apply_vnum) +after('apply_link')(apply_vnum) +taskgen(process_obj_files) +after('apply_link')(process_obj_files) +taskgen(add_obj_file) +taskgen(make_objects_available) +feature('objects')(make_objects_available) +after('apply_core')(make_objects_available) +taskgen(add_extra_flags) +feature('cc','cxx')(add_extra_flags) +before('init_cxx')(add_extra_flags) +before('init_cc')(add_extra_flags) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/boost.py @@ -0,0 +1,193 @@ +#! /usr/bin/env python +# encoding: utf-8 +import sys +if sys.hexversion < 0x020400f0: from sets import Set as set +import os.path,glob,types,re,sys +import Configure,config_c,Options,Utils +from Logs import warn +from Configure import conf +class boost_configurator(config_c.configurator_base): + STATIC_NOSTATIC='nostatic' + STATIC_BOTH='both' + STATIC_ONLYSTATIC='onlystatic' + def __init__(self,conf): + config_c.configurator_base.__init__(self,conf) + (self.min_version,self.max_version,self.version)=('','','') + self.lib_path=['/usr/lib','/usr/local/lib','/opt/local/lib','/sw/lib','/lib'] + self.include_path=['/usr/include','/usr/local/include','/opt/local/include','/sw/include'] + self.lib='' + (self.threadingtag,self.abitag,self.versiontag,self.toolsettag)=(None,'^[^d]*$',None,None) + self.tagscores={'threading':(10,-10),'abi':(10,-10),'toolset':(1,-1),'version':(100,-100)} + self.min_score=0 + self.static=boost_configurator.STATIC_NOSTATIC + self.conf=conf + self.found_includes=0 + def run_cache(self,retval):pass + def validate(self): + if self.version: + self.min_version=self.max_version=self.version + def get_boost_version_number(self,dir): + test_obj=Configure.check_data() + test_obj.code=''' +#include <iostream> +#include <boost/version.hpp> +int main() { std::cout << BOOST_VERSION << std::endl; } +''' + test_obj.env=self.conf.env.copy() + test_obj.env['CPPPATH']=[dir] + test_obj.execute=1 + test_obj.force_compiler='cxx' + ret=self.conf.run_check(test_obj) + if ret: + return int(ret['result']) + else: + return-1 + def string_to_version(self,str): + version=str.split('.') + return int(version[0])*100000+int(version[1])*100+int(version[2]) + def version_string(self,version): + major=version/100000 + minor=version/100%1000 + minor_minor=version%100 + if minor_minor==0: + return"%d_%d"%(major,minor) + else: + return"%d_%d_%d"%(major,minor,minor_minor) + def find_includes(self): + env=self.conf.env + guess=[] + boostPath=getattr(Options.options,'boostincludes','') + if not boostPath: + if self.include_path is types.StringType: + include_paths=[self.include_path] + else: + include_paths=self.include_path + else: + include_paths=[os.path.normpath(os.path.expandvars(os.path.expanduser(boostPath)))] + min_version=0 + if self.min_version: + min_version=self.string_to_version(self.min_version) + max_version=sys.maxint + if self.max_version: + max_version=self.string_to_version(self.max_version) + version=0 + boost_path='' + for include_path in include_paths: + boost_paths=glob.glob(os.path.join(include_path,'boost*')) + for path in boost_paths: + pathname=os.path.split(path)[-1] + ret=-1 + if pathname=='boost': + path=include_path + ret=self.get_boost_version_number(path) + elif pathname.startswith('boost-'): + ret=self.get_boost_version_number(path) + if ret!=-1 and ret>=min_version and ret<=max_version and ret>version: + boost_path=path + version=ret + if version==0 or len(boost_path)==0: + 
conf.fatal('boost headers not found! (required version min: %s max: %s)'%(self.min_version,self.max_version)) + return 0 + found_version=self.version_string(version) + versiontag='^'+found_version+'$' + if self.versiontag is None: + self.versiontag=versiontag + elif self.versiontag!=versiontag: + warn('boost header version and versiontag do _not_ match!') + self.conf.check_message('header','boost',1,'Version '+found_version+' ('+boost_path+')') + env['CPPPATH_BOOST']=boost_path + env['BOOST_VERSION']=found_version + self.found_includes=1 + def get_toolset(self): + v=self.conf.env + toolset=v['CXX_NAME'] + if v['CXX_VERSION']: + version_no=v['CXX_VERSION'].split('.') + toolset+=version_no[0] + if len(version_no)>1: + toolset+=version_no[1] + return toolset + def tags_score(self,tags): + is_versiontag=re.compile('^\d+_\d+_?\d*$') + is_threadingtag=re.compile('^mt$') + is_abitag=re.compile('^[sgydpn]+$') + is_toolsettag=re.compile('^(acc|borland|como|cw|dmc|darwin|gcc|hp_cxx|intel|kylix|msvc|qcc|sun|vacpp)\d*$') + score=0 + needed_tags={'threading':self.threadingtag,'abi':self.abitag,'toolset':self.toolsettag,'version':self.versiontag} + if self.toolsettag is None: + needed_tags['toolset']=self.get_toolset() + found_tags={} + for tag in tags: + if is_versiontag.match(tag):found_tags['version']=tag + if is_threadingtag.match(tag):found_tags['threading']=tag + if is_abitag.match(tag):found_tags['abi']=tag + if is_toolsettag.match(tag):found_tags['toolset']=tag + for tagname in needed_tags.iterkeys(): + if needed_tags[tagname]is not None and found_tags.has_key(tagname): + if re.compile(needed_tags[tagname]).match(found_tags[tagname]): + score+=self.tagscores[tagname][0] + else: + score+=self.tagscores[tagname][1] + return score + def libfiles(self,lib,pattern,lib_paths): + result=[] + for lib_path in lib_paths: + libname=pattern%('boost_'+lib+'*') + result+=glob.glob(lib_path+'/'+libname) + return result + def find_library_from_list(self,lib,files): + lib_pattern=re.compile('.*boost_(.*?)\..*') + result=(None,None) + resultscore=self.min_score-1 + for file in files: + m=lib_pattern.search(file,1) + if m: + libname=m.group(1) + libtags=libname.split('-')[1:] + currentscore=self.tags_score(libtags) + if currentscore>resultscore: + result=(libname,file) + resultscore=currentscore + return result + def find_library(self,lib): + boostPath=getattr(Options.options,'boostlibs','') + if not boostPath: + if self.lib_path is types.StringType: + lib_paths=[self.lib_path] + else: + lib_paths=self.lib_path + else: + lib_paths=[os.path.normpath(os.path.expandvars(os.path.expanduser(boostPath)))] + (libname,file)=(None,None) + if self.static in[boost_configurator.STATIC_NOSTATIC,boost_configurator.STATIC_BOTH]: + st_env_prefix='LIB' + files=self.libfiles(lib,self.conf.env['shlib_PATTERN'],lib_paths) + (libname,file)=self.find_library_from_list(lib,files) + if libname is None and self.static in[boost_configurator.STATIC_ONLYSTATIC,boost_configurator.STATIC_BOTH]: + st_env_prefix='STATICLIB' + files=self.libfiles(lib,self.conf.env['staticlib_PATTERN'],lib_paths) + (libname,file)=self.find_library_from_list(lib,files) + if libname is not None: + self.conf.check_message('library','boost_'+lib,1,file) + self.conf.env['LIBPATH_BOOST_'+lib.upper()]=os.path.split(file)[0] + self.conf.env[st_env_prefix+'_BOOST_'+lib.upper()]='boost_'+libname + return + conf.fatal('lib boost_'+lib+' not found!') + def find_libraries(self): + libs_to_find=self.lib + if self.lib is types.StringType:libs_to_find=[self.lib] + for lib in 
libs_to_find: + self.find_library(lib) + def run_test(self): + if not self.found_includes: + self.find_includes() + self.find_libraries() +def create_boost_configurator(self): + return boost_configurator(self) +def detect(conf): + pass +def set_options(opt): + opt.add_option('--boost-includes',type='string',default='',dest='boostincludes',help='path to the boost directory where the includes are e.g. /usr/local/include/boost-1_35') + opt.add_option('--boost-libs',type='string',default='',dest='boostlibs',help='path to the directory where the boost libs are e.g. /usr/local/lib') +conf(create_boost_configurator) + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/java.py @@ -0,0 +1,109 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,re +from Configure import conf +import TaskGen,Task,Utils +class java_taskgen(TaskGen.task_gen): + def __init__(self,*k): + TaskGen.task_gen.__init__(self,*k) + self.jarname='' + self.jaropts='' + self.classpath='' + self.source_root='.' + self.jar_mf_attributes={} + self.jar_mf_classpath=[] + def apply(self): + nodes_lst=[] + if not self.classpath: + if not self.env['CLASSPATH']: + self.env['CLASSPATH']='..'+os.pathsep+'.' + else: + self.env['CLASSPATH']=self.classpath + re_foo=re.compile(self.source) + source_root_node=self.path.find_dir(self.source_root) + src_nodes=[] + bld_nodes=[] + prefix_path=source_root_node.abspath() + for(root,dirs,filenames)in os.walk(source_root_node.abspath()): + for x in filenames: + file=root+'/'+x + file=file.replace(prefix_path,'') + if file.startswith('/'): + file=file[1:] + if re_foo.search(file)>-1: + node=source_root_node.find_resource(file) + src_nodes.append(node) + node2=node.change_ext(".class") + bld_nodes.append(node2) + self.env['OUTDIR']=source_root_node.abspath(self.env) + tsk=self.create_task('javac') + tsk.set_inputs(src_nodes) + tsk.set_outputs(bld_nodes) + if self.jarname: + tsk=self.create_task('jar_create') + tsk.set_inputs(bld_nodes) + tsk.set_outputs(self.path.find_or_declare(self.jarname)) + if not self.env['JAROPTS']: + if self.jaropts: + self.env['JAROPTS']=self.jaropts + else: + dirs='.' 
+ self.env['JAROPTS']='-C %s %s'%(self.env['OUTDIR'],dirs) +Task.simple_task_type('javac','${JAVAC} -classpath ${CLASSPATH} -d ${OUTDIR} ${SRC}',color='BLUE',before="jar_create") +Task.simple_task_type('jar_create','${JAR} ${JARCREATE} ${TGT} ${JAROPTS}',color='GREEN') +def detect(conf): + java_path=os.environ['PATH'].split(os.pathsep) + v=conf.env + if os.environ.has_key('JAVA_HOME'): + java_path=[os.path.join(os.environ['JAVA_HOME'],'bin')]+java_path + conf.env['JAVA_HOME']=os.environ['JAVA_HOME'] + conf.find_program('javac',var='JAVAC',path_list=java_path) + conf.find_program('java',var='JAVA',path_list=java_path) + conf.find_program('jar',var='JAR',path_list=java_path) + v['JAVA_EXT']=['.java'] + if os.environ.has_key('CLASSPATH'): + v['CLASSPATH']=os.environ['CLASSPATH'] + if not v['JAR']:conf.fatal('jar is required for making java packages') + if not v['JAVAC']:conf.fatal('javac is required for compiling java classes') + v['JARCREATE']='cf' +def check_java_class(conf,classname,with_classpath=None): + class_check_source=""" +public class Test { + public static void main(String[] argv) { + Class lib; + if (argv.length < 1) { + System.err.println("Missing argument"); + System.exit(77); + } + try { + lib = Class.forName(argv[0]); + } catch (ClassNotFoundException e) { + System.err.println("ClassNotFoundException"); + System.exit(1); + } + lib = null; + System.exit(0); + } +} +""" + import shutil + javatestdir='.waf-javatest' + classpath=javatestdir + if conf.env['CLASSPATH']: + classpath+=os.pathsep+conf.env['CLASSPATH'] + if isinstance(with_classpath,str): + classpath+=os.pathsep+with_classpath + shutil.rmtree(javatestdir,True) + os.mkdir(javatestdir) + java_file=open(os.path.join(javatestdir,'Test.java'),'w') + java_file.write(class_check_source) + java_file.close() + os.popen(conf.env['JAVAC']+' '+os.path.join(javatestdir,'Test.java')) + (jstdin,jstdout,jstderr)=os.popen3(conf.env['JAVA']+' -cp '+classpath+' Test '+classname) + found=not bool(jstderr.read()) + conf.check_message('Java class %s'%classname,"",found) + shutil.rmtree(javatestdir,True) + return found + +conf(check_java_class) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/kde4.py @@ -0,0 +1,58 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,sys,re,TaskGen,Task,Utils,Options +class msgfmt_taskgen(TaskGen.task_gen): + def __init__(self,appname='set_your_app_name'): + TaskGen.task_gen.__init__(self) + self.langs='' + self.chmod=0644 + self.default_install_path='${KDE4_LOCALE_INSTALL_DIR}' + self.appname=appname + def apply(self): + for lang in self.to_list(self.langs): + node=self.path.find_resource(lang+'.po') + task=self.create_task('msgfmt') + task.set_inputs(node) + task.set_outputs(node.change_ext('.mo')) + if not Options.is_install:continue + langname=lang.split('/') + langname=langname[-1] + task.install_path=self.install_path+os.sep+langname+os.sep+'LC_MESSAGES' + task.filename=self.appname+'.mo' + task.chmod=self.chmod +def detect(conf): + kdeconfig=conf.find_program('kde4-config') + if not kdeconfig: + conf.fatal('we need kde4-config') + prefix=Utils.cmd_output('%s --prefix'%kdeconfig).strip() + file='%s/share/apps/cmake/modules/KDELibsDependencies.cmake'%prefix + try:os.stat(file) + except OSError: + file='%s/share/apps/cmake/modules/KDELibsDependencies.cmake'%prefix + try:os.stat(file) + except:conf.fatal('could not open %s'%file) + try: + f=open(file,'r') + txt=f.read() + f.close() + except(OSError,IOError): + conf.fatal('could not read %s'%file) + txt=txt.replace('\\\n','\n') + fu=re.compile('#(.*)\n') + txt=fu.sub('',txt) + setregexp=re.compile('([sS][eE][tT]\s*\()\s*([^\s]+)\s+\"([^"]+)\"\)') + found=setregexp.findall(txt) + for(_,key,val)in found: + conf.env[key]=val + conf.env['LIB_KDECORE']='kdecore' + conf.env['LIB_KDEUI']='kdeui' + conf.env['LIB_KIO']='kio' + conf.env['LIB_KHTML']='khtml' + conf.env['LIB_KPARTS']='kparts' + conf.env['LIBPATH_KDECORE']=conf.env['KDE4_LIB_INSTALL_DIR'] + conf.env['CPPPATH_KDECORE']=conf.env['KDE4_INCLUDE_INSTALL_DIR'] + conf.env.append_value('CPPPATH_KDECORE',conf.env['KDE4_INCLUDE_INSTALL_DIR']+"/KDE") + conf.env['MSGFMT']=conf.find_program('msgfmt') +Task.simple_task_type('msgfmt','${MSGFMT} ${SRC} -o ${TGT}',color='BLUE') + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/config_c.py @@ -0,0 +1,741 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,types,imp,cPickle,sys,shlex,warnings +from Utils import md5 +import Build,Utils,Configure,Task,Options +from Logs import warn,debug +from Constants import* +from Configure import conf,conftest +stdincpath=['/usr/include/','/usr/local/include/'] +stdlibpath=['/usr/lib/','/usr/local/lib/','/lib'] +class attached_conf(type): + def __init__(cls,name,bases,dict): + super(attached_conf,cls).__init__(name,bases,dict) + def fun_create(self): + inst=cls(self) + return inst + setattr(Configure.ConfigurationContext,'create_'+cls.__name__,fun_create) +class enumerator_base(object): + def __init__(self,conf): + self.conf=conf + self.env=conf.env + self.define='' + self.mandatory=0 + self.message='' + def error(self): + if not self.message: + Logs.warn('No message provided') + self.conf.fatal(self.message) + def hash(self): + m=md5() + classvars=vars(self) + for(var,value)in classvars.iteritems(): + if callable(var)or value==self or value==self.env or value==self.conf: + continue + m.update(str(value)) + return m.digest() + def update_env(self,hashtable): + if not type(hashtable)is types.StringType: + for name in hashtable.keys(): + self.env.append_value(name,hashtable[name]) + def validate(self): + pass + def run(self): + self.validate() + ret=self.run_test() + if self.mandatory and not ret:self.error() + return ret + def run_test(self): + return not TEST_OK +class configurator_base(enumerator_base): + def __init__(self,conf): + enumerator_base.__init__(self,conf) + self.uselib_store='' +class function_enumerator(enumerator_base): + __metaclass__=attached_conf + def __init__(self,conf): + enumerator_base.__init__(self,conf) + self.function='' + self.define='' + self.headers=[] + self.header_code='' + self.custom_code='' + self.include_paths=[] + self.libs=[] + self.lib_paths=[] + def error(self): + errmsg='function %s cannot be found'%self.function + if self.message:errmsg+='\n%s'%self.message + self.conf.fatal(errmsg) + def validate(self): + if not self.define: + self.define=self.function.upper() + def run_test(self): + ret=not TEST_OK + oldlibpath=self.env['LIBPATH'] + oldlib=self.env['LIB'] + code=[] + code.append(self.header_code) + code.append('\n') + for header in self.headers: + code.append('#include <%s>\n'%header) + if self.custom_code: + code.append('int main(){%s\nreturn 0;}\n'%self.custom_code) + else: + code.append('int main(){\nvoid *p;\np=(void*)(%s);\nreturn 0;\n}\n'%self.function) + self.env['LIB']=Utils.to_list(self.libs) + self.env['LIBPATH']=Utils.to_list(self.lib_paths) + obj=check_data() + obj.code="\n".join(code) + obj.includes=self.include_paths + obj.env=self.env + ret=int(self.conf.run_check(obj)) + self.conf.check_message('function %s'%self.function,'',ret,option='') + self.conf.define_cond(self.define,ret) + self.env['LIB']=oldlib + self.env['LIBPATH']=oldlibpath + return ret +class library_enumerator(enumerator_base): + __metaclass__=attached_conf + def __init__(self,conf): + enumerator_base.__init__(self,conf) + self.name='' + self.path=[] + self.code='int main() {return 0;}\n' + self.uselib_store='' + self.uselib='' + self.nosystem=0 + self.want_message=1 + def error(self): + errmsg='library %s cannot be found'%self.name + if self.message:errmsg+='\n%s'%self.message + self.conf.fatal(errmsg) + def validate(self): + if not self.nosystem and not self.path: + self.path+=stdlibpath + def run_test(self): + ret='' + patterns=[self.env['shlib_PATTERN'],'lib%s.dll.a','lib%s.lib',self.env['staticlib_PATTERN']] + for x in 
patterns: + name=x%self.name + ret=Configure.find_file(name,self.path) + if ret:break + if self.want_message: + self.conf.check_message('library '+self.name,'',ret,option=ret) + if self.uselib_store: + self.env.append_value('LIB_'+self.uselib_store,self.name) + self.env.append_value('LIBPATH_'+self.uselib_store,ret) + return ret +class header_enumerator(enumerator_base): + __metaclass__=attached_conf + def __init__(self,conf): + enumerator_base.__init__(self,conf) + self.name=[] + self.path=[] + self.define=[] + self.nosystem=0 + self.want_message=1 + def validate(self): + if not self.nosystem and not self.path: + self.path=stdincpath + def error(self): + errmsg='cannot find %s in %s'%(self.name,str(self.path)) + if self.message:errmsg+='\n%s'%self.message + self.conf.fatal(errmsg) + def run_test(self): + ret=Configure.find_file(self.name,self.path) + if self.want_message: + self.conf.check_message('header',self.name,ret,ret) + if self.define:self.env[self.define]=ret + return ret +class cfgtool_configurator(configurator_base): + __metaclass__=attached_conf + def __init__(self,conf): + configurator_base.__init__(self,conf) + self.uselib_store='' + self.define='' + self.binary='' + self.tests={} + def error(self): + errmsg='%s cannot be found'%self.binary + if self.message:errmsg+='\n%s'%self.message + self.conf.fatal(errmsg) + def validate(self): + if not self.binary: + raise ValueError,"no binary given in cfgtool!" + if not self.uselib_store: + raise ValueError,"no uselib_store given in cfgtool!" + if not self.define: + self.define=self.conf.have_define(self.uselib_store) + if not self.tests: + self.tests['--cflags']='CCFLAGS' + self.tests['--cflags']='CXXFLAGS' + self.tests['--libs']='LINKFLAGS' + def run_test(self): + retval={} + found=TEST_OK + null='2>/dev/null' + if sys.platform=="win32":null='2>nul' + try: + ret=os.popen('%s %s %s'%(self.binary,self.tests.keys()[0],null)).close() + if ret:raise ValueError,"error" + for flag in self.tests: + var=self.tests[flag]+'_'+self.uselib_store + cmd='%s %s %s'%(self.binary,flag,null) + retval[var]=[Utils.cmd_output(cmd).strip()] + self.update_env(retval) + except ValueError: + retval={} + found=not TEST_OK + self.conf.define_cond(self.define,found) + self.conf.check_message('config-tool '+self.binary,'',found,option='') + return retval +class pkgconfig_configurator(configurator_base): + __metaclass__=attached_conf + def __init__(self,conf): + configurator_base.__init__(self,conf) + self.name='' + self.version='' + self.pkgpath=os.path.join(Options.options.prefix,'lib','pkgconfig') + self.uselib_store='' + self.define='' + self.binary='' + self.static=False + self.variables=[] + self.defines={} + def error(self): + if self.version: + errmsg='pkg-config cannot find %s >= %s'%(self.name,self.version) + else: + errmsg='pkg-config cannot find %s'%self.name + if self.message:errmsg+='\n%s'%self.message + self.conf.fatal(errmsg) + def validate(self): + if not self.uselib_store: + self.uselib_store=self.name.upper() + if not self.define: + self.define=self.conf.have_define(self.uselib_store) + def _setup_pkg_config_path(self): + pkgpath=self.pkgpath + if not pkgpath: + return"" + if sys.platform=='win32': + if hasattr(self,'pkgpath_win32_setup'): + return"" + pkgpath_env=os.getenv('PKG_CONFIG_PATH') + if pkgpath_env: + pkgpath_env=pkgpath_env+';'+pkgpath + else: + pkgpath_env=pkgpath + os.putenv('PKG_CONFIG_PATH',pkgpath_env) + setattr(self,'pkgpath_win32_setup',True) + return"" + pkgpath='PKG_CONFIG_PATH=$PKG_CONFIG_PATH:'+pkgpath + return pkgpath + 
def run_test(self): + pkgbin=self.binary + uselib_store=self.uselib_store + if type(self.variables)is types.StringType: + self.variables=str(self.variables).split() + if not pkgbin: + pkgbin='pkg-config' + pkgpath=self._setup_pkg_config_path() + pkgcom='%s %s'%(pkgpath,pkgbin) + for key,val in self.defines.items(): + pkgcom+=' --define-variable=%s=%s'%(key,val) + if self.static: + pkgcom+=' --static' + g_defines=self.env['PKG_CONFIG_DEFINES'] + if type(g_defines)is types.DictType: + for key,val in g_defines.items(): + if self.defines and self.defines.has_key(key): + continue + pkgcom+=' --define-variable=%s=%s'%(key,val) + retval={} + try: + if self.version: + cmd="%s --atleast-version=%s \"%s\""%(pkgcom,self.version,self.name) + ret=os.popen(cmd).close() + debug('conf: pkg-config cmd "%s" returned %s'%(cmd,ret)) + self.conf.check_message('package %s >= %s'%(self.name,self.version),'',not ret) + if ret:raise ValueError,"error" + else: + cmd="%s \"%s\""%(pkgcom,self.name) + ret=os.popen(cmd).close() + debug('conf: pkg-config cmd "%s" returned %s'%(cmd,ret)) + self.conf.check_message('package %s'%(self.name),'',not ret) + if ret: + raise ValueError,"error" + cflags_I=shlex.split(Utils.cmd_output('%s --cflags-only-I \"%s\"'%(pkgcom,self.name))) + cflags_other=shlex.split(Utils.cmd_output('%s --cflags-only-other \"%s\"'%(pkgcom,self.name))) + retval['CCFLAGS_'+uselib_store]=cflags_other + retval['CXXFLAGS_'+uselib_store]=cflags_other + retval['CPPPATH_'+uselib_store]=[] + for incpath in cflags_I: + assert incpath[:2]=='-I'or incpath[:2]=='/I' + retval['CPPPATH_'+uselib_store].append(incpath[2:]) + static_l='' + if self.static: + static_l='STATIC' + modlibs=Utils.cmd_output('%s --libs-only-l \"%s\"'%(pkgcom,self.name)).strip().split() + retval[static_l+'LIB_'+uselib_store]=[] + for item in modlibs: + retval[static_l+'LIB_'+uselib_store].append(item[2:]) + modpaths=Utils.cmd_output('%s --libs-only-L \"%s\"'%(pkgcom,self.name)).split() + retval['LIBPATH_'+uselib_store]=[] + for item in modpaths: + retval['LIBPATH_'+uselib_store].append(item[2:]) + modother=Utils.cmd_output('%s --libs-only-other \"%s\"'%(pkgcom,self.name)).strip().split() + retval['LINKFLAGS_'+uselib_store]=[] + for item in modother: + if str(item).endswith(".la"): + import libtool + la_config=libtool.libtool_config(item) + libs_only_L=la_config.get_libs_only_L() + libs_only_l=la_config.get_libs_only_l() + for entry in libs_only_l: + retval[static_l+'LIB_'+uselib_store].append(entry[2:]) + for entry in libs_only_L: + retval['LIBPATH_'+uselib_store].append(entry[2:]) + else: + retval['LINKFLAGS_'+uselib_store].append(item) + for variable in self.variables: + var_defname='' + if(type(variable)is types.ListType): + if len(variable)==2 and variable[1]: + var_defname=variable[1] + variable=variable[0] + if not var_defname: + var_defname=uselib_store+'_'+variable.upper() + retval[var_defname]=Utils.cmd_output('%s --variable=%s \"%s\"'%(pkgcom,variable,self.name)).strip() + self.conf.define(self.define,1) + self.update_env(retval) + except ValueError: + retval={} + self.conf.undefine(self.define) + return retval +class test_configurator(configurator_base): + __metaclass__=attached_conf + def __init__(self,conf): + configurator_base.__init__(self,conf) + self.name='' + self.code='' + self.flags='' + self.define='' + self.uselib='' + self.want_message=0 + def error(self): + errmsg='test program would not run' + if self.message:errmsg+='\n%s'%self.message + self.conf.fatal(errmsg) + def validate(self): + if not self.code: + raise 
Configure.ConfigurationError('test configurator needs code to compile and run!') + def run_test(self): + obj=check_data() + obj.code=self.code + obj.env=self.env + obj.uselib=self.uselib + obj.flags=self.flags + obj.force_compiler=getattr(self,'force_compiler',None) + obj.execute=1 + ret=self.conf.run_check(obj) + if self.want_message: + if ret:data=ret['result'] + else:data='' + self.conf.check_message('custom code','',ret,option=data) + return ret +class library_configurator(configurator_base): + __metaclass__=attached_conf + def __init__(self,conf): + configurator_base.__init__(self,conf) + self.name='' + self.path=[] + self.define='' + self.nosystem=0 + self.uselib='' + self.uselib_store='' + self.static=False + self.libs=[] + self.lib_paths=[] + self.code='int main(){return 0;}\n' + def error(self): + errmsg='library %s cannot be linked'%self.name + if self.message:errmsg+='\n%s'%self.message + self.conf.fatal(errmsg) + def validate(self): + if not self.uselib_store: + self.uselib_store=self.name.upper() + if not self.define: + self.define=self.conf.have_define(self.uselib_store) + if not self.uselib_store: + self.conf.fatal('uselib_store is not defined') + if not self.code: + self.conf.fatal('library enumerator must have code to compile') + def run_test(self): + oldlibpath=self.env['LIBPATH'] + oldlib=self.env['LIB'] + static_l='' + if self.static: + static_l='STATIC' + olduselibpath=list(self.env['LIBPATH_'+self.uselib_store]) + olduselib=list(self.env[static_l+'LIB_'+self.uselib_store]) + test=self.conf.create_library_enumerator() + test.nosystem=self.nosystem + test.name=self.name + test.want_message=0 + test.path=self.path + test.env=self.env + ret=test.run() + if ret: + self.env['LIBPATH_'+self.uselib_store]+=[ret] + self.env[static_l+'LIB_'+self.uselib_store]+=[self.name] + self.env['LIB']=[self.name]+self.libs + self.env['LIBPATH']=self.lib_paths + obj=check_data() + obj.code=self.code + obj.env=self.env + obj.uselib=self.uselib_store+" "+self.uselib + obj.libpath=self.path + ret=int(self.conf.run_check(obj)) + self.conf.check_message('library %s'%self.name,'',ret) + self.conf.define_cond(self.define,ret) + val={} + if ret: + val['LIBPATH_'+self.uselib_store]=self.env['LIBPATH_'+self.uselib] + val[static_l+'LIB_'+self.uselib_store]=self.env['LIB_'+self.uselib] + val[self.define]=ret + else: + self.env['LIBPATH_'+self.uselib_store]=olduselibpath + self.env[static_l+'LIB_'+self.uselib_store]=olduselib + self.env['LIB']=oldlib + self.env['LIBPATH']=oldlibpath + return val +class framework_configurator(configurator_base): + __metaclass__=attached_conf + def __init__(self,conf): + configurator_base.__init__(self,conf) + self.name='' + self.custom_code='' + self.code='int main(){return 0;}\n' + self.define='' + self.path=[] + self.uselib='' + self.uselib_store='' + self.remove_dot_h=False + def error(self): + errmsg='framework %s cannot be found via compiler, try pass -F'%self.name + if self.message:errmsg+='\n%s'%self.message + self.conf.fatal(errmsg) + def validate(self): + if not self.uselib_store: + self.uselib_store=self.name.upper() + if not self.define: + self.define=self.conf.have_define(self.uselib_store) + if not self.code: + self.code="#include <%s>\nint main(){return 0;}\n" + def run_test(self): + code=[] + if self.remove_dot_h: + code.append('#include <%s/%s>\n'%(self.name,self.name)) + else: + code.append('#include <%s/%s.h>\n'%(self.name,self.name)) + code.append('int main(){%s\nreturn 0;}\n'%self.custom_code) + for p in self.path: + 
linkflags=['-framework',self.name,'-F%s'%p] + ccflags=['-F%s'%p] + obj=check_data() + obj.code="\n".join(code) + obj.uselib=self.uselib_store+" "+self.uselib + obj.env=self.env.copy() + obj.env['LINKFLAGS']=linkflags + obj.env['CCFLAGS']=ccflags + obj.env['CXXFLAGS']=ccflags + ret=int(self.conf.run_check(obj)) + if ret:break + self.conf.check_message('framework %s'%self.name,'',ret,option='') + self.conf.define_cond(self.define,ret) + val={} + if ret: + val['LINKFLAGS_'+self.uselib_store]=linkflags + val['CCFLAGS_'+self.uselib_store]=ccflags + val['CXXFLAGS_'+self.uselib_store]=ccflags + val[self.define]=ret + self.update_env(val) + return val +class header_configurator(configurator_base): + __metaclass__=attached_conf + def __init__(self,conf): + configurator_base.__init__(self,conf) + self.name='' + self.path=[] + self.header_code='' + self.custom_code='' + self.code='int main() {return 0;}\n' + self.define='' + self.nosystem=0 + self.libs=[] + self.lib_paths=[] + self.uselib='' + self.uselib_store='' + def error(self): + errmsg='header %s cannot be found via compiler'%self.name + if self.message:errmsg+='\n%s'%self.message + self.conf.fatal(errmsg) + def validate(self): + if not self.define: + if self.name:self.define=self.conf.have_define(self.name) + elif self.uselib_store:self.define=self.conf.have_define(self.uselib_store) + if not self.code: + self.code="#include <%s>\nint main(){return 0;}\n" + if not self.define: + self.conf.fatal('no define given') + def run_test(self): + ret={} + if self.uselib_store: + test=self.conf.create_header_enumerator() + test.nosystem=self.nosystem + test.name=self.name + test.want_message=0 + test.path=self.path + test.env=self.env + ret=test.run() + if ret: + self.env['CPPPATH_'+self.uselib_store]=ret + code=[] + code.append(self.header_code) + code.append('\n') + code.append('#include <%s>\n'%self.name) + code.append('int main(){%s\nreturn 0;}\n'%self.custom_code) + obj=check_data() + obj.code="\n".join(code) + obj.includes=self.path + obj.uselib=self.uselib_store+" "+self.uselib + obj.env=self.conf.env.copy() + obj.env['LIB']=Utils.to_list(self.libs) + obj.env['LIBPATH']=Utils.to_list(self.lib_paths) + ret=int(self.conf.run_check(obj)) + self.conf.check_message('header %s'%self.name,'',ret,option='') + self.conf.define_cond(self.define,ret) + val={} + if ret: + val['CPPPATH_'+self.uselib_store]=self.env['CPPPATH_'+self.uselib_store] + val[self.define]=ret + if not ret:return{} + return val +class common_include_configurator(header_enumerator): + __metaclass__=attached_conf + def run_test(self): + header_dir=header_enumerator.run_test(self) + if header_dir: + header_path=os.path.join(header_dir,self.name) + self.env.append_unique(COMMON_INCLUDES,header_path) + return header_dir +class check_data(object): + def __init__(self): + self.env='' + self.code='' + self.flags='' + self.uselib='' + self.includes='' + self.function_name='' + self.lib=[] + self.libpath=[] + self.define='' + self.header_name='' + self.execute=0 + self.options='' + self.force_compiler=None + self.build_type='program' +setattr(Configure,'check_data',check_data) +def define(self,define,value,quote=1): + assert define and isinstance(define,str) + tbl=self.env[DEFINES]or Utils.ordered_dict() + if isinstance(value,str): + if quote==1: + tbl[define]='"%s"'%str(value) + else: + tbl[define]=value + elif isinstance(value,int): + tbl[define]=value + else: + raise TypeError + self.env[DEFINES]=tbl + self.env[define]=value +def undefine(self,define): + assert define and 
isinstance(define,str) + tbl=self.env[DEFINES]or Utils.ordered_dict() + value=UNDEFINED + tbl[define]=value + self.env[DEFINES]=tbl + self.env[define]=value +def define_cond(self,name,value): + if value: + self.define(name,1) + else: + self.undefine(name) +def is_defined(self,key): + defines=self.env[DEFINES] + if not defines: + return False + try: + value=defines[key] + except KeyError: + return False + else: + return value!=UNDEFINED +def get_define(self,define): + try:return self.env[DEFINES][define] + except KeyError:return None +def have_define(self,name): + return"HAVE_%s"%Utils.quote_define_name(name) +def write_config_header(self,configfile='',env=''): + if not configfile:configfile=WAF_CONFIG_H + lst=Utils.split_path(configfile) + base=lst[:-1] + if not env:env=self.env + base=[self.blddir,env.variant()]+base + dir=os.path.join(*base) + if not os.path.exists(dir): + os.makedirs(dir) + dir=os.path.join(dir,lst[-1]) + self.env.append_value('waf_config_files',os.path.abspath(dir)) + waf_guard='_%s_WAF'%Utils.quote_define_name(configfile) + dest=open(dir,'w') + dest.write('/* Configuration header created by Waf - do not edit */\n') + dest.write('#ifndef %s\n#define %s\n\n'%(waf_guard,waf_guard)) + if not configfile in self.env['dep_files']: + self.env['dep_files']+=[configfile] + tbl=env[DEFINES]or Utils.ordered_dict() + for key in tbl.allkeys: + value=tbl[key] + if value is None: + dest.write('#define %s\n'%key) + elif value is UNDEFINED: + dest.write('/* #undef %s */\n'%key) + else: + dest.write('#define %s %s\n'%(key,value)) + for include_file in self.env[COMMON_INCLUDES]: + dest.write('\n#include "%s"'%include_file) + dest.write('\n#endif /* %s */\n'%waf_guard) + dest.close() +def run_check(self,obj): + if not obj.code: + raise Configure.ConfigurationError('run_check: no code to process in check') + dir=os.path.join(self.blddir,'.wscript-trybuild') + for(root,dirs,filenames)in os.walk(dir): + for f in list(filenames): + os.remove(os.path.join(root,f)) + bdir=os.path.join(dir,'testbuild') + if(self.env['CXX_NAME']and not obj.force_compiler and Task.TaskBase.classes.get('cxx',None))or obj.force_compiler=="cxx": + tp='cxx' + test_f_name='test.cpp' + else: + tp='cc' + test_f_name='test.c' + if not os.path.exists(dir): + os.makedirs(dir) + if not os.path.exists(bdir): + os.makedirs(bdir) + if obj.env:env=obj.env + else:env=self.env.copy() + dest=open(os.path.join(dir,test_f_name),'w') + dest.write(obj.code) + dest.close() + back=os.path.abspath('.') + bld=Build.BuildContext() + bld.log=self.log + bld.all_envs.update(self.all_envs) + bld.all_envs['default']=env + bld._variants=bld.all_envs.keys() + bld.load_dirs(dir,bdir) + os.chdir(dir) + bld.rescan(bld.srcnode) + o=bld.new_task_gen(tp,obj.build_type) + o.source=test_f_name + o.target='testprog' + o.uselib=obj.uselib + o.includes=obj.includes + self.log.write("==>\n%s\n<==\n"%obj.code) + try: + ret=bld.compile() + except Build.BuildError: + ret=1 + if obj.execute: + lastprog=o.link_task.outputs[0].abspath(o.env) + os.chdir(back) + if obj.execute: + if ret:return not ret + data=Utils.cmd_output('"%s"'%lastprog).strip() + ret={'result':data} + return ret + return not ret +def cc_check_features(self,kind='cc'): + v=self.env + test=Configure.check_data() + test.code='int main() {return 0;}\n' + test.env=v + test.execute=1 + test.force_compiler=kind + ret=self.run_check(test) + self.check_message('compiler could create','programs',not(ret is False)) + if not ret:self.fatal("no programs") + lib_obj=Configure.check_data() + lib_obj.code="int 
k = 3;\n" + lib_obj.env=v + lib_obj.build_type="shlib" + lib_obj.force_compiler=kind + ret=self.run_check(lib_obj) + self.check_message('compiler could create','shared libs',not(ret is False)) + if not ret:self.fatal("no shared libs") + lib_obj=Configure.check_data() + lib_obj.code="int k = 3;\n" + lib_obj.env=v + lib_obj.build_type="staticlib" + lib_obj.force_compiler=kind + ret=self.run_check(lib_obj) + self.check_message('compiler could create','static libs',not(ret is False)) + if not ret:self.fatal("no static libs") +def cxx_check_features(self): + return cc_check_features(self,kind='cpp') +def check_pkg(self,modname,destvar='',vnum='',pkgpath='',pkgbin='',pkgvars=[],pkgdefs={},mandatory=False): + pkgconf=self.create_pkgconfig_configurator() + if not destvar:destvar=modname.upper() + pkgconf.uselib_store=destvar + pkgconf.name=modname + pkgconf.version=vnum + if pkgpath:pkgconf.pkgpath=pkgpath + pkgconf.binary=pkgbin + pkgconf.variables=pkgvars + pkgconf.defines=pkgdefs + pkgconf.mandatory=mandatory + return pkgconf.run() +def pkgconfig_fetch_variable(self,pkgname,variable,pkgpath='',pkgbin='',pkgversion=0): + if not pkgbin:pkgbin='pkg-config' + if pkgpath:pkgpath='PKG_CONFIG_PATH=$PKG_CONFIG_PATH:'+pkgpath + pkgcom='%s %s'%(pkgpath,pkgbin) + if pkgversion: + ret=os.popen("%s --atleast-version=%s %s"%(pkgcom,pkgversion,pkgname)).close() + self.conf.check_message('package %s >= %s'%(pkgname,pkgversion),'',not ret) + if ret: + return'' + else: + ret=os.popen("%s %s"%(pkgcom,pkgname)).close() + self.check_message('package %s '%(pkgname),'',not ret) + if ret: + return'' + return Utils.cmd_output('%s --variable=%s %s'%(pkgcom,variable,pkgname)).strip() + +conf(define) +conf(undefine) +conf(define_cond) +conf(is_defined) +conf(get_define) +conf(have_define) +conf(write_config_header) +conf(run_check) +conftest(cc_check_features) +conftest(cxx_check_features) +conf(check_pkg) +conf(pkgconfig_fetch_variable) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/cc.py @@ -0,0 +1,74 @@ +#! /usr/bin/env python +# encoding: utf-8 +import sys +if sys.hexversion < 0x020400f0: from sets import Set as set +import TaskGen,Build,Utils,Task +from Logs import debug +import ccroot +from TaskGen import taskgen,before,extension +g_cc_flag_vars=['CCDEPS','FRAMEWORK','FRAMEWORKPATH','STATICLIB','LIB','LIBPATH','LINKFLAGS','RPATH','CCFLAGS','CPPPATH','CPPFLAGS','CCDEFINES'] +EXT_CC=['.c'] +CC_METHS=['init_cc','apply_type_vars','apply_incpaths','apply_defines_cc','apply_core','apply_lib_vars','apply_obj_vars_cc'] +TaskGen.bind_feature('cc',CC_METHS) +g_cc_type_vars=['CCFLAGS','LINKFLAGS'] +class cc_taskgen(ccroot.ccroot_abstract): + def __init__(self,*kw): + ccroot.ccroot_abstract.__init__(self,*kw) +def init_cc(self): + if hasattr(self,'p_flag_vars'):self.p_flag_vars=set(self.p_flag_vars).union(g_cc_flag_vars) + else:self.p_flag_vars=g_cc_flag_vars + if hasattr(self,'p_type_vars'):self.p_type_vars=set(self.p_type_vars).union(g_cc_type_vars) + else:self.p_type_vars=g_cc_type_vars + if not self.env['CC_NAME']: + raise Utils.WafError("At least one compiler (gcc, ..) 
must be selected") +def apply_obj_vars_cc(self): + env=self.env + app=env.append_unique + cpppath_st=env['CPPPATH_ST'] + for i in env['INC_PATHS']: + app('_CCINCFLAGS',cpppath_st%i.bldpath(env)) + app('_CCINCFLAGS',cpppath_st%i.srcpath(env)) + for i in env['CPPPATH']: + app('_CCINCFLAGS',cpppath_st%i) + app('_CCINCFLAGS',cpppath_st%'.') + app('_CCINCFLAGS',cpppath_st%env.variant()) + tmpnode=self.path + app('_CCINCFLAGS',cpppath_st%tmpnode.bldpath(env)) + app('_CCINCFLAGS',cpppath_st%tmpnode.srcpath(env)) +def apply_defines_cc(self): + tree=Build.bld + self.defines=getattr(self,'defines',[]) + lst=self.to_list(self.defines)+self.to_list(self.env['CCDEFINES']) + milst=[] + for defi in lst: + if not defi in milst: + milst.append(defi) + libs=self.to_list(self.uselib) + for l in libs: + val=self.env['CCDEFINES_'+l] + if val:milst+=val + self.env['DEFLINES']=["%s %s"%(x[0],Utils.trimquotes('='.join(x[1:])))for x in[y.split('=')for y in milst]] + y=self.env['CCDEFINES_ST'] + self.env['_CCDEFFLAGS']=[y%x for x in milst] +def c_hook(self,node): + task=self.create_task('cc') + try:obj_ext=self.obj_ext + except AttributeError:obj_ext='_%d.o'%self.idx + task.defines=self.scanner_defines + task.inputs=[node] + task.outputs=[node.change_ext(obj_ext)] + self.compiled_tasks.append(task) +cc_str='${CC} ${CCFLAGS} ${CPPFLAGS} ${_CCINCFLAGS} ${_CCDEFFLAGS} ${CC_SRC_F}${SRC} ${CC_TGT_F}${TGT}' +link_str='${LINK_CC} ${CCLNK_SRC_F}${SRC} ${CCLNK_TGT_F}${TGT} ${LINKFLAGS} ${_LIBDIRFLAGS} ${_LIBFLAGS}' +cls=Task.simple_task_type('cc',cc_str,'GREEN',ext_out='.o',ext_in='.c') +cls.scan=ccroot.scan +cls.vars=('CCDEFINES',) +cls=Task.simple_task_type('cc_link',link_str,color='YELLOW',ext_in='.o') +cls.maxjobs=1 +TaskGen.declare_order('apply_incpaths','apply_defines_cc','apply_core','apply_lib_vars','apply_obj_vars_cc','apply_obj_vars') + +taskgen(init_cc) +before('apply_type_vars')(init_cc) +taskgen(apply_obj_vars_cc) +taskgen(apply_defines_cc) +extension(EXT_CC)(c_hook) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/gcc.py @@ -0,0 +1,115 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,sys +import Configure,Options,Utils +import ccroot,ar +from Configure import conftest +def find_gcc(conf): + v=conf.env + cc=None + if v['CC']:cc=v['CC'] + elif'CC'in os.environ:cc=os.environ['CC'] + if not cc:cc=conf.find_program('gcc',var='CC') + if not cc:cc=conf.find_program('cc',var='CC') + if not cc:conf.fatal('gcc was not found') + try: + if Utils.cmd_output('%s --version'%cc).find('gcc')<0: + conf.fatal('gcc was not found, see the result of gcc --version') + except ValueError: + conf.fatal('gcc --version could not be executed') + v['CC']=cc + v['CC_NAME']='gcc' + ccroot.get_cc_version(conf,cc,'CC_VERSION') +def gcc_common_flags(conf): + v=conf.env + v['CC_SRC_F']='' + v['CC_TGT_F']='-c -o ' + v['CPPPATH_ST']='-I%s' + if not v['LINK_CC']:v['LINK_CC']=v['CC'] + v['CCLNK_SRC_F']='' + v['CCLNK_TGT_F']='-o ' + v['LIB_ST']='-l%s' + v['LIBPATH_ST']='-L%s' + v['STATICLIB_ST']='-l%s' + v['STATICLIBPATH_ST']='-L%s' + v['CCDEFINES_ST']='-D%s' + v['SHLIB_MARKER']='-Wl,-Bdynamic' + v['STATICLIB_MARKER']='-Wl,-Bstatic' + v['program_PATTERN']='%s' + v['shlib_CCFLAGS']=['-fPIC','-DPIC'] + v['shlib_LINKFLAGS']=['-shared'] + v['shlib_PATTERN']='lib%s.so' + v['staticlib_LINKFLAGS']=['-Wl,-Bstatic'] + v['staticlib_PATTERN']='lib%s.a' + v['MACBUNDLE_LINKFLAGS']=['-bundle','-undefined dynamic_lookup'] + v['MACBUNDLE_CCFLAGS']=['-fPIC'] + v['MACBUNDLE_PATTERN']='%s.bundle' +def gcc_modifier_win32(conf): + v=conf.env + if sys.platform!='win32':return + v['program_PATTERN']='%s.exe' + v['shlib_PATTERN']='lib%s.dll' + v['shlib_CCFLAGS']=[] + v['staticlib_LINKFLAGS']=[] +def gcc_modifier_cygwin(conf): + v=conf.env + if sys.platform!='cygwin':return + v['program_PATTERN']='%s.exe' + v['shlib_PATTERN']='lib%s.dll' + v['shlib_CCFLAGS']=[] + v['staticlib_LINKFLAGS']=[] +def gcc_modifier_darwin(conf): + v=conf.env + if sys.platform!='darwin':return + v['shlib_CCFLAGS']=['-fPIC','-compatibility_version 1','-current_version 1'] + v['shlib_LINKFLAGS']=['-dynamiclib'] + v['shlib_PATTERN']='lib%s.dylib' + v['staticlib_LINKFLAGS']=[] + v['SHLIB_MARKER']='' + v['STATICLIB_MARKER']='' +def gcc_modifier_aix5(conf): + v=conf.env + if sys.platform!='aix5':return + v['program_LINKFLAGS']=['-Wl,-brtl'] + v['shlib_LINKFLAGS']=['-shared','-Wl,-brtl,-bexpfull'] + v['SHLIB_MARKER']='' +def gcc_modifier_debug(conf): + v=conf.env + if conf.check_flags('-O2'): + v['CCFLAGS_OPTIMIZED']=['-O2'] + v['CCFLAGS_RELEASE']=['-O2'] + if conf.check_flags('-g -DDEBUG'): + v['CCFLAGS_DEBUG']=['-g','-DDEBUG'] + v['LINKFLAGS_DEBUG']=['-g'] + if conf.check_flags('-g3 -O0 -DDEBUG'): + v['CCFLAGS_ULTRADEBUG']=['-g3','-O0','-DDEBUG'] + v['LINKFLAGS_ULTRADEBUG']=['-g'] + if conf.check_flags('-Wall'): + for x in'OPTIMIZED RELEASE DEBUG ULTRADEBUG'.split():v.append_unique('CCFLAGS_'+x,'-Wall') + try: + debug_level=Options.options.debug_level.upper() + except AttributeError: + debug_level=ccroot.DEBUG_LEVELS.CUSTOM + v.append_value('CCFLAGS',v['CCFLAGS_'+debug_level]) + v.append_value('LINKFLAGS',v['LINKFLAGS_'+debug_level]) +detect=''' +find_gcc +find_cpp +find_ar +gcc_common_flags +gcc_modifier_win32 +gcc_modifier_cygwin +gcc_modifier_darwin +gcc_modifier_aix5 +cc_load_tools +cc_add_flags +''' + +conftest(find_gcc) +conftest(gcc_common_flags) +conftest(gcc_modifier_win32) +conftest(gcc_modifier_cygwin) +conftest(gcc_modifier_darwin) +conftest(gcc_modifier_aix5) +conftest(gcc_modifier_debug) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/qt4.py @@ -0,0 +1,380 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +try: + from xml.sax import make_parser + from xml.sax.handler import ContentHandler +except ImportError: + has_xml=False + ContentHandler=object +else: + has_xml=True +import os,sys +import ccroot,cxx +import TaskGen,Task,Utils,Runner,Options,Build +from TaskGen import taskgen,feature,after,extension +from Logs import error +from Constants import* +MOC_H=['.h','.hpp','.hxx','.hh'] +EXT_RCC=['.qrc'] +EXT_UI=['.ui'] +EXT_QT4=['.cpp','.cc','.cxx','.C'] +class MTask(Task.Task): + scan=ccroot.scan + before=['cxx_link','ar_link_static'] + def __init__(self,parent): + Task.Task.__init__(self,parent.env) + self.moc_done=0 + self.parent=parent + def runnable_status(self): + if self.moc_done: + for t in self.run_after: + if not t.hasrun: + return ASK_LATER + self.signature() + return Task.Task.runnable_status(self) + else: + self.add_moc_tasks() + return ASK_LATER + def add_moc_tasks(self): + tree=Build.bld + parn=self.parent + node=self.inputs[0] + try: + self.signature() + except: + print"TODO" + else: + delattr(self,'sign_all') + moctasks=[] + mocfiles=[] + variant=node.variant(parn.env) + try: + tmp_lst=tree.raw_deps[self.unique_id()] + tree.raw_deps[self.unique_id()]=[] + except KeyError: + tmp_lst=[] + for d in tmp_lst: + if not d.endswith('.moc'):continue + if d in mocfiles: + error("paranoia owns") + continue + mocfiles.append(d) + ext='' + try:ext=Options.options.qt_header_ext + except AttributeError:pass + if not ext: + base2=d[:-4] + path=node.parent.srcpath(parn.env) + for i in MOC_H: + try: + os.stat(os.path.join(path,base2+i)) + except OSError: + pass + else: + ext=i + break + if not ext:raise Utils.WafError("no header found for %s which is a moc file"%str(d)) + h_node=node.parent.find_resource(base2+i) + m_node=h_node.change_ext('.moc') + tree.node_deps[(self.unique_id(),m_node.name)]=h_node + task=Task.TaskBase.classes['moc'](parn.env,normal=0) + task.set_inputs(h_node) + task.set_outputs(m_node) + generator=Build.bld.generator + generator.outstanding.insert(0,task) + generator.total+=1 + moctasks.append(task) + tmp_lst=tree.raw_deps[self.unique_id()]=mocfiles + lst=tree.node_deps.get(self.unique_id(),()) + for d in lst: + name=d.name + if name.endswith('.moc'): + task=Task.TaskBase.classes['moc'](parn.env,normal=0) + task.set_inputs(tree.node_deps[(self.unique_id(),name)]) + task.set_outputs(d) + generator=Build.bld.generator + generator.outstanding.insert(0,task) + generator.total+=1 + moctasks.append(task) + self.run_after=moctasks + self.moc_done=1 + run=Task.TaskBase.classes['cxx'].__dict__['run'] +def translation_update(task): + outs=[a.abspath(task.env)for a in task.outputs] + outs=" ".join(outs) + lupdate=task.env['QT_LUPDATE'] + for x in task.inputs: + file=x.abspath(task.env) + cmd="%s %s -ts %s"%(lupdate,file,outs) + Utils.pprint('BLUE',cmd) + Runner.exec_command(cmd) +class XMLHandler(ContentHandler): + def __init__(self): + self.buf=[] + self.files=[] + def startElement(self,name,attrs): + if name=='file': + self.buf=[] + def endElement(self,name): + if name=='file': + self.files.append(''.join(self.buf)) + def characters(self,cars): + self.buf.append(cars) +def scan(self): + node=self.inputs[0] + parser=make_parser() + curHandler=XMLHandler() + parser.setContentHandler(curHandler) + fi=open(self.inputs[0].abspath(self.env)) + parser.parse(fi) + fi.close() + nodes=[] + names=[] + root=self.inputs[0].parent + for x in curHandler.files: + x=x.encode('utf8') + nd=root.find_resource(x) + if nd:nodes.append(nd) + else:names.append(x) 
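+ # Descriptive note (added): resource files from the .qrc that resolve in the source tree are returned as nodes; entries that cannot be found are returned as plain names.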
+ return(nodes,names) +def create_rcc_task(self,node): + rcnode=node.change_ext('_rc.cpp') + rcctask=self.create_task('rcc') + rcctask.inputs=[node] + rcctask.outputs=[rcnode] + cpptask=self.create_task('cxx') + cpptask.inputs=[rcnode] + cpptask.outputs=[rcnode.change_ext('.o')] + cpptask.defines=self.scanner_defines + self.compiled_tasks.append(cpptask) + return cpptask +def create_uic_task(self,node): + uictask=self.create_task('ui4') + uictask.inputs=[node] + uictask.outputs=[node.change_ext('.h')] +class qt4_taskgen(cxx.cxx_taskgen): + def __init__(self,*kw): + cxx.cxx_taskgen.__init__(self,*kw) + self.link_task=None + self.lang='' + self.langname='' + self.update=0 + self.features.append('qt4') +def apply_qt4(self): + if self.lang: + lst=[] + trans=[] + for l in self.to_list(self.lang): + t=Task.TaskBase.classes['ts2qm'](self.env) + t.set_inputs(self.path.find_resource(l+'.ts')) + t.set_outputs(t.inputs[0].change_ext('.qm')) + lst.append(t.outputs[0]) + if self.update: + trans.append(t.inputs[0]) + if self.update and Options.options.trans_qt4: + u=Task.TaskCmd(translation_update,self.env,2) + u.inputs=[a.inputs[0]for a in self.compiled_tasks] + u.outputs=trans + if self.langname: + t=Task.TaskBase.classes['qm2rcc'](self.env) + t.set_inputs(lst) + t.set_outputs(self.path.find_or_declare(self.langname+'.qrc')) + t.path=self.path + k=create_rcc_task(self,t.outputs[0]) + self.link_task.inputs.append(k.outputs[0]) + lst=[] + for flag in self.to_list(self.env['CXXFLAGS']): + if len(flag)<2:continue + if flag[0:2]=='-D'or flag[0:2]=='-I': + lst.append(flag) + self.env['MOC_FLAGS']=lst +def find_sources_in_dirs(self,dirnames,excludes=[],exts=[]): + lst=[] + excludes=self.to_list(excludes) + dirnames=self.to_list(dirnames) + ext_lst=exts or self.mappings.keys()+TaskGen.task_gen.mappings.keys() + for name in dirnames: + anode=self.path.find_dir(name) + Build.bld.rescan(anode) + for name in Build.bld.cache_dir_contents[anode.id]: + (base,ext)=os.path.splitext(name) + if ext in ext_lst: + if not name in lst: + if name in excludes:continue + lst.append((anode.path_to_parent(self.path)or'.')+'/'+name) + elif ext=='.ts': + self.lang+=' '+base + lst.sort() + self.source=self.source+' '+(" ".join(lst)) +setattr(qt4_taskgen,'find_sources_in_dirs',find_sources_in_dirs) +def cxx_hook(self,node): + task=MTask(self) + self.tasks.append(task) + try:obj_ext=self.obj_ext + except AttributeError:obj_ext='_%d.o'%self.idx + task.defines=self.scanner_defines + task.inputs=[node] + task.outputs=[node.change_ext(obj_ext)] + self.compiled_tasks.append(task) +def process_qm2rcc(task): + outfile=task.outputs[0].abspath(task.env) + f=open(outfile,'w') + f.write('\n\n') + for k in task.inputs: + f.write(' ') + f.write(k.path_to_parent(task.path)) + f.write('\n') + f.write('\n') + f.close() +b=Task.simple_task_type +b('moc','${QT_MOC} ${MOC_FLAGS} ${SRC} ${MOC_ST} ${TGT}',color='BLUE',vars=['QT_MOC','MOC_FLAGS']) +cls=b('rcc','${QT_RCC} -name ${SRC[0].name} ${SRC[0].abspath(env)} ${RCC_ST} -o ${TGT}',color='BLUE',before='cxx moc',after="qm2rcc") +cls.scan=scan +b('ui4','${QT_UIC} ${SRC} -o ${TGT}',color='BLUE',before='cxx moc') +b('ts2qm','${QT_LRELEASE} ${QT_LRELEASE_FLAGS} ${SRC} -qm ${TGT}',color='BLUE',before='qm2rcc') +Task.task_type_from_func('qm2rcc',vars=[],func=process_qm2rcc,color='BLUE',before='rcc',after='ts2qm') +def detect_qt4(conf): + env=conf.env + opt=Options.options + qtlibs=getattr(opt,'qtlibs','') + qtincludes=getattr(opt,'qtincludes','') + qtbin=getattr(opt,'qtbin','') + 
useframework=getattr(opt,'use_qt4_osxframework',True) + qtdir=getattr(opt,'qtdir','') + if not qtdir:qtdir=os.environ.get('QT4_ROOT','') + if not qtdir: + try: + lst=os.listdir('/usr/local/Trolltech/') + lst.sort() + lst.reverse() + qtdir='/usr/local/Trolltech/%s/'%lst[0] + except OSError: + pass + if not qtdir: + try: + path=os.environ['PATH'].split(':') + for qmk in['qmake-qt4','qmake4','qmake']: + qmake=conf.find_program(qmk,path) + if qmake: + version=Utils.cmd_output(qmake+" -query QT_VERSION").strip().split('.') + if version[0]=="4": + qtincludes=Utils.cmd_output(qmake+" -query QT_INSTALL_HEADERS").strip() + qtdir=Utils.cmd_output(qmake+" -query QT_INSTALL_PREFIX").strip()+"/" + qtbin=Utils.cmd_output(qmake+" -query QT_INSTALL_BINS").strip()+"/" + break + except OSError: + pass + if not qtlibs:qtlibs=os.path.join(qtdir,'lib') + vars="QtCore QtGui QtNetwork QtOpenGL QtSql QtSvg QtTest QtXml QtWebKit Qt3Support".split() + framework_ok=False + if sys.platform=="darwin"and useframework: + for i in vars: + e=conf.create_framework_configurator() + e.path=[qtlibs,'/Library/Frameworks'] + e.name=i + e.remove_dot_h=True + e.run() + if not i=='QtCore': + for r in env['CCFLAGS_'+i.upper()]: + if r.startswith('-F'): + env['CCFLAGS_'+i.upper()].remove(r) + break + if conf.is_defined("HAVE_QTOPENGL")and not'-framework OpenGL'in env["LINKFLAGS_QTOPENGL"]: + env["LINKFLAGS_QTOPENGL"]+=['-framework OpenGL'] + if conf.is_defined("HAVE_QTGUI"): + if not'-framework AppKit'in env["LINKFLAGS_QTGUI"]: + env["LINKFLAGS_QTGUI"]+=['-framework AppKit'] + if not'-framework ApplicationServices'in env["LINKFLAGS_QTGUI"]: + env["LINKFLAGS_QTGUI"]+=['-framework ApplicationServices'] + framework_ok=True + if not conf.is_defined("HAVE_QTGUI"): + if not qtincludes:qtincludes=os.path.join(qtdir,'include') + env['QTINCLUDEPATH']=qtincludes + lst=[qtincludes,'/usr/share/qt4/include/','/opt/qt4/include'] + test=conf.create_header_enumerator() + test.name='QtGui/QFont' + test.path=lst + test.mandatory=1 + ret=test.run() + if not qtbin:qtbin=os.path.join(qtdir,'bin') + binpath=[qtbin,'/usr/share/qt4/bin/']+os.environ['PATH'].split(':') + def find_bin(lst,var): + for f in lst: + ret=conf.find_program(f,path_list=binpath) + if ret: + env[var]=ret + break + find_bin(['uic-qt3','uic3'],'QT_UIC3') + find_bin(['uic-qt4','uic'],'QT_UIC') + version=Utils.cmd_output(env['QT_UIC']+" -version 2>&1").strip() + version=version.replace('Qt User Interface Compiler ','') + version=version.replace('User Interface Compiler for Qt','') + if version.find(" 3.")!=-1: + conf.check_message('uic version','(too old)',0,option='(%s)'%version) + sys.exit(1) + conf.check_message('uic version','',1,option='(%s)'%version) + find_bin(['moc-qt4','moc'],'QT_MOC') + find_bin(['rcc'],'QT_RCC') + find_bin(['lrelease-qt4','lrelease'],'QT_LRELEASE') + find_bin(['lupdate-qt4','lupdate'],'QT_LUPDATE') + env['UIC3_ST']='%s -o %s' + env['UIC_ST']='%s -o %s' + env['MOC_ST']='-o' + env['QT_LRELEASE_FLAGS']=['-silent'] + if not framework_ok: + vars_debug=[a+'_debug'for a in vars] + for i in vars_debug+vars: + pkgconf=conf.create_pkgconfig_configurator() + pkgconf.name=i + pkgconf.pkgpath='%s:%s/pkgconfig:/usr/lib/qt4/lib/pkgconfig:/opt/qt4/lib/pkgconfig:/usr/lib/qt4/lib:/opt/qt4/lib'%(qtlibs,qtlibs) + pkgconf.run() + def process_lib(vars_,coreval): + for d in vars_: + var=d.upper() + if var=='QTCORE':continue + value=env['LIBPATH_'+var] + if value: + core=env[coreval] + accu=[] + for lib in value: + if lib in core:continue + accu.append(lib) + env['LIBPATH_'+var]=accu 
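+ # Descriptive note (added): process_lib strips, from every Qt module except QtCore itself, the library paths already present in the QtCore entry, so each LIBPATH_* variable keeps only module-specific paths.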
+ process_lib(vars,'LIBPATH_QTCORE') + process_lib(vars_debug,'LIBPATH_QTCORE_DEBUG') + if Options.options.want_rpath: + def process_rpath(vars_,coreval): + for d in vars_: + var=d.upper() + value=env['LIBPATH_'+var] + if value: + core=env[coreval] + accu=[] + for lib in value: + if var!='QTCORE': + if lib in core: + continue + accu.append('-Wl,--rpath='+lib) + env['RPATH_'+var]=accu + process_rpath(vars,'LIBPATH_QTCORE') + process_rpath(vars_debug,'LIBPATH_QTCORE_DEBUG') + env['QTLOCALE']=str(env['PREFIX'])+'/share/locale' +def detect(conf): + if sys.platform=='win32':conf.fatal('Qt4.py will not work on win32 for now - ask the author') + detect_qt4(conf) +def set_options(opt): + opt.add_option('--want-rpath',type='int',default=1,dest='want_rpath',help='set rpath to 1 or 0 [Default 1]') + opt.add_option('--header-ext',type='string',default='',help='header extension for moc files',dest='qt_header_ext') + for i in"qtdir qtincludes qtlibs qtbin".split(): + opt.add_option('--'+i,type='string',default='',dest=i) + if sys.platform=="darwin": + opt.add_option('--no-qt4-framework',action="store_false",help='do not use the framework version of Qt4 in OS X',dest='use_qt4_osxframework',default=True) + opt.add_option('--translate',action="store_true",help="collect translation strings",dest="trans_qt4",default=False) + +extension(EXT_RCC)(create_rcc_task) +extension(EXT_UI)(create_uic_task) +taskgen(apply_qt4) +feature('qt4')(apply_qt4) +after('apply_link')(apply_qt4) +extension(EXT_QT4)(cxx_hook) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/intltool.py @@ -0,0 +1,81 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,re +import TaskGen,Task,Utils,Runner,Options,Build +import cc +from Logs import error +class intltool_in_taskgen(TaskGen.task_gen): + def __init__(self,*k): + TaskGen.task_gen.__init__(self,*k) + self.source='' + self.flags='' + self.podir='po' + self.intlcache='.intlcache' + self.tasks=[] + def apply(self): + self.env=self.env.copy() + tree=Build.bld + for i in self.to_list(self.source): + node=self.path.find_resource(i) + podirnode=self.path.find_dir(self.podir) + self.env['INTLCACHE']=os.path.join(self.path.bldpath(self.env),self.podir,self.intlcache) + self.env['INTLPODIR']=podirnode.srcpath(self.env) + self.env['INTLFLAGS']=self.flags + task=self.create_task('intltool') + task.set_inputs(node) + task.set_outputs(node.change_ext('')) + task.install_path=self.install_path +class intltool_po_taskgen(TaskGen.task_gen): + def __init__(self,*k,**kw): + TaskGen.task_gen.__init__(self,*k) + self.chmod=0644 + self.default_install_path='${LOCALEDIR}' + self.appname=kw.get('appname','set_your_app_name') + self.podir='' + self.tasks=[] + def apply(self): + def install_translation(task): + out=task.outputs[0] + filename=out.name + (langname,ext)=os.path.splitext(filename) + inst_file=langname+os.sep+'LC_MESSAGES'+os.sep+self.appname+'.mo' + Build.bld.install_as(os.path.join(self.install_path,inst_file),out.abspath(self.env),chmod=self.chmod) + linguas=self.path.find_resource(os.path.join(self.podir,'LINGUAS')) + if linguas: + file=open(linguas.abspath()) + langs=[] + for line in file.readlines(): + if not line.startswith('#'): + langs+=line.split() + file.close() + re_linguas=re.compile('[-a-zA-Z_@.]+') + for lang in langs: + if re_linguas.match(lang): + node=self.path.find_resource(os.path.join(self.podir,re_linguas.match(lang).group()+'.po')) + task=self.create_task('po') + task.set_inputs(node) + task.set_outputs(node.change_ext('.mo')) + if 
Options.is_install:task.install=install_translation + else: + Utils.pprint('RED',"Error no LINGUAS file found in po directory") +Task.simple_task_type('po','${POCOM} -o ${TGT} ${SRC}',color='BLUE') +Task.simple_task_type('intltool','${INTLTOOL} ${INTLFLAGS} -q -u -c ${INTLCACHE} ${INTLPODIR} ${SRC} ${TGT}',color='BLUE',after="cc_link cxx_link") +def detect(conf): + conf.check_tool('checks') + pocom=conf.find_program('msgfmt') + conf.env['POCOM']=pocom + intltool=conf.find_program('intltool-merge') + conf.env['INTLTOOL']=intltool + def getstr(varname): + return getattr(Options.options,varname,'') + prefix=conf.env['PREFIX'] + datadir=getstr('datadir') + if not datadir:datadir=os.path.join(prefix,'share') + conf.define('LOCALEDIR',os.path.join(datadir,'locale')) + conf.define('DATADIR',datadir) + conf.check_header('locale.h','HAVE_LOCALE_H') +def set_options(opt): + opt.add_option('--want-rpath',type='int',default=1,dest='want_rpath',help='set rpath to 1 or 0 [Default 1]') + opt.add_option('--datadir',type='string',default='',dest='datadir',help='read-only application data') + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/compiler_cc.py @@ -0,0 +1,33 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,sys,imp,types,ccroot +import optparse +import Utils,checks,Configure,Options +c_compiler={'win32':['msvc','gcc'],'cygwin':['gcc'],'darwin':['gcc'],'aix5':['gcc'],'linux':['gcc','suncc'],'sunos':['suncc','gcc'],'irix':['gcc'],'hpux':['gcc'],'default':['gcc']} +def __list_possible_compiler(platform): + try: + return c_compiler[platform] + except KeyError: + return c_compiler["default"] +def detect(conf): + try:test_for_compiler=Options.options.check_c_compiler + except AttributeError:raise Configure.ConfigurationError("Add set_options(opt): opt.tool_options('compiler_cc')") + for c_compiler in test_for_compiler.split(): + conf.check_tool(c_compiler) + if conf.env['CC']: + conf.check_message("%s"%c_compiler,'',True) + conf.env["COMPILER_CC"]="%s"%c_compiler + return + conf.check_message("%s"%c_compiler,'',False) + conf.env["COMPILER_CC"]=None +def set_options(opt): + detected_platform=checks.detect_platform(None) + possible_compiler_list=__list_possible_compiler(detected_platform) + test_for_compiler=str(" ").join(possible_compiler_list) + cc_compiler_opts=opt.add_option_group("C Compiler Options") + cc_compiler_opts.add_option('--check-c-compiler',default="%s"%test_for_compiler,help='On this platform (%s) the following C-Compiler will be checked by default: "%s"'%(detected_platform,test_for_compiler),dest="check_c_compiler") + for c_compiler in test_for_compiler.split(): + opt.tool_options('%s'%c_compiler,option_group=cc_compiler_opts) + opt.add_option('-d','--debug-level',action='store',default=ccroot.DEBUG_LEVELS.RELEASE,help="Specify the debug level, does nothing if CFLAGS is set in the environment. [Allowed Values: '%s']"%"', '".join(ccroot.DEBUG_LEVELS.ALL),choices=ccroot.DEBUG_LEVELS.ALL,dest='debug_level') + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/ocaml.py @@ -0,0 +1,253 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,re +import TaskGen,Utils,Task,Build +from Logs import error +from TaskGen import taskgen,feature,before,after,extension +EXT_MLL=['.mll'] +EXT_MLY=['.mly'] +EXT_MLI=['.mli'] +EXT_MLC=['.c'] +EXT_ML=['.ml'] +open_re=re.compile('open ([a-zA-Z]+);;',re.M) +foo=re.compile(r"""(\(\*)|(\*\))|("(\\.|[^"\\])*"|'(\\.|[^'\\])*'|.[^()*"'\\]*)""",re.M) +def filter_comments(txt): + meh=[0] + def repl(m): + if m.group(1):meh[0]+=1 + elif m.group(2):meh[0]-=1 + elif not meh[0]:return m.group(0) + return'' + return foo.sub(repl,txt) +def scan(self): + node=self.inputs[0] + code=filter_comments(node.read(self.env)) + global open_re + names=[] + import_iterator=open_re.finditer(code) + if import_iterator: + for import_match in import_iterator: + names.append(import_match.group(1)) + found_lst=[] + raw_lst=[] + for name in names: + nd=None + for x in self.incpaths: + nd=x.find_resource(name.lower()+'.ml') + if nd: + found_lst.append(nd) + break + else: + raw_lst.append(name) + return(found_lst,raw_lst) +native_lst=['native','all','c_object'] +bytecode_lst=['bytecode','all'] +class ocaml_taskgen(TaskGen.task_gen): + def __init__(self,*k,**kw): + TaskGen.task_gen.__init__(self,*k,**kw) + self.type='all' + self._incpaths_lst=[] + self._bld_incpaths_lst=[] + self._mlltasks=[] + self._mlytasks=[] + self.mlitasks=[] + self.native_tasks=[] + self.bytecode_tasks=[] + self.linktasks=[] + self.bytecode_env=None + self.native_env=None + self.compiled_tasks=[] + self.includes='' + self.uselib='' + self.out_nodes=[] + self.are_deps_set=0 + if not self.type in['bytecode','native','all','c_object']: + print'type for camlobj is undefined '+self.type + self.type='all' +TaskGen.bind_feature('ocaml','apply_core') +def init_envs_ml(self): + self.islibrary=getattr(self,'islibrary',False) + global native_lst,bytecode_lst + self.native_env=None + if self.type in native_lst: + self.native_env=self.env.copy() + if self.islibrary:self.native_env['OCALINKFLAGS']='-a' + self.bytecode_env=None + if self.type in bytecode_lst: + self.bytecode_env=self.env.copy() + if self.islibrary:self.bytecode_env['OCALINKFLAGS']='-a' + if self.type=='c_object': + self.native_env['OCALINK']=self.native_env['OCALINK']+' -output-obj' +def apply_incpaths_ml(self): + inc_lst=self.includes.split() + lst=self._incpaths_lst + tree=Build.bld + for dir in inc_lst: + node=self.path.find_dir(dir) + if not node: + error("node not found: "+str(dir)) + continue + Build.bld.rescan(node) + if not node in lst:lst.append(node) + self._bld_incpaths_lst.append(node) +def apply_vars_ml(self): + for i in self._incpaths_lst: + if self.bytecode_env: + self.bytecode_env.append_value('OCAMLPATH','-I %s'%i.srcpath(self.env)) + self.bytecode_env.append_value('OCAMLPATH','-I %s'%i.bldpath(self.env)) + if self.native_env: + self.native_env.append_value('OCAMLPATH','-I %s'%i.bldpath(self.env)) + self.native_env.append_value('OCAMLPATH','-I %s'%i.srcpath(self.env)) + varnames=['INCLUDES','OCAMLFLAGS','OCALINKFLAGS','OCALINKFLAGS_OPT'] + for name in self.uselib.split(): + for vname in varnames: + cnt=self.env[vname+'_'+name] + if cnt: + if self.bytecode_env:self.bytecode_env.append_value(vname,cnt) + if self.native_env:self.native_env.append_value(vname,cnt) +def apply_link_ml(self): + if self.bytecode_env: + ext=self.islibrary and'.cma'or'.run' + linktask=self.create_task('ocalink') + linktask.bytecode=1 + linktask.set_outputs(self.path.find_or_declare(self.target+ext)) + linktask.obj=self + linktask.env=self.bytecode_env + 
self.linktasks.append(linktask) + if self.native_env: + if getattr(self,'c_objects',''):ext='.o' + elif self.islibrary:ext='.cmxa' + else:ext='' + linktask=self.create_task('ocalinkx') + linktask.set_outputs(self.path.find_or_declare(self.target+ext)) + linktask.obj=self + linktask.env=self.native_env + self.linktasks.append(linktask) + self.out_nodes+=linktask.outputs + if self.type=='c_object':self.compiled_tasks.append(linktask) +def mll_hook(self,node): + mll_task=self.create_task('ocamllex',self.native_env) + mll_task.set_inputs(node) + mll_task.set_outputs(node.change_ext('.ml')) + self.mlltasks.append(mll_task) + self.allnodes.append(mll_task.outputs[0]) +def mly_hook(self,node): + mly_task=self.create_task('ocamlyacc',self.native_env) + mly_task.set_inputs(node) + mly_task.set_outputs([node.change_ext('.ml'),node.change_ext('.mli')]) + self._mlytasks.append(mly_task) + self.allnodes.append(mly_task.outputs[0]) + task=self.create_task('ocamlcmi',self.native_env) + task.set_inputs(mly_task.outputs[1]) + task.set_outputs(mly_task.outputs[1].change_ext('.cmi')) +def mli_hook(self,node): + task=self.create_task('ocamlcmi',self.native_env) + task.set_inputs(node) + task.set_outputs(node.change_ext('.cmi')) + self.mlitasks.append(task) +def mlc_hook(self,node): + task=self.create_task('ocamlcc',self.native_env) + task.set_inputs(node) + task.set_outputs(node.change_ext('.o')) + self.out_nodes+=task.outputs +def ml_hook(self,node): + if self.native_env: + task=self.create_task('ocamlx',self.native_env) + task.set_inputs(node) + task.set_outputs(node.change_ext('.cmx')) + task.obj=self + task.incpaths=self._bld_incpaths_lst + self.native_tasks.append(task) + if self.bytecode_env: + task=self.create_task('ocaml',self.bytecode_env) + task.set_inputs(node) + task.obj=self + task.bytecode=1 + task.incpaths=self._bld_incpaths_lst + task.set_outputs(node.change_ext('.cmo')) + self.bytecode_tasks.append(task) +def compile_may_start(self): + if not getattr(self.obj,'flag_deps',''): + self.obj.flag_deps=1 + if getattr(self,'bytecode',''):alltasks=self.obj.bytecode_tasks + else:alltasks=self.obj.native_tasks + self.signature() + tree=Build.bld + env=self.env + for node in self.inputs: + lst=tree.node_deps[self.unique_id()] + for depnode in lst: + for t in alltasks: + if t==self:continue + if depnode in t.inputs: + self.set_run_after(t) + delattr(self,'sign_all') + self.signature() + return Task.Task.runnable_status(self) +b=Task.simple_task_type +cls=b('ocamlx','${OCAMLOPT} ${OCAMLPATH} ${OCAMLFLAGS} ${INCLUDES} -c -o ${TGT} ${SRC}',color='GREEN') +cls.runnable_status=compile_may_start +cls.scan=scan +b=Task.simple_task_type +cls=b('ocaml','${OCAMLC} ${OCAMLPATH} ${OCAMLFLAGS} ${INCLUDES} -c -o ${TGT} ${SRC}',color='GREEN') +cls.runnable_status=compile_may_start +cls.scan=scan +b('ocamlcmi','${OCAMLC} ${OCAMLPATH} ${INCLUDES} -o ${TGT} -c ${SRC}',color='BLUE',before="ocaml ocamlcc") +b('ocamlcc','cd ${TGT[0].bld_dir(env)} && ${OCAMLOPT} ${OCAMLFLAGS} ${OCAMLPATH} ${INCLUDES} -c ${SRC[0].abspath(env)}',color='GREEN') +b('ocamllex','${OCAMLLEX} ${SRC} -o ${TGT}',color='BLUE',before="ocamlcmi ocaml ocamlcc") +b('ocamlyacc','${OCAMLYACC} -b ${TGT[0].bld_base(env)} ${SRC}',color='BLUE',before="ocamlcmi ocaml ocamlcc") +def link_may_start(self): + if not getattr(self,'order',''): + if getattr(self,'bytecode',0):alltasks=self.obj.bytecode_tasks + else:alltasks=self.obj.native_tasks + seen=[] + pendant=[]+alltasks + while pendant: + task=pendant.pop(0) + if task in seen:continue + for x in task.run_after: + 
if not x in seen: + pendant.append(task) + break + else: + seen.append(task) + self.inputs=[x.outputs[0]for x in seen] + self.order=1 + return Task.Task.runnable_status(self) +act=b('ocalink','${OCAMLC} -o ${TGT} ${INCLUDES} ${OCALINKFLAGS} ${SRC}',color='YELLOW',after="ocaml ocamlcc") +act.runnable_status=link_may_start +act=b('ocalinkx','${OCAMLOPT} -o ${TGT} ${INCLUDES} ${OCALINKFLAGS_OPT} ${SRC}',color='YELLOW',after="ocamlx ocamlcc") +act.runnable_status=link_may_start +def detect(conf): + opt=conf.find_program('ocamlopt',var='OCAMLOPT') + occ=conf.find_program('ocamlc',var='OCAMLC') + if(not opt)or(not occ): + conf.fatal('The objective caml compiler was not found:\ninstall it or make it available in your PATH') + conf.env['OCAMLC']=occ + conf.env['OCAMLOPT']=opt + conf.env['OCAMLLEX']=conf.find_program('ocamllex',var='OCAMLLEX') + conf.env['OCAMLYACC']=conf.find_program('ocamlyacc',var='OCAMLYACC') + conf.env['OCAMLFLAGS']='' + conf.env['OCAMLLIB']=Utils.cmd_output(conf.env['OCAMLC']+' -where').strip()+os.sep + conf.env['LIBPATH_OCAML']=Utils.cmd_output(conf.env['OCAMLC']+' -where').strip()+os.sep + conf.env['CPPPATH_OCAML']=Utils.cmd_output(conf.env['OCAMLC']+' -where').strip()+os.sep + conf.env['LIB_OCAML']='camlrun' + +taskgen(init_envs_ml) +feature('ocaml')(init_envs_ml) +taskgen(apply_incpaths_ml) +feature('ocaml')(apply_incpaths_ml) +before('apply_vars_ml')(apply_incpaths_ml) +after('init_envs_ml')(apply_incpaths_ml) +taskgen(apply_vars_ml) +feature('ocaml')(apply_vars_ml) +before('apply_core')(apply_vars_ml) +taskgen(apply_link_ml) +feature('ocaml')(apply_link_ml) +after('apply_core')(apply_link_ml) +extension(EXT_MLL)(mll_hook) +extension(EXT_MLY)(mly_hook) +extension(EXT_MLI)(mli_hook) +extension(EXT_MLC)(mlc_hook) +extension(EXT_ML)(ml_hook) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/flex.py @@ -0,0 +1,14 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import TaskGen +def decide_ext(self,node): + if'cxx'in self.features:return'.lex.cc' + else:return'.lex.c' +TaskGen.declare_chain(name='flex',action='${FLEX} -o${TGT} ${FLEXFLAGS} ${SRC}',ext_in='.l',decider=decide_ext,before='cc cxx',) +def detect(conf): + flex=conf.find_program('flex',var='FLEX') + if not flex:conf.fatal("flex was not found") + v=conf.env + v['FLEXFLAGS']='' + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/lua.py @@ -0,0 +1,13 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import TaskGen +TaskGen.declare_chain(name='luac',action='${LUAC} -s -o ${TGT} ${SRC}',ext_in='.lua',ext_out='.luac',reentrant=0,install='LUADIR',) +class lua_taskgen(TaskGen.task_gen): + def __init__(self): + TaskGen.task_gen.__init__(self) + self.chmod=0755 +def detect(conf): + luac=conf.find_program('luac',var='LUAC') + if not luac:conf.fatal('cannot find the compiler "luac"') + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/gob2.py @@ -0,0 +1,11 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import TaskGen +TaskGen.declare_chain(name='gob2',action='${GOB2} -o ${TGT[0].bld_dir(env)} ${GOB2FLAGS} ${SRC}',ext_in='.gob',ext_out='.c') +def detect(conf): + gob2=conf.find_program('gob2',var='GOB2') + if not gob2:conf.fatal('could not find the gob2 compiler') + conf.env['GOB2']=gob2 + conf.env['GOB2FLAGS']='' + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/misc.py @@ -0,0 +1,321 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import shutil,re,os,types +import TaskGen,Node,Task,Utils,Build +import pproc +from Logs import debug +def copy_func(tsk): + env=tsk.env + infile=tsk.inputs[0].abspath(env) + outfile=tsk.outputs[0].abspath(env) + try: + shutil.copy2(infile,outfile) + except OSError,IOError: + return 1 + else: + if tsk.chmod:os.chmod(outfile,tsk.chmod) + return 0 +def action_process_file_func(tsk): + if not tsk.fun:raise Utils.WafError('task must have a function attached to it for copy_func to work!') + return tsk.fun(tsk) +class cmd_taskgen(TaskGen.task_gen): + def __init__(self,type='none'): + TaskGen.task_gen.__init__(self) + self.type=type + self.fun=None + def apply(self): + if not self.fun:raise Utils.WafError('cmdobj needs a function!') + tsk=Task.TaskBase() + tsk.fun=self.fun + tsk.env=self.env + self.tasks.append(tsk) + tsk.install_path=self.install_path +class copy_taskgen(TaskGen.task_gen): + def __init__(self,type='none'): + TaskGen.task_gen.__init__(self) + self.source='' + self.target='' + self.chmod='' + self.fun=copy_func + self.default_install_path=0 + self.env=Build.bld.env.copy() + def apply(self): + lst=self.to_list(self.source) + for filename in lst: + node=self.path.find_resource(filename) + if not node:raise Utils.WafError('cannot find input file %s for processing'%filename) + target=self.target + if not target or len(lst)>1:target=node.name + newnode=self.path.find_or_declare(target) + tsk=self.create_task('copy') + tsk.set_inputs(node) + tsk.set_outputs(newnode) + tsk.fun=self.fun + tsk.chmod=self.chmod + if not tsk.env: + tsk.debug() + raise Utils.WafError('task without an environment') +def subst_func(tsk): + m4_re=re.compile('@(\w+)@',re.M) + env=tsk.env + infile=tsk.inputs[0].abspath(env) + outfile=tsk.outputs[0].abspath(env) + file=open(infile,'r') + code=file.read() + file.close() + code=code.replace('%','%%') + s=m4_re.sub(r'%(\1)s',code) + dict=tsk.dict + if not dict: + names=m4_re.findall(code) + for i in names: + if env[i]and type(env[i])is types.ListType: + dict[i]=" ".join(env[i]) + else:dict[i]=env[i] + file=open(outfile,'w') + file.write(s%dict) + file.close() + if tsk.chmod:os.chmod(outfile,tsk.chmod) + return 0 +class subst_taskgen(TaskGen.task_gen): + def __init__(self,type='none'): + TaskGen.task_gen.__init__(self) + self.fun=subst_func + self.chmod='' + self.dict={} + self.default_install_path=0 + def apply(self): + lst=self.to_list(self.source) + for filename in lst: + node=self.path.find_resource(filename) + if not node:raise Utils.WafError('cannot find input file %s for processing'%filename) + newnode=node.change_ext('') + if self.dict and not self.env['DICT_HASH']: + self.env=self.env.copy() + self.env['DICT_HASH']=hash(str(self.dict)) + tsk=self.create_task('copy') + tsk.set_inputs(node) + tsk.set_outputs(newnode) + tsk.fun=self.fun + tsk.dict=self.dict + tsk.dep_vars=['DICT_HASH'] + tsk.install_path=self.install_path + tsk.chmod=self.chmod + if not tsk.env: + tsk.debug() + raise Utils.WafError('task without an environment') +class CmdArg(object): + pass +class CmdFileArg(CmdArg): + def __init__(self,file_name,template=None): + CmdArg.__init__(self) + self.file_name=file_name + if template is None: + self.template='%s' + else: + self.template=template + self.node=None +class CmdInputFileArg(CmdFileArg): + def find_node(self,base_path): + assert isinstance(base_path,Node.Node) + self.node=base_path.find_resource(self.file_name) + if self.node is None: + raise Utils.WafError("Input file %s not found in 
"%(self.file_name,base_path)) + def get_path(self,env,absolute): + if absolute: + return self.template%self.node.abspath(env) + else: + return self.template%self.node.srcpath(env) +class CmdOutputFileArg(CmdFileArg): + def find_node(self,base_path): + assert isinstance(base_path,Node.Node) + self.node=base_path.find_or_declare(self.file_name) + if self.node is None: + raise Utils.WafError("Output file %s not found in "%(self.file_name,base_path)) + def get_path(self,env,absolute): + if absolute: + return self.template%self.node.abspath(env) + else: + return self.template%self.node.bldpath(env) +class CmdDirArg(CmdArg): + def __init__(self,dir_name,template=None): + CmdArg.__init__(self) + self.dir_name=dir_name + self.node=None + if template is None: + self.template='%s' + else: + self.template=template + def find_node(self,base_path): + assert isinstance(base_path,Node.Node) + self.node=base_path.find_dir(self.dir_name) + if self.node is None: + raise Utils.WafError("Directory %s not found in "%(self.dir_name,base_path)) +class CmdInputDirArg(CmdDirArg): + def get_path(self,dummy_env,dummy_absolute): + return self.template%(self.node.abspath(),) +class CmdOutputDirArg(CmdDirArg): + def get_path(self,env,dummy_absolute): + return self.template%(self.node.abspath(env),) +class command_output(Task.Task): + m_color="BLUE" + def __init__(self,env,command,command_node,command_args,stdin,stdout,cwd,os_env,stderr): + Task.Task.__init__(self,env,normal=1) + assert isinstance(command,(str,Node.Node)) + self.command=command + self.command_args=command_args + self.stdin=stdin + self.stdout=stdout + self.cwd=cwd + self.os_env=os_env + self.stderr=stderr + if command_node is not None:self.dep_nodes=[command_node] + self.dep_vars=[] + def run(self): + task=self + assert len(task.inputs)>0 + def input_path(node,template): + if task.cwd is None: + return template%node.bldpath(task.env) + else: + return template%node.abspath() + def output_path(node,template): + fun=node.abspath + if task.cwd is None:fun=node.bldpath + return template%fun(task.env) + if isinstance(task.command,Node.Node): + argv=[input_path(task.command,'%s')] + else: + argv=[task.command] + for arg in task.command_args: + if isinstance(arg,str): + argv.append(arg) + else: + assert isinstance(arg,CmdArg) + argv.append(arg.get_path(task.env,(task.cwd is not None))) + if task.stdin: + stdin=file(input_path(task.stdin,'%s')) + else: + stdin=None + if task.stdout: + stdout=file(output_path(task.stdout,'%s'),"w") + else: + stdout=None + if task.stderr: + stderr=file(output_path(task.stderr,'%s'),"w") + else: + stderr=None + if task.cwd is None: + cwd=('None (actually %r)'%os.getcwd()) + else: + cwd=repr(task.cwd) + debug("command-output: cwd=%s, stdin=%r, stdout=%r, argv=%r"%(cwd,stdin,stdout,argv)) + if task.os_env is None: + os_env=os.environ + else: + os_env=task.os_env + command=pproc.Popen(argv,stdin=stdin,stdout=stdout,stderr=stderr,cwd=task.cwd,env=os_env) + return command.wait() +class cmd_output_taskgen(TaskGen.task_gen): + def __init__(self,*k): + TaskGen.task_gen.__init__(self,*k) + self.stdin=None + self.stdout=None + self.stderr=None + self.command=None + self.command_is_external=False + self.argv=[] + self.dependencies=[] + self.dep_vars=[] + self.hidden_inputs=[] + self.hidden_outputs=[] + self.cwd=None + self.os_env=None + def apply(self): + if self.command is None: + raise Utils.WafError("command-output missing command") + if self.command_is_external: + cmd=self.command + cmd_node=None + else: + 
cmd_node=self.path.find_resource(self.command) + assert cmd_node is not None,('''Could not find command '%s' in source tree. +Hint: if this is an external command, +use command_is_external=True''')%(self.command,) + cmd=cmd_node + if self.cwd is None: + cwd=None + else: + assert isinstance(cwd,CmdDirArg) + self.cwd.find_node(self.path) + args=[] + inputs=[] + outputs=[] + for arg in self.argv: + if isinstance(arg,CmdArg): + arg.find_node(self.path) + if isinstance(arg,CmdInputFileArg): + inputs.append(arg.node) + if isinstance(arg,CmdOutputFileArg): + outputs.append(arg.node) + if self.stdout is None: + stdout=None + else: + assert isinstance(self.stdout,basestring) + stdout=self.path.find_or_declare(self.stdout) + if stdout is None: + raise Utils.WafError("File %s not found"%(self.stdout,)) + outputs.append(stdout) + if self.stderr is None: + stderr=None + else: + assert isinstance(self.stderr,basestring) + stderr=self.path.find_or_declare(self.stderr) + if stderr is None: + Params.fatal("File %s not found"%(self.stderr,)) + outputs.append(stderr) + if self.stdin is None: + stdin=None + else: + assert isinstance(self.stdin,basestring) + stdin=self.path.find_resource(self.stdin) + if stdin is None: + raise Utils.WafError("File %s not found"%(self.stdin,)) + inputs.append(stdin) + for hidden_input in self.to_list(self.hidden_inputs): + node=self.path.find_resource(hidden_input) + if node is None: + raise Utils.WafError("File %s not found in dir %s"%(hidden_input,self.path)) + inputs.append(node) + for hidden_output in self.to_list(self.hidden_outputs): + node=self.path.find_or_declare(hidden_output) + if node is None: + raise Utils.WafError("File %s not found in dir %s"%(hidden_output,self.path)) + outputs.append(node) + if not inputs: + raise Utils.WafError("command-output objects must have at least one input file") + if not outputs: + raise Utils.WafError("command-output objects must have at least one output file") + task=command_output(self.env,cmd,cmd_node,self.argv,stdin,stdout,cwd,self.os_env,stderr) + Utils.copy_attrs(self,task,'before after ext_in ext_out',only_if_set=True) + self.tasks.append(task) + task.set_inputs(inputs) + task.set_outputs(outputs) + task.dep_vars=self.to_list(self.dep_vars) + for dep in self.dependencies: + assert dep is not self + dep.post() + for dep_task in dep.tasks: + task.set_run_after(dep_task) + def input_file(self,file_name,template='%s'): + return CmdInputFileArg(file_name,template) + def output_file(self,file_name,template='%s'): + return CmdOutputFileArg(file_name,template) + def input_dir(self,dir_name,template=None): + return CmdInputDirArg(dir_name) + def output_dir(self,dir_name,template=None): + return CmdOutputDirArg(dir_name,template) +Task.task_type_from_func('copy',vars=[],func=action_process_file_func) +TaskGen.task_gen.classes['command-output']=cmd_output_taskgen + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/batched_cc.py @@ -0,0 +1,75 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +EXT_C=['.c','.cc','.cpp','.cxx'] +import shutil,os +import TaskGen,Task,ccroot,Build +from TaskGen import extension +from Constants import* +class TaskMaster(Task.Task): + def __init__(self,action_name,env,normal=1,master=None): + Task.Task.__init__(self,env,normal=normal) + self.slaves=[] + self.inputs2=[] + self.outputs2=[] + def add_slave(self,slave): + self.slaves.append(slave) + self.set_run_after(slave) + def runnable_status(self): + for t in self.run_after: + if not t.hasrun:return ASK_LATER + for t in self.slaves: + self.inputs.append(t.inputs[0]) + self.outputs.append(t.outputs[0]) + if t.must_run: + self.inputs2.append(t.inputs[0]) + self.outputs2.append(t.outputs[0]) + return Task.Task.runnable_status(self) + def run(self): + tmpinputs=self.inputs + self.inputs=self.inputs2 + tmpoutputs=self.outputs + self.outputs=self.outputs2 + ret=self.action.run(self) + env=self.env + rootdir=Build.bld.srcnode.abspath(env) + for i in self.outputs: + name=i.name + if name[-1]=="s":name=name[:-1] + shutil.move(name,i.bldpath(env)) + self.inputs=tmpinputs + self.outputs=tmpoutputs + return ret +class TaskSlave(Task.Task): + def __init__(self,action_name,env,normal=1,master=None): + Task.Task.__init__(self,env,normal) + self.master=master + def prepare(self): + self.display="* skipping "+self.inputs[0].name + def update_stat(self): + self.executed=1 + def runnable_status(self): + self.must_run=Task.Task.must_run(self) + return self.must_run + def run(self): + return 0 + def can_retrieve_cache(self,sig): + return None +def create_task_cxx_new(self,node): + try: + mm=self.mastertask + except AttributeError: + mm=TaskMaster("all_"+self.type_initials,self.env) + self.mastertask=mm + task=TaskSlave(self.type_initials,self.env,40,master=mm) + self.tasks.append(task) + mm.add_slave(task) + task.set_inputs(node) + task.set_outputs(node.change_ext('.o')) + self.compiled_tasks.append(task) +cc_str='${CC} ${CCFLAGS} ${CPPFLAGS} ${_CCINCFLAGS} ${_CCDEFFLAGS} -c ${SRC}' +Task.simple_task_type('all_cc',cc_str,'GREEN') +cpp_str='${CXX} ${CXXFLAGS} ${CPPFLAGS} ${_CXXINCFLAGS} ${_CXXDEFFLAGS} -c ${SRC}' +Task.simple_task_type('all_cpp',cpp_str,color='GREEN') + +extension(EXT_C)(create_task_cxx_new) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/cs.py @@ -0,0 +1,48 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import TaskGen,Utils,Task +from Logs import error +g_types_lst=['program','library'] +class cs_taskgen(TaskGen.task_gen): + def __init__(self,*k): + TaskGen.task_gen.__init__(self,*k) + self.type=k[1] + self.source='' + self.target='' + self.flags='' + self.assemblies='' + self.resources='' + self.uselib='' + self._flag_vars=['FLAGS','ASSEMBLIES'] + if not self.type in g_types_lst: + error('type for csobj is undefined '+type) + type='program' + def apply(self): + self.apply_uselib() + assemblies_flags=[] + for i in self.to_list(self.assemblies)+self.env['ASSEMBLIES']: + assemblies_flags+='/r:'+i + self.env['_ASSEMBLIES']+=assemblies_flags + for i in self.to_list(self.resources): + self.env['_RESOURCES'].append('/resource:'+i) + self.env['_FLAGS']+=self.to_list(self.flags)+self.env['FLAGS'] + curnode=self.path + nodes=[] + for i in self.to_list(self.source): + nodes.append(curnode.find_resource(i)) + task=self.create_task('mcs') + task.inputs=nodes + task.set_outputs(self.path.find_or_declare(self.target)) + def apply_uselib(self): + if not self.uselib: + return + for var in self.to_list(self.uselib): + for v in self._flag_vars: + val=self.env[v+'_'+var] + if val:self.env.append_value(v,val) +Task.simple_task_type('mcs','${MCS} ${SRC} /out:${TGT} ${_FLAGS} ${_ASSEMBLIES} ${_RESOURCES}',color='YELLOW') +def detect(conf): + mcs=conf.find_program('mcs',var='MCS') + if not mcs:mcs=conf.find_program('gmcs',var='MCS') + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/dmd.py @@ -0,0 +1,49 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import sys +import ar +def find_dmd(conf): + v=conf.env + d_compiler=None + if v['D_COMPILER']: + d_compiler=v['D_COMPILER'] + if not d_compiler:d_compiler=conf.find_program('dmd',var='D_COMPILER') + if not d_compiler:return 0 + v['D_COMPILER']=d_compiler +def common_flags(conf): + v=conf.env + v['DFLAGS']={'gdc':[],'dmd':['-version=Posix']} + v['D_SRC_F']='' + v['D_TGT_F']='-c -of' + v['DPATH_ST']='-I%s' + v['D_LINKER']=v['D_COMPILER'] + v['DLNK_SRC_F']='' + v['DLNK_TGT_F']='-of' + v['DLIB_ST']='-L-l%s' + v['DLIBPATH_ST']='-L-L%s' + v['DFLAGS_OPTIMIZED']=['-O'] + v['DFLAGS_DEBUG']=['-g','-debug'] + v['DFLAGS_ULTRADEBUG']=['-g','-debug'] + v['DLINKFLAGS']=['-quiet'] + v['D_shlib_DFLAGS']=['-fPIC'] + v['D_shlib_LINKFLAGS']=['-L-shared'] + v['DHEADER_ext']='.di' + v['D_HDR_F']='-H -Hf' + if sys.platform=="win32": + v['D_program_PATTERN']='%s.exe' + v['D_shlib_PATTERN']='lib%s.dll' + v['D_staticlib_PATTERN']='lib%s.a' + else: + v['D_program_PATTERN']='%s' + v['D_shlib_PATTERN']='lib%s.so' + v['D_staticlib_PATTERN']='lib%s.a' +def detect(conf): + v=conf.env + find_dmd(conf) + ar.find_ar(conf) + conf.check_tool('d') + common_flags(conf) +def set_options(opt): + pass + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/python.py @@ -0,0 +1,275 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,sys +import TaskGen,Utils,Utils,Runner,Options,Build +from Logs import debug,warn +from TaskGen import extension,taskgen,before,after,feature +from Configure import conf +import pproc +EXT_PY=['.py'] +def init_pyext(self): + self.default_install_path='${PYTHONDIR}' + self.uselib=self.to_list(self.uselib) + if not'PYEXT'in self.uselib: + self.uselib.append('PYEXT') + self.env['MACBUNDLE']=True +def pyext_shlib_ext(self): + self.env['shlib_PATTERN']=self.env['pyext_PATTERN'] +def init_pyembed(self): + self.uselib=self.to_list(self.uselib) + if not'PYEMBED'in self.uselib: + self.uselib.append('PYEMBED') +def process_py(self,node): + pass +class py_taskgen(TaskGen.task_gen): + def __init__(self,env=None): + TaskGen.task_gen.__init__(self) + self.default_install_path='${PYTHONDIR}' + self.chmod=0644 + def install(self): + files_to_install=[] + for filename in self.to_list(self.source): + node=self.path.find_resource(filename) + if node is not None: + files_to_install.append(node.abspath()) + else: + node=self.path.find_or_declare(filename) + if node is None: + raise Utils.WafError("Cannot install file %s: not found in %s"%(filename,self.path)) + else: + files_to_install.append(node.abspath(self.env)) + installed_files=Build.bld.install_files(self.install_path,files_to_install,self.env,self.chmod) + if not installed_files: + return + if Options.commands['uninstall']: + print"* removing byte compiled python files" + for fname in installed_files: + try: + os.remove(fname+'c') + except OSError: + pass + try: + os.remove(fname+'o') + except OSError: + pass + else: + if self.env['PYC']or self.env['PYO']: + print"* byte compiling python files" + if self.env['PYC']: + program=(""" +import sys, py_compile +for pyfile in sys.argv[1:]: + py_compile.compile(pyfile, pyfile + 'c') +""") + argv=[self.env['PYTHON'],"-c",program] + argv.extend(installed_files) + retval=pproc.Popen(argv).wait() + if retval: + raise Utils.WafError("bytecode compilation failed") + if self.env['PYO']: + program=(""" +import sys, py_compile +for pyfile in sys.argv[1:]: + py_compile.compile(pyfile, pyfile + 'o') +""") + argv=[self.env['PYTHON'],self.env['PYFLAGS_OPT'],"-c",program] + argv.extend(installed_files) + retval=pproc.Popen(argv).wait() + if retval: + raise Utils.WafError("bytecode compilation failed") +def _get_python_variables(python_exe,variables,imports=['import sys']): + program=list(imports) + program.append('') + for v in variables: + program.append("print repr(%s)"%v) + proc=pproc.Popen([python_exe,"-c",'\n'.join(program)],stdout=pproc.PIPE) + output=proc.communicate()[0].split("\n") + if proc.returncode: + if Logs.verbose: + warn("Python program to extract python configuration variables failed:\n%s"%'\n'.join(["line %03i: %s"%(lineno+1,line)for lineno,line in enumerate(program)])) + raise ValueError + return_values=[] + for s in output: + s=s.strip() + if not s: + continue + if s=='None': + return_values.append(None) + elif s[0]=="'"and s[-1]=="'": + return_values.append(s[1:-1]) + elif s[0].isdigit(): + return_values.append(int(s)) + else:break + return return_values +def check_python_headers(conf): + env=conf.env + python=env['PYTHON'] + assert python,("python is %r !"%(python,)) + import checks + if checks.detect_platform(None)=='darwin': + conf.check_tool('osx') + try: + v='prefix SO SYSLIBS SHLIBS LIBDIR LIBPL INCLUDEPY Py_ENABLE_SHARED'.split() + 
(python_prefix,python_SO,python_SYSLIBS,python_SHLIBS,python_LIBDIR,python_LIBPL,INCLUDEPY,Py_ENABLE_SHARED)=_get_python_variables(python,["get_config_var('%s')"%x for x in v],['from distutils.sysconfig import get_config_var']) + except ValueError: + conf.fatal("Python development headers not found (-v for details).") + Runner.print_log("""Configuration returned from %r: +python_prefix = %r +python_SO = %r +python_SYSLIBS = %r +python_SHLIBS = %r +python_LIBDIR = %r +python_LIBPL = %r +INCLUDEPY = %r +Py_ENABLE_SHARED = %r +"""%(python,python_prefix,python_SO,python_SYSLIBS,python_SHLIBS,python_LIBDIR,python_LIBPL,INCLUDEPY,Py_ENABLE_SHARED)) + env['pyext_PATTERN']='%s'+python_SO + if python_SYSLIBS is not None: + for lib in python_SYSLIBS.split(): + if lib.startswith('-l'): + lib=lib[2:] + env.append_value('LIB_PYEMBED',lib) + if python_SHLIBS is not None: + for lib in python_SHLIBS.split(): + if lib.startswith('-l'): + lib=lib[2:] + env.append_value('LIB_PYEMBED',lib) + lib=conf.create_library_configurator() + lib.name='python'+env['PYTHON_VERSION'] + lib.uselib='PYEMBED' + lib.code=''' +#ifdef __cplusplus +extern "C" { +#endif + void Py_Initialize(void); + void Py_Finalize(void); +#ifdef __cplusplus +} +#endif +int main(int argc, char *argv[]) { Py_Initialize(); Py_Finalize(); return 0; } +''' + if python_LIBDIR is not None: + lib.path=[python_LIBDIR] + result=lib.run() + else: + result=0 + if not result: + if python_LIBPL is not None: + lib.path=[python_LIBPL] + result=lib.run() + else: + result=0 + if not result: + lib.path=[os.path.join(python_prefix,"libs")] + lib.name='python'+env['PYTHON_VERSION'].replace('.','') + result=lib.run() + if result: + env['LIBPATH_PYEMBED']=lib.path + env.append_value('LIB_PYEMBED',lib.name) + if(sys.platform=='win32'or sys.platform.startswith('os2')or sys.platform=='darwin'or Py_ENABLE_SHARED): + env['LIBPATH_PYEXT']=env['LIBPATH_PYEMBED'] + env['LIB_PYEXT']=env['LIB_PYEMBED'] + python_config=conf.find_program('python%s-config'%('.'.join(env['PYTHON_VERSION'].split('.')[:2])),var='PYTHON_CONFIG') + if python_config: + includes=[] + for incstr in os.popen("%s %s --includes"%(python,python_config)).readline().strip().split(): + if(incstr.startswith('-I')or incstr.startswith('/I')): + incstr=incstr[2:] + if incstr not in includes: + includes.append(incstr) + env['CPPPATH_PYEXT']=list(includes) + env['CPPPATH_PYEMBED']=list(includes) + else: + env['CPPPATH_PYEXT']=[INCLUDEPY] + env['CPPPATH_PYEMBED']=[INCLUDEPY] + if env['CC']: + version=os.popen("%s --version"%env['CC']).readline() + if'(GCC)'in version: + env.append_value('CCFLAGS_PYEMBED','-fno-strict-aliasing') + env.append_value('CCFLAGS_PYEXT','-fno-strict-aliasing') + if env['CXX']: + version=os.popen("%s --version"%env['CXX']).readline() + if'(GCC)'in version: + env.append_value('CXXFLAGS_PYEMBED','-fno-strict-aliasing') + env.append_value('CXXFLAGS_PYEXT','-fno-strict-aliasing') + header=conf.create_header_configurator() + header.name='Python.h' + header.define='HAVE_PYTHON_H' + header.uselib='PYEXT' + header.code="#include \nint main(int argc, char *argv[]) { Py_Initialize(); Py_Finalize(); return 0; }" + result=header.run() + if not result: + conf.fatal("Python development headers not found.") +def check_python_version(conf,minver=None): + assert minver is None or isinstance(minver,tuple) + python=conf.env['PYTHON'] + assert python,("python is %r !"%(python,)) + cmd=[python,"-c","import sys\nfor x in sys.version_info: print str(x)"] + debug('python: Running python command %r'%cmd) + 
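# Descriptive note (added): sys.version_info has five fields (major, minor, micro, releaselevel, serial); the helper prints one per line and they are parsed back into pyver_tuple below. +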
proc=pproc.Popen(cmd,stdout=pproc.PIPE) + lines=proc.communicate()[0].split() + assert len(lines)==5,"found %i lines, expected 5: %r"%(len(lines),lines) + pyver_tuple=(int(lines[0]),int(lines[1]),int(lines[2]),lines[3],int(lines[4])) + result=(minver is None)or(pyver_tuple>=minver) + if result: + pyver='.'.join([str(x)for x in pyver_tuple[:2]]) + conf.env['PYTHON_VERSION']=pyver + if'PYTHONDIR'in os.environ: + pydir=os.environ['PYTHONDIR'] + else: + if sys.platform=='win32': + (python_LIBDEST,)=_get_python_variables(python,["get_config_var('LIBDEST')"],['from distutils.sysconfig import get_config_var']) + else: + python_LIBDEST=None + if python_LIBDEST is None: + if conf.env['LIBDIR']: + python_LIBDEST=os.path.join(conf.env['LIBDIR'],"python"+pyver) + else: + python_LIBDEST=os.path.join(conf.env['PREFIX'],"lib","python"+pyver) + pydir=os.path.join(python_LIBDEST,"site-packages") + if hasattr(conf,'define'): + conf.define('PYTHONDIR',pydir) + conf.env['PYTHONDIR']=pydir + pyver_full='.'.join(map(str,pyver_tuple[:3])) + if minver is None: + conf.check_message_custom('Python version','',pyver_full) + else: + minver_str='.'.join(map(str,minver)) + conf.check_message('Python version',">= %s"%(minver_str,),result,option=pyver_full) + if not result: + conf.fatal("Python too old.") +def check_python_module(conf,module_name): + result=not pproc.Popen([conf.env['PYTHON'],"-c","import %s"%module_name],stderr=pproc.PIPE,stdout=pproc.PIPE).wait() + conf.check_message('Python module',module_name,result) + if not result: + conf.fatal("Python module not found.") +def detect(conf): + python=conf.find_program('python',var='PYTHON') + if not python:return + v=conf.env + v['PYCMD']='"import sys, py_compile;py_compile.compile(sys.argv[1], sys.argv[2])"' + v['PYFLAGS']='' + v['PYFLAGS_OPT']='-O' + v['PYC']=getattr(Options.options,'pyc',1) + v['PYO']=getattr(Options.options,'pyo',1) +def set_options(opt): + opt.add_option('--nopyc',action='store_false',default=1,help='no pyc files (configuration)',dest='pyc') + opt.add_option('--nopyo',action='store_false',default=1,help='no pyo files (configuration)',dest='pyo') + +taskgen(init_pyext) +before('apply_incpaths')(init_pyext) +feature('pyext')(init_pyext) +before('apply_bundle')(init_pyext) +taskgen(pyext_shlib_ext) +before('apply_link')(pyext_shlib_ext) +before('apply_lib_vars')(pyext_shlib_ext) +after('apply_bundle')(pyext_shlib_ext) +feature('pyext')(pyext_shlib_ext) +taskgen(init_pyembed) +before('apply_incpaths')(init_pyembed) +feature('pyembed')(init_pyembed) +extension(EXT_PY)(process_py) +conf(check_python_headers) +conf(check_python_version) +conf(check_python_module) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/preproc.py @@ -0,0 +1,477 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import re,sys,os,string,types +if __name__=='__main__': + sys.path=['.','..']+sys.path +import Logs,Build,Utils +from Logs import debug,error +import traceback +class PreprocError(Utils.WafError): + pass +go_absolute=0 +standard_includes=['/usr/include'] +if sys.platform=="win32": + standard_includes=[] +use_trigraphs=0 +'apply the trigraph rules first' +strict_quotes=0 +g_optrans={'not':'!','and':'&&','bitand':'&','and_eq':'&=','or':'||','bitor':'|','or_eq':'|=','xor':'^','xor_eq':'^=','compl':'~',} +re_lines=re.compile('^[ \t]*(#|%:)[ \t]*(ifdef|ifndef|if|else|elif|endif|include|import|define|undef|pragma)[ \t]*(.*)\r*$',re.IGNORECASE|re.MULTILINE) +re_mac=re.compile("^[a-zA-Z_]\w*") +re_fun=re.compile('^[a-zA-Z_][a-zA-Z0-9_]*[(]') +re_pragma_once=re.compile('^\s*once\s*',re.IGNORECASE) +re_nl=re.compile('\\\\\r*\n',re.MULTILINE) +re_cpp=re.compile(r"""(/\*[^*]*\*+([^/*][^*]*\*+)*/)|//[^\n]*|("(\\.|[^"\\])*"|'(\\.|[^'\\])*'|.[^/"'\\]*)""",re.MULTILINE) +trig_def=[('??'+a,b)for a,b in zip("=-/!'()<>",r'#~\|^[]{}')] +chr_esc={'0':0,'a':7,'b':8,'t':9,'n':10,'f':11,'v':12,'r':13,'\\':92,"'":39} +NUM='i' +OP='O' +IDENT='T' +STR='s' +CHAR='c' +tok_types=[NUM,STR,IDENT,OP] +exp_types=[r"""0[xX](?P[a-fA-F0-9]+)(?P[uUlL]*)|L*?'(?P(\\.|[^\\'])+)'|(?P\d+)[Ee](?P[+-]*?\d+)(?P[fFlL]*)|(?P\d*\.\d+)([Ee](?P[+-]*?\d+))?(?P[fFlL]*)|(?P\d+\.\d*)([Ee](?P[+-]*?\d+))?(?P[fFlL]*)|(?P0*)(?P\d+)(?P[uUlL]*)""",r'L?"([^"\\]|\\.)*"',r'[a-zA-Z_]\w*',r'%:%:|<<=|>>=|\.\.\.|<<|<%|<:|<=|>>|>=|\+\+|\+=|--|->|-=|\*=|/=|%:|%=|%>|==|&&|&=|\|\||\|=|\^=|:>|!=|##|[\(\)\{\}\[\]<>\?\|\^\*\+&=:!#;,%/\-\?\~\.]',] +re_clexer=re.compile('|'.join(["(?P<%s>%s)"%(name,part)for name,part in zip(tok_types,exp_types)]),re.M) +accepted='a' +ignored='i' +undefined='u' +skipped='s' +def repl(m): + s=m.group(1) + if s is not None:return' ' + s=m.group(3) + if s is None:return'' + return s +def filter_comments(filename): + f=open(filename,"r") + code=f.read() + f.close() + if use_trigraphs: + for(a,b)in trig_def:code=code.split(a).join(b) + code=re_nl.sub('',code) + code=re_cpp.sub(repl,code) + return[(m.group(2),m.group(3))for m in re.finditer(re_lines,code)] +prec={} +ops=['* / %','+ -','<< >>','< <= >= >','== !=','& | ^','&& ||',','] +for x in range(len(ops)): + syms=ops[x] + for u in syms.split(): + prec[u]=x +def reduce_nums(val_1,val_2,val_op): + try:a=0+val_1 + except TypeError:a=int(val_1) + try:b=0+val_2 + except TypeError:b=int(val_2) + d=val_op + if d=='%':c=a%b + elif d=='+':c=a+b + elif d=='-':c=a-b + elif d=='*':c=a*b + elif d=='/':c=a/b + elif d=='^':c=a^b + elif d=='|':c=a|b + elif d=='||':c=int(a or b) + elif d=='&':c=a&b + elif d=='&&':c=int(a and b) + elif d=='==':c=int(a==b) + elif d=='!=':c=int(a!=b) + elif d=='<=':c=int(a<=b) + elif d=='<':c=int(a':c=int(a>b) + elif d=='>=':c=int(a>=b) + elif d=='^':c=int(a^b) + elif d=='<<':c=a<>':c=a>>b + else:c=0 + return c +def get_expr(lst,defs,ban): + if not lst:return([],[],[]) + (p,v)=lst[0] + if p==NUM: + return(p,v,lst[1:]) + elif p==STR: + try: + (p2,v2)=lst[1] + if p2==STR:return(p,v+v2,lst[2:]) + except IndexError:pass + return(p,v,lst[1:]) + elif p==OP: + if v in['+','-','!','~','#']: + (p2,v2,lst2)=get_expr(lst[1:],defs,ban) + if v=='#': + if p2!=IDENT:raise PreprocError,"ident expected %s"%str(lst) + return get_expr([(STR,v2)]+lst2,defs,ban) + if p2!=NUM:raise PreprocError,"num expected %s"%str(lst) + if v=='+':return(p2,v2,lst2) + elif v=='-':return(p2,-int(v2),lst2) + elif v=='!':return(p2,int(not int(v2)),lst2) + elif 
v=='~':return(p2,~int(v2),lst2) + return(p2,v2,lst2) + elif v=='(': + count_par=0 + i=0 + for _,v in lst: + if v==')': + count_par-=1 + if count_par==0:break + elif v=='(':count_par+=1 + i+=1 + else: + raise PreprocError,"rparen expected %s"%str(lst) + ret=process_tokens(lst[1:i],defs,ban) + if len(ret)==1: + (p,v)=ret[0] + return(p,v,lst[i+1:]) + else: + raise PreprocError,"cannot reduce %s"%str(lst) + elif p==IDENT: + if len(lst)>1: + (p2,v2)=lst[1] + if v2=="##": + (p3,v3)=lst[2] + if p3!=IDENT and p3!=NUM and p3!=OP: + raise PreprocError,"%s: ident expected after '##'"%str(lst) + return get_expr([(p,v+v3)]+lst[3:],defs,ban) + if v.lower()=='defined': + (p2,v2)=lst[1] + off=2 + if v2=='(': + (p3,v3)=lst[2] + if p3!=IDENT:raise PreprocError,'expected an identifier after a "defined("' + (p2,v2)=lst[3] + if v2!=')':raise PreprocError,'expected a ")" after a "defined(x"' + off=4 + elif p2!=IDENT: + raise PreprocError,'expected a "(" or an identifier after a defined' + x=0 + if v2 in defs:x=1 + return(NUM,x,lst[off:]) + elif not v in defs or v in ban: + if"waf_include"in ban:return(p,v,lst[1:]) + else:return(NUM,0,lst[1:]) + if type(defs[v])is types.StringType: + v,k=extract_macro(defs[v]) + defs[v]=k + macro_def=defs[v] + if not macro_def[0]: + lst=macro_def[1]+lst[1:] + return get_expr(lst,defs,ban) + else: + params=[] + i=1 + p2,v2=lst[i] + if p2!=OP or v2!='(':raise PreprocError,"invalid function call '%s'"%v + one_param=[] + count_paren=0 + try: + while 1: + i+=1 + p2,v2=lst[i] + if p2==OP and count_paren==0: + if v2=='(': + one_param.append((p2,v2)) + count_paren+=1 + elif v2==')': + if one_param:params.append(one_param) + lst=lst[i+1:] + break + elif v2==',': + if not one_param:raise PreprocError,"empty param in funcall %s"%p + params.append(one_param) + one_param=[] + else: + one_param.append((p2,v2)) + else: + one_param.append((p2,v2)) + if v2=='(':count_paren+=1 + elif v2==')':count_paren-=1 + except IndexError,e: + raise + accu=[] + table=macro_def[0] + for p2,v2 in macro_def[1]: + if p2==IDENT and v2 in table:accu+=params[table[v2]] + else: + if v2=='__VA_ARGS__': + va_toks=[] + st=len(macro_def[0]) + pt=len(params) + for x in params[pt-st+1:]: + va_toks.extend(x) + va_toks.append((OP,',')) + if va_toks:va_toks.pop() + if len(accu)>1: + (p3,v3)=accu[-1] + (p4,v4)=accu[-2] + if v3=='##': + accu.pop() + if v4==','and pt30000:raise PreprocError,"recursion limit exceeded, bailing out" + pc=self.parse_cache + debug('preproc: reading file %r'%filepath) + try: + lns=pc[filepath] + except KeyError: + pass + else: + self.lines=lns+self.lines + return + try: + lines=filter_comments(filepath) + pc[filepath]=lines + self.lines=lines+self.lines + except IOError: + raise PreprocError,"could not read the file %s"%filepath + except Exception: + if Logs.verbose>0: + error("parsing %s failed"%filepath) + traceback.print_exc() + def start(self,node,env): + debug('preproc: scanning %s (in %s)'%(node.name,node.parent.name)) + self.env=env + variant=node.variant(env) + self.addlines(node.abspath(env)) + if env['DEFLINES']: + self.lines=[('define',x)for x in env['DEFLINES']]+self.lines + while self.lines: + (type,line)=self.lines.pop(0) + try: + self.process_line(type,line) + except Exception,ex: + if Logs.verbose: + error("line parsing failed (%s): %s"%(str(ex),line)) + traceback.print_exc() + def process_line(self,token,line): + ve=Logs.verbose + if ve:debug('preproc: line is %s - %s state is %s'%(token,line,self.state)) + state=self.state + if token in['ifdef','ifndef','if']: + 
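# Descriptive note on the conditional-compilation state machine used below: every
# #if/#ifdef/#ifndef pushes 'undefined' onto the state stack and #endif pops it;
# evaluating the directive then turns the top entry into 'accepted' (process this
# branch) or 'ignored' (skip until #elif/#else), while 'skipped' marks a branch
# whose group already had an accepted part, so later #elif/#else arms are ignored too.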
state.append(undefined) + elif token=='endif': + state.pop() + if not token in['else','elif','endif']: + if skipped in self.state or ignored in self.state: + return + if token=='if': + ret=eval_macro(tokenize(line),self.defs) + if ret:state[-1]=accepted + else:state[-1]=ignored + elif token=='ifdef': + m=re_mac.search(line) + if m and m.group(0)in self.defs:state[-1]=accepted + else:state[-1]=ignored + elif token=='ifndef': + m=re_mac.search(line) + if m and m.group(0)in self.defs:state[-1]=ignored + else:state[-1]=accepted + elif token=='include'or token=='import': + (type,inc)=extract_include(line,self.defs) + if inc in self.ban_includes:return + if token=='import':self.ban_includes.append(inc) + if ve:debug('preproc: include found %s (%s) '%(inc,type)) + if type=='"'or not strict_quotes: + if not inc in self.deps: + self.deps.append(inc) + self.tryfind(inc) + elif token=='elif': + if state[-1]==accepted: + state[-1]=skipped + elif state[-1]==ignored: + if eval_macro(tokenize(line),self.defs): + state[-1]=accepted + elif token=='else': + if state[-1]==accepted:state[-1]=skipped + elif state[-1]==ignored:state[-1]=accepted + elif token=='define': + m=re_mac.search(line) + if m: + name=m.group(0) + if ve:debug('preproc: define %s %s'%(name,line)) + self.defs[name]=line + else: + raise PreprocError,"invalid define line %s"%line + elif token=='undef': + m=re_mac.search(line) + if m and m.group(0)in self.defs: + self.defs.__delitem__(m.group(0)) + elif token=='pragma': + if re_pragma_once.search(line.lower()): + self.ban_includes.append(self.curfile) +def extract_macro(txt): + t=tokenize(txt) + if re_fun.search(txt): + p,name=t[0] + p,v=t[1] + if p!=OP:raise PreprocError,"expected open parenthesis" + i=1 + pindex=0 + params={} + wantident=1 + while 1: + i+=1 + p,v=t[i] + if wantident: + if p==IDENT: + params[v]=pindex + pindex+=1 + elif v=='...': + pass + else: + raise PreprocError,"expected ident" + else: + if v==',': + pass + elif v==')': + break + elif v=='...': + raise PreprocError,"not implemented" + wantident=not wantident + return(name,[params,t[i+1:]]) + else: + (p,v)=t[0] + return(v,[[],t[1:]]) +re_include=re.compile('^\s*(<(?P.*)>|"(?P.*)")') +def extract_include(txt,defs): + m=re_include.search(txt) + if m: + if m.group('a'):return'<',m.group('a') + if m.group('b'):return'"',m.group('b') + tokens=process_tokens(tokens,defs,['waf_include']) + p,v=tokens[0] + if p!=STR:raise PreprocError,"could not parse include %s"%txt + return('"',v) +def parse_char(txt): + if not txt:raise PreprocError,"attempted to parse a null char" + if txt[0]!='\\': + return ord(txt) + c=txt[1] + if c=='x': + if len(txt)==4 and txt[3]in string.hexdigits:return int(txt[2:],16) + return int(txt[2:],16) + elif c.isdigit(): + if c=='0'and len(txt)==2:return 0 + for i in 3,2,1: + if len(txt)>i and txt[1:1+i].isdigit(): + return(1+i,int(txt[1:1+i],8)) + else: + try:return chr_esc[c] + except KeyError:raise PreprocError,"could not parse char literal '%s'"%txt +def tokenize(s): + ret=[] + for match in re_clexer.finditer(s): + m=match.group + for name in tok_types: + v=m(name) + if v: + if name==IDENT: + try:v=g_optrans[v];name=OP + except KeyError: + if v.lower()=="true": + v=1 + name=NUM + elif v.lower()=="false": + v=0 + name=NUM + elif name==NUM: + if m('oct'):v=int(v,8) + elif m('hex'):v=int(m('hex'),16) + elif m('n0'):v=m('n0') + else: + v=m('char') + if v:v=parse_char(v) + else:v=m('n2')or m('n4') + elif name==OP: + if v=='%:':v='#' + elif v=='%:%:':v='##' + ret.append((name,v)) + break + return ret + --- 
/dev/null +++ radare-1.5.2/wafadmin/Tools/gdc.py @@ -0,0 +1,49 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import sys +import ar +def find_gdc(conf): + v=conf.env + d_compiler=None + if v['D_COMPILER']: + d_compiler=v['D_COMPILER'] + if not d_compiler:d_compiler=conf.find_program('gdc',var='D_COMPILER') + if not d_compiler:return 0 + v['D_COMPILER']=d_compiler +def common_flags(conf): + v=conf.env + v['DFLAGS']={'gdc':[],'dmd':[]} + v['D_SRC_F']='' + v['D_TGT_F']='-c -o ' + v['DPATH_ST']='-I%s' + v['D_LINKER']=v['D_COMPILER'] + v['DLNK_SRC_F']='' + v['DLNK_TGT_F']='-o ' + v['DLIB_ST']='-l%s' + v['DLIBPATH_ST']='-L%s' + v['DLINKFLAGS']=[] + v['DFLAGS_OPTIMIZED']=['-O3'] + v['DFLAGS_DEBUG']=['-O0'] + v['DFLAGS_ULTRADEBUG']=['-O0'] + v['D_shlib_DFLAGS']=[] + v['D_shlib_LINKFLAGS']=['-shared'] + v['DHEADER_ext']='.di' + v['D_HDR_F']='-fintfc -fintfc-file=' + if sys.platform=="win32": + v['D_program_PATTERN']='%s.exe' + v['D_shlib_PATTERN']='lib%s.dll' + v['D_staticlib_PATTERN']='lib%s.a' + else: + v['D_program_PATTERN']='%s' + v['D_shlib_PATTERN']='lib%s.so' + v['D_staticlib_PATTERN']='lib%s.a' +def detect(conf): + v=conf.env + find_gdc(conf) + ar.find_ar(conf) + conf.check_tool('d') + common_flags(conf) +def set_options(opt): + pass + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/msvc.py @@ -0,0 +1,314 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,sys,re,string,optparse +import Utils,TaskGen,Runner,Configure,Task,Options +from Logs import debug,error,warn +from Utils import quote_whitespace +from TaskGen import taskgen,after,before,feature +from Configure import conftest +import ccroot +from libtool import read_la_file +from os.path import exists +def msvc_linker(task): + e=task.env + linker=e['LINK'] + srcf=e['LINK_SRC_F'] + trgtf=e['LINK_TGT_F'] + linkflags=e.get_flat('LINKFLAGS') + libdirs=e.get_flat('_LIBDIRFLAGS') + libs=e.get_flat('_LIBFLAGS') + subsystem='' + if task.subsystem: + subsystem='/subsystem:%s'%task.subsystem + outfile=task.outputs[0].bldpath(e) + manifest=outfile+'.manifest' + pdbnode=task.outputs[0].change_ext('.pdb') + pdbfile=pdbnode.bldpath(e) + objs=" ".join(['"%s"'%a.abspath(e)for a in task.inputs]) + cmd="%s %s %s%s %s%s %s %s %s"%(linker,subsystem,srcf,objs,trgtf,outfile,linkflags,libdirs,libs) + ret=Runner.exec_command(cmd,shell=False) + if ret:return ret + if os.path.exists(pdbfile): + task.outputs.append(pdbnode) + if os.path.exists(manifest): + debug('msvc: manifesttool') + mtool=e['MT'] + if not mtool: + return 0 + mode='' + if task.type=='program': + mode='1' + elif task.type=='shlib': + mode='2' + debug('msvc: embedding manifest') + flags=e['MTFLAGS'] + if flags: + flags=string.join(flags,' ') + else: + flags='' + cmd='%s %s -manifest "%s" -outputresource:"%s";#%s'%(mtool,flags,manifest,outfile,mode) + ret=Runner.exec_command(cmd) + return ret +g_msvc_systemlibs=""" +aclui activeds ad1 adptif adsiid advapi32 asycfilt authz bhsupp bits bufferoverflowu cabinet +cap certadm certidl ciuuid clusapi comctl32 comdlg32 comsupp comsuppd comsuppw comsuppwd comsvcs +credui crypt32 cryptnet cryptui d3d8thk daouuid dbgeng dbghelp dciman32 ddao35 ddao35d +ddao35u ddao35ud delayimp dhcpcsvc dhcpsapi dlcapi dnsapi dsprop dsuiext dtchelp +faultrep fcachdll fci fdi framedyd framedyn gdi32 gdiplus glauxglu32 gpedit gpmuuid +gtrts32w gtrtst32hlink htmlhelp httpapi icm32 icmui imagehlp imm32 iphlpapi iprop +kernel32 ksguid ksproxy ksuser libcmt libcmtd libcpmt libcpmtd loadperf lz32 mapi +mapi32 mgmtapi minidump mmc mobsync mpr mprapi mqoa mqrt msacm32 mscms 
mscoree +msdasc msimg32 msrating mstask msvcmrt msvcurt msvcurtd mswsock msxml2 mtx mtxdm +netapi32 nmapinmsupp npptools ntdsapi ntdsbcli ntmsapi ntquery odbc32 odbcbcp +odbccp32 oldnames ole32 oleacc oleaut32 oledb oledlgolepro32 opends60 opengl32 +osptk parser pdh penter pgobootrun pgort powrprof psapi ptrustm ptrustmd ptrustu +ptrustud qosname rasapi32 rasdlg rassapi resutils riched20 rpcndr rpcns4 rpcrt4 rtm +rtutils runtmchk scarddlg scrnsave scrnsavw secur32 sensapi setupapi sfc shell32 +shfolder shlwapi sisbkup snmpapi sporder srclient sti strsafe svcguid tapi32 thunk32 +traffic unicows url urlmon user32 userenv usp10 uuid uxtheme vcomp vcompd vdmdbg +version vfw32 wbemuuid webpost wiaguid wininet winmm winscard winspool winstrm +wintrust wldap32 wmiutils wow32 ws2_32 wsnmp32 wsock32 wst wtsapi32 xaswitch xolehlp +""".split() +def find_lt_names_msvc(self,libname,is_static=False): + lt_names=['lib%s.la'%libname,'%s.la'%libname,] + for path in self.libpaths: + for la in lt_names: + laf=os.path.join(path,la) + dll=None + if exists(laf): + ltdict=read_la_file(laf) + lt_libdir=None + if ltdict.has_key('libdir')and ltdict['libdir']!='': + lt_libdir=ltdict['libdir'] + if not is_static and ltdict.has_key('library_names')and ltdict['library_names']!='': + dllnames=ltdict['library_names'].split() + dll=dllnames[0].lower() + dll=re.sub('\.dll$','',dll) + return(lt_libdir,dll,False) + elif ltdict.has_key('old_library')and ltdict['old_library']!='': + olib=ltdict['old_library'] + if exists(os.path.join(path,olib)): + return(path,olib,True) + elif lt_libdir!=''and exists(os.path.join(lt_libdir,olib)): + return(lt_libdir,olib,True) + else: + return(None,olib,True) + else: + raise Utils.WafError('invalid libtool object file: %s'%laf) + return(None,None,None) +def libname_msvc(self,libname,is_static=False): + lib=libname.lower() + lib=re.sub('\.lib$','',lib) + if lib in g_msvc_systemlibs: + return lib+'.lib' + lib=re.sub('^lib','',lib) + if lib=='m': + return None + (lt_path,lt_libname,lt_static)=find_lt_names_msvc(self,lib,is_static) + if lt_path!=None and lt_libname!=None: + if lt_static==True: + return os.path.join(lt_path,lt_libname) + if lt_path!=None: + _libpaths=[lt_path]+self.libpaths + else: + _libpaths=self.libpaths + static_libs=['%ss.lib'%lib,'lib%ss.lib'%lib,'%s.lib'%lib,'lib%s.lib'%lib,] + dynamic_libs=['lib%s.dll.lib'%lib,'lib%s.dll.a'%lib,'%s.dll.lib'%lib,'%s.dll.a'%lib,'lib%s_d.lib'%lib,'%s_d.lib'%lib,'%s.lib'%lib,] + libnames=static_libs + if not is_static: + libnames=dynamic_libs+static_libs + for path in _libpaths: + for libn in libnames: + if os.path.exists(os.path.join(path,libn)): + debug('msvc: lib found: %s'%os.path.join(path,libn)) + return libn + return None +def apply_msvc_obj_vars(self): + env=self.env + app=env.append_unique + cpppath_st=env['CPPPATH_ST'] + lib_st=env['LIB_ST'] + staticlib_st=env['STATICLIB_ST'] + libpath_st=env['LIBPATH_ST'] + staticlibpath_st=env['STATICLIBPATH_ST'] + self.libpaths=[] + for i in env['RPATH']:app('LINKFLAGS',i) + for i in env['LIBPATH']: + app('LINKFLAGS',libpath_st%i) + if not self.libpaths.count(i): + self.libpaths.append(i) + for i in env['LIBPATH']: + app('LINKFLAGS',staticlibpath_st%i) + if not self.libpaths.count(i): + self.libpaths.append(i) + if not env['FULLSTATIC']: + if env['STATICLIB']or env['LIB']: + app('LINKFLAGS',env['SHLIB_MARKER']) + if env['STATICLIB']: + app('LINKFLAGS',env['STATICLIB_MARKER']) + for i in env['STATICLIB']: + debug('msvc: libname: %s'%i) + libname=libname_msvc(self,i,True) + debug('msvc: 
libnamefixed: %s'%libname) + if libname!=None: + app('LINKFLAGS',libname) + if self.env['LIB']: + for i in env['LIB']: + debug('msvc: libname: %s'%i) + libname=libname_msvc(self,i) + debug('msvc: libnamefixed: %s'%libname) + if libname!=None: + app('LINKFLAGS',libname) +def apply_link_msvc(self): + if self.type=='objects': + self.out_nodes=[] + app=self.out_nodes.append + for t in self.compiled_tasks:app(t.outputs[0]) + return + link=getattr(self,'link',None) + if not link: + if self.type=='staticlib':link='msvc_ar_link_static' + elif'cxx'in self.features:link='msvc_cxx_link' + else:link='msvc_cc_link' + linktask=self.create_task(link) + outputs=[t.outputs[0]for t in self.compiled_tasks] + linktask.set_inputs(outputs) + linktask.set_outputs(self.path.find_or_declare(ccroot.get_target_name(self))) + linktask.type=self.type + linktask.subsystem=getattr(self,'subsystem','') + self.link_task=linktask +def init_msvc(self): + if self.env['CC_NAME']=='msvc'or self.env['CXX_NAME']=='msvc': + self.meths.remove('apply_link') + else: + for x in['apply_link_msvc','apply_msvc_obj_vars']: + self.meths.remove(x) + self.libpaths=getattr(self,'libpaths','') +static_link_str='${STLIBLINK} ${LINK_SRC_F}${SRC} ${LINK_TGT_F}${TGT}' +Task.simple_task_type('msvc_ar_link_static',static_link_str,color='YELLOW',ext_in='.o') +Task.task_type_from_func('msvc_cc_link',vars=['LINK','LINK_SRC_F','LINK_TGT_F','LINKFLAGS','_LIBDIRFLAGS','_LIBFLAGS','MT','MTFLAGS'],color='YELLOW',func=msvc_linker,ext_in='.o') +Task.task_type_from_func('msvc_cxx_link',vars=['LINK','LINK_SRC_F','LINK_TGT_F','LINKFLAGS','_LIBDIRFLAGS','_LIBFLAGS','MT','MTFLAGS'],color='YELLOW',func=msvc_linker,ext_in='.o') +rc_str='${RC} ${RCFLAGS} /fo ${TGT} ${SRC}' +Task.simple_task_type('rc',rc_str,color='GREEN',before='cc cxx') +import winres +detect=''' +find_msvc +msvc_common_flags +cc_load_tools +cxx_load_tools +cc_add_flags +cxx_add_flags +''' +def find_msvc(conf): + if sys.platform!='win32': + conf.fatal('MSVC module only works under native Win32 Python! cygwin is not supported yet') + v=conf.env + cxx=None + if v['CXX']:cxx=v['CXX'] + elif'CXX'in os.environ:cxx=os.environ['CXX'] + if not cxx:cxx=conf.find_program('CL',var='CXX') + if not cxx:conf.fatal('CL was not found (compiler)') + v['CXX']=quote_whitespace(cxx) + v['CC']=v['CXX'] + v['CXX_NAME']='msvc' + v['CC_NAME']='msvc' + if not v['LINK_CXX']: + link=conf.find_program('LINK') + if link:v['LINK_CXX']=link + else:conf.fatal('LINK was not found (linker)') + v['LINK']='\"%s\"'%link + if not v['LINK_CC']:v['LINK_CC']=v['LINK_CXX'] + if not v['STLIBLINK']: + stliblink=conf.find_program('LIB') + if not stliblink:return + v['STLIBLINK']='\"%s\"'%stliblink + manifesttool=conf.find_program('MT') + if manifesttool: + v['MT']=quote_whitespace(manifesttool) + v['MTFLAGS']=['/NOLOGO'] + conf.check_tool('winres') + if not conf.env['WINRC']: + warn('Resource compiler not found. 
Compiling resource file is disabled') +def msvc_common_flags(conf): + v=conf.env + v['CPPFLAGS']=['/W3','/nologo','/EHsc','/errorReport:prompt'] + v['CCDEFINES']=['WIN32'] + v['CXXDEFINES']=['WIN32'] + v['_CCINCFLAGS']=[] + v['_CCDEFFLAGS']=[] + v['_CXXINCFLAGS']=[] + v['_CXXDEFFLAGS']=[] + v['CC_SRC_F']='' + v['CC_TGT_F']='/c /Fo' + v['CXX_SRC_F']='' + v['CXX_TGT_F']='/c /Fo' + v['CPPPATH_ST']='/I%s' + v['CPPFLAGS_CONSOLE']=['/SUBSYSTEM:CONSOLE'] + v['CPPFLAGS_NATIVE']=['/SUBSYSTEM:NATIVE'] + v['CPPFLAGS_POSIX']=['/SUBSYSTEM:POSIX'] + v['CPPFLAGS_WINDOWS']=['/SUBSYSTEM:WINDOWS'] + v['CPPFLAGS_WINDOWSCE']=['/SUBSYSTEM:WINDOWSCE'] + v['CPPFLAGS_CRT_MULTITHREADED']=['/MT'] + v['CPPFLAGS_CRT_MULTITHREADED_DLL']=['/MD'] + v['CPPDEFINES_CRT_MULTITHREADED']=['_MT'] + v['CPPDEFINES_CRT_MULTITHREADED_DLL']=['_MT','_DLL'] + v['CPPFLAGS_CRT_MULTITHREADED_DBG']=['/MTd'] + v['CPPFLAGS_CRT_MULTITHREADED_DLL_DBG']=['/MDd'] + v['CPPDEFINES_CRT_MULTITHREADED_DBG']=['_DEBUG','_MT'] + v['CPPDEFINES_CRT_MULTITHREADED_DLL_DBG']=['_DEBUG','_MT','_DLL'] + v['CCFLAGS']=['/TC'] + v['CCFLAGS_OPTIMIZED']=['/O2','/DNDEBUG'] + v['CCFLAGS_RELEASE']=['/O2','/DNDEBUG'] + v['CCFLAGS_DEBUG']=['/Od','/RTC1','/D_DEBUG','/ZI'] + v['CCFLAGS_ULTRADEBUG']=['/Od','/RTC1','/D_DEBUG','/ZI'] + v['CXXFLAGS']=['/TP'] + v['CXXFLAGS_OPTIMIZED']=['/O2','/DNDEBUG'] + v['CXXFLAGS_RELEASE']=['/O2','/DNDEBUG'] + v['CXXFLAGS_DEBUG']=['/Od','/RTC1','/D_DEBUG','/ZI'] + v['CXXFLAGS_ULTRADEBUG']=['/Od','/RTC1','/D_DEBUG','/ZI'] + v['LIB']=[] + v['LINK_TGT_F']='/OUT:' + v['LINK_SRC_F']=' ' + v['LIB_ST']='%s.lib' + v['LIBPATH_ST']='/LIBPATH:%s' + v['STATICLIB_ST']='%s.lib' + v['STATICLIBPATH_ST']='/LIBPATH:%s' + v['CCDEFINES_ST']='/D%s' + v['CXXDEFINES_ST']='/D%s' + v['_LIBDIRFLAGS']='' + v['_LIBFLAGS']='' + v['SHLIB_MARKER']='' + v['STATICLIB_MARKER']='' + v['LINKFLAGS']=['/NOLOGO','/MACHINE:X86','/ERRORREPORT:PROMPT'] + try: + debug_level=Options.options.debug_level.upper() + except AttributeError: + debug_level=ccroot.DEBUG_LEVELS.CUSTOM + v['CCFLAGS']+=v['CCFLAGS_'+debug_level] + v['CXXFLAGS']+=v['CXXFLAGS_'+debug_level] + v['LINKFLAGS']+=v['LINKFLAGS_'+debug_level] + v['shlib_CCFLAGS']=[''] + v['shlib_CXXFLAGS']=[''] + v['shlib_LINKFLAGS']=['/DLL'] + v['shlib_PATTERN']='%s.dll' + v['staticlib_LINKFLAGS']=[''] + v['staticlib_PATTERN']='%s.lib' + v['program_PATTERN']='%s.exe' +def set_options(opt): + opt.add_option('-d','--debug-level',action='store',default=ccroot.DEBUG_LEVELS.DEBUG,help="Specify the debug level, does nothing if CFLAGS is set in the environment. [Allowed Values: '%s']"%"', '".join(ccroot.DEBUG_LEVELS.ALL),choices=ccroot.DEBUG_LEVELS.ALL,dest='debug_level') + +taskgen(apply_msvc_obj_vars) +feature('cc','cxx')(apply_msvc_obj_vars) +after('apply_obj_vars_cc')(apply_msvc_obj_vars) +after('apply_obj_vars_cxx')(apply_msvc_obj_vars) +taskgen(apply_link_msvc) +feature('cc','cxx')(apply_link_msvc) +after('apply_core')(apply_link_msvc) +before('apply_obj_vars_cc')(apply_link_msvc) +before('apply_obj_vars_cxx')(apply_link_msvc) +taskgen(init_msvc) +feature('cc','cxx')(init_msvc) +before('apply_core')(init_msvc) +conftest(find_msvc) +conftest(msvc_common_flags) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/d.py @@ -0,0 +1,351 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,sys,re,optparse +import ccroot +import TaskGen,Utils,Task,checks,Configure,Logs,Build +from Logs import debug,error +from TaskGen import taskgen,feature,after,before,extension +EXT_D=['.d','.di','.D'] +D_METHS=['apply_core','apply_vnum','apply_objdeps'] +def filter_comments(filename): + f=open(filename,'r') + txt=f.read() + f.close() + buf=[] + i=0 + max=len(txt) + while i1:self.type=k[1] + else:self.type='' + if self.type: + self.features.append('d'+self.type) + self.dflags={'gdc':'','dmd':''} + self.importpaths='' + self.libs='' + self.libpaths='' + self.uselib='' + self.uselib_local='' + self.generate_headers=False + self.compiled_tasks=[] + self.add_objects=[] + self.vnum='1.0.0' +TaskGen.bind_feature('d',D_METHS) +def apply_d_libs(self): + uselib=self.to_list(self.uselib) + seen=[] + local_libs=self.to_list(self.uselib_local) + libs=[] + libpaths=[] + env=self.env + while local_libs: + x=local_libs.pop() + if x in seen: + continue + else: + seen.append(x) + y=TaskGen.name_to_obj(x) + if not y: + raise Utils.WafError('object not found in uselib_local: obj %s uselib %s'%(self.name,x)) + if y.uselib_local: + added=0 + lst=y.to_list(y.uselib_local) + lst.reverse() + for u in lst: + if u in seen:continue + added=1 + local_libs=[u]+local_libs + if added:continue + y.post() + seen.append(x) + if'dshlib'in y.features or'dstaticlib'in y.features: + libs.append(y.target) + tmp_path=y.path.bldpath(env) + if not tmp_path in libpaths:libpaths=[tmp_path]+libpaths + if y.link_task is not None: + self.link_task.set_run_after(y.link_task) + dep_nodes=getattr(self.link_task,'dep_nodes',[]) + self.link_task.dep_nodes=dep_nodes+y.link_task.outputs + morelibs=y.to_list(y.uselib) + for v in morelibs: + if v in uselib:continue + uselib=[v]+uselib + self.uselib=uselib +def apply_d_link(self): + link=getattr(self,'link',None) + if not link: + if'dstaticlib'in self.features:link='ar_link_static' + else:link='d_link' + linktask=self.create_task(link) + outputs=[t.outputs[0]for t in self.compiled_tasks] + linktask.set_inputs(outputs) + linktask.set_outputs(self.path.find_or_declare(get_target_name(self))) + self.link_task=linktask +def apply_d_vars(self): + env=self.env + dpath_st=env['DPATH_ST'] + lib_st=env['DLIB_ST'] + libpath_st=env['DLIBPATH_ST'] + dflags={'gdc':[],'dmd':[]} + importpaths=self.to_list(self.importpaths) + libpaths=[] + libs=[] + uselib=self.to_list(self.uselib) + for i in uselib: + if env['DFLAGS_'+i]: + for dflag in self.to_list(env['DFLAGS_'+i][env['COMPILER_D']]): + if not dflag in dflags[env['COMPILER_D']]: + dflags[env['COMPILER_D']]+=[dflag] + dflags[env['COMPILER_D']]=self.to_list(self.dflags[env['COMPILER_D']])+dflags[env['COMPILER_D']] + for dflag in dflags[env['COMPILER_D']]: + if not dflag in env['DFLAGS'][env['COMPILER_D']]: + env['DFLAGS'][env['COMPILER_D']]+=[dflag] + d_shlib_dflags=env['D_'+self.type+'_DFLAGS'] + if d_shlib_dflags: + for dflag in d_shlib_dflags: + if not dflag in env['DFLAGS'][env['COMPILER_D']]: + env['DFLAGS'][env['COMPILER_D']]+=[dflag] + env['_DFLAGS']=env['DFLAGS'][env['COMPILER_D']] + for i in uselib: + if env['DPATH_'+i]: + for entry in self.to_list(env['DPATH_'+i]): + if not entry in importpaths: + importpaths.append(entry) + for path in importpaths: + if os.path.isabs(path): + env.append_unique('_DIMPORTFLAGS',dpath_st%path) + else: + node=self.path.find_dir(path) + self.env.append_unique('INC_PATHS',node) + env.append_unique('_DIMPORTFLAGS',dpath_st%node.srcpath(env)) + 
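# How the *_ST templates set up in gdc.py expand here (the path and library names
# are hypothetical): with DPATH_ST='-I%s', DLIBPATH_ST='-L%s' and DLIB_ST='-l%s',
#     '-I%s' % 'src/d'    -> '-Isrc/d'     (collected in _DIMPORTFLAGS)
#     '-L%s' % 'build'    -> '-Lbuild'     (collected in _DLIBDIRFLAGS)
#     '-l%s' % 'phobos'   -> '-lphobos'    (collected in _DLIBFLAGS)
# Relative import paths are expanded twice, once against the source tree and once
# against the build tree, which is why both srcpath() and bldpath() appear here.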
env.append_unique('_DIMPORTFLAGS',dpath_st%node.bldpath(env)) + for i in uselib: + if env['LIBPATH_'+i]: + for entry in self.to_list(env['LIBPATH_'+i]): + if not entry in libpaths: + libpaths+=[entry] + libpaths=self.to_list(self.libpaths)+libpaths + for path in libpaths: + env.append_unique('_DLIBDIRFLAGS',libpath_st%path) + for i in uselib: + if env['LIB_'+i]: + for entry in self.to_list(env['LIB_'+i]): + if not entry in libs: + libs+=[entry] + libs=libs+self.to_list(self.libs) + for lib in libs: + env.append_unique('_DLIBFLAGS',lib_st%lib) + for i in uselib: + dlinkflags=env['DLINKFLAGS_'+i] + if dlinkflags: + for linkflag in dlinkflags: + env.append_unique('DLINKFLAGS',linkflag) +def add_shlib_d_flags(self): + for linkflag in self.env['D_shlib_LINKFLAGS']: + self.env.append_unique('DLINKFLAGS',linkflag) +def d_hook(self,node): + task=self.create_task(self.generate_headers and'd_with_header'or'd') + try:obj_ext=self.obj_ext + except AttributeError:obj_ext='_%d.o'%self.idx + task.inputs=[node] + task.outputs=[node.change_ext(obj_ext)] + self.compiled_tasks.append(task) + if self.generate_headers: + header_node=node.change_ext(self.env['DHEADER_ext']) + task.outputs+=[header_node] +d_str='${D_COMPILER} ${_DFLAGS} ${_DIMPORTFLAGS} ${D_SRC_F}${SRC} ${D_TGT_F}${TGT}' +d_with_header_str='${D_COMPILER} ${_DFLAGS} ${_DIMPORTFLAGS} \ +${D_HDR_F}${TGT[1].bldpath(env)} \ +${D_SRC_F}${SRC} \ +${D_TGT_F}${TGT[0].bldpath(env)}' +link_str='${D_LINKER} ${DLNK_SRC_F}${SRC} ${DLNK_TGT_F}${TGT} ${DLINKFLAGS} ${_DLIBDIRFLAGS} ${_DLIBFLAGS}' +cls=Task.simple_task_type('d',d_str,'GREEN') +cls.scan=scan +Task.simple_task_type('d_with_header',d_with_header_str,'GREEN') +Task.simple_task_type('d_link',link_str,color='YELLOW') +def generate_header(self,filename,install_path): + if not hasattr(self,'header_lst'):self.header_lst=[] + self.meths.append('process_header') + self.header_lst.append([filename,install_path]) +def process_header(self): + env=self.env + for i in getattr(self,'header_lst',[]): + node=self.path.find_resource(i[0]) + if not node: + raise Utils.WafError('file not found on d obj '+i[0]) + task=self.create_task('d_header') + task.set_inputs(node) + task.set_outputs(node.change_ext('.di')) +d_header_str='${D_COMPILER} ${D_HEADER} ${SRC}' +Task.simple_task_type('d_header',d_header_str,color='BLUE') + +taskgen(init_d) +before('apply_type_vars')(init_d) +feature('d')(init_d) +taskgen(apply_d_libs) +feature('d')(apply_d_libs) +after('apply_d_link')(apply_d_libs) +before('apply_vnum')(apply_d_libs) +taskgen(apply_d_link) +feature('dprogram','dshlib','dstaticlib')(apply_d_link) +after('apply_core')(apply_d_link) +taskgen(apply_d_vars) +feature('d')(apply_d_vars) +after('apply_core')(apply_d_vars) +taskgen(add_shlib_d_flags) +after('apply_d_vars')(add_shlib_d_flags) +feature('dshlib')(add_shlib_d_flags) +extension(EXT_D)(d_hook) +taskgen(generate_header) +taskgen(process_header) +before('apply_core')(process_header) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/osx.py @@ -0,0 +1,105 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,shutil,sys +import TaskGen,Task,Build,Options +from TaskGen import taskgen,feature,after,before +from Logs import error,debug +def create_task_macapp(self): + if self.type=='program'and self.link_task: + apptask=self.create_task('macapp',self.env) + apptask.set_inputs(self.link_task.outputs) + apptask.set_outputs(self.link_task.outputs[0].change_ext('.app')) + self.apptask=apptask +def apply_link_osx(self): + if self.env['MACAPP']or getattr(self,'mac_app',False): + self.create_task_macapp() + name=self.link_task.outputs[0].name + if self.vnum:name=name.replace('.dylib','.%s.dylib'%self.vnum) + path=os.path.join(self.env['PREFIX'],lib,name) + path='-install_name %s'%path + self.env.append_value('LINKFLAGS',path) +def apply_bundle(self): + if not'shlib'in self.features:return + if self.env['MACBUNDLE']or getattr(self,'mac_bundle',False): + self.env['shlib_PATTERN']='%s.bundle' + uselib=self.to_list(self.uselib) + if not'MACBUNDLE'in uselib:uselib.append('MACBUNDLE') +def apply_bundle_remove_dynamiclib(self): + if not'shlib'in self.features:return + if self.env['MACBUNDLE']or getattr(self,'mac_bundle',False): + self.env["LINKFLAGS"].remove("-dynamiclib") + self.env.append_value("LINKFLAGS","-bundle") +app_dirs=['Contents',os.path.join('Contents','MacOS'),os.path.join('Contents','Resources')] +app_info=''' + + + + + CFBundlePackageType + APPL + CFBundleGetInfoString + Created by Waf + CFBundleSignature + ???? + NOTE + THIS IS A GENERATED FILE, DO NOT MODIFY + CFBundleExecutable + %s + + +''' +def app_build(task): + global app_dirs + env=task.env + i=0 + for p in task.outputs: + srcfile=p.srcpath(env) + debug('osx: creating directories') + try: + os.mkdir(srcfile) + [os.makedirs(os.path.join(srcfile,d))for d in app_dirs] + except(OSError,IOError): + pass + srcprg=task.inputs[i].srcpath(env) + dst=os.path.join(srcfile,'Contents','MacOS') + debug('osx: copy %s to %s'%(srcprg,dst)) + shutil.copy(srcprg,dst) + debug('osx: generate Info.plist') + f=file(os.path.join(srcfile,"Contents","Info.plist"),"w") + f.write(app_info%os.path.basename(srcprg)) + f.close() + i+=1 + return 0 +def install_shlib(task): + nums=task.vnum.split('.') + path=self.install_path + libname=task.outputs[0].name + name3=libname.replace('.dylib','.%s.dylib'%task.vnum) + name2=libname.replace('.dylib','.%s.dylib'%nums[0]) + name1=libname + filename=task.outputs[0].abspath(task.env) + bld=Build.bld + bld.install_as(path+name3,filename,env=task.env) + bld.symlink_as(path+name2,name3) + bld.symlink_as(path+name1,name3) +def install_target_osx_cshlib(self): + if not Options.is_install:return + if getattr(self,'vnum','')and sys.platform!='win32': + self.link_task.install=install_shlib +Task.task_type_from_func('macapp',vars=[],func=app_build,after="cxx_link cc_link ar_link_static") + +taskgen(create_task_macapp) +taskgen(apply_link_osx) +after('apply_link')(apply_link_osx) +feature('cc','cxx')(apply_link_osx) +taskgen(apply_bundle) +before('apply_link')(apply_bundle) +before('apply_lib_vars')(apply_bundle) +feature('cc','cxx')(apply_bundle) +taskgen(apply_bundle_remove_dynamiclib) +after('apply_link')(apply_bundle_remove_dynamiclib) +feature('cc','cxx')(apply_bundle_remove_dynamiclib) +taskgen(install_target_osx_cshlib) +feature('osx')(install_target_osx_cshlib) +after('install_target_cshlib')(install_target_osx_cshlib) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/bison.py @@ -0,0 +1,18 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import TaskGen +def decide_ext(self,node): + c_ext='.tab.c' + if node.name.endswith('.yc'):c_ext='.tab.cc' + if'-d'in self.env['BISONFLAGS']: + return[c_ext,c_ext.replace('c','h')] + else: + return c_ext +TaskGen.declare_chain(name='bison',action='cd ${SRC[0].bld_dir(env)} && ${BISON} ${BISONFLAGS} ${SRC[0].abspath()} -o ${TGT[0].name}',ext_in='.y .yc .yy',decider=decide_ext,before='cc cxx',) +def detect(conf): + bison=conf.find_program('bison',var='BISON') + if not bison:conf.fatal("bison was not found") + v=conf.env + v['BISONFLAGS']='-d' + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/compiler_d.py @@ -0,0 +1,23 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,sys,imp,types +import Utils,checks,Configure,Options +def detect(conf): + if getattr(Options.options,'check_dmd_first',None): + test_for_compiler=['dmd','gdc'] + else: + test_for_compiler=['gdc','dmd'] + for d_compiler in test_for_compiler: + conf.check_tool(d_compiler) + if conf.env['D_COMPILER']: + conf.check_message("%s"%d_compiler,'',True) + conf.env["COMPILER_D"]=d_compiler + return + conf.check_message("%s"%d_compiler,'',False) +def set_options(opt): + d_compiler_opts=opt.add_option_group("D Compiler Options") + d_compiler_opts.add_option('--check-dmd-first',action="store_true",help='checks for the gdc compiler before dmd (default is the other way round)',dest='check_dmd_first',default=False) + for d_compiler in['gdc','dmd']: + opt.tool_options('%s'%d_compiler,option_group=d_compiler_opts) + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/nasm.py @@ -0,0 +1,31 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os +import TaskGen,Task +from TaskGen import taskgen,before,extension +nasm_str='${NASM} ${NASM_FLAGS} ${NASM_INCLUDES} ${SRC} -o ${TGT}' +EXT_NASM=['.s','.S','.asm','.ASM','.spp','.SPP'] +def apply_nasm_vars(self): + if hasattr(self,'nasm_flags'): + for flag in self.to_list(self.nasm_flags): + self.env.append_value('NASM_FLAGS',flag) + if hasattr(self,'includes'): + for inc in self.to_list(self.includes): + self.env.append_value('NASM_INCLUDES','-I %s'%inc.srcpath(self.env)) +def nasm_file(self,node): + o_node=node.change_ext('.o') + task=self.create_task('nasm') + task.set_inputs(node) + task.set_outputs(o_node) + self.compiled_tasks.append(task) + self.meths.append('apply_nasm_vars') +Task.simple_task_type('nasm',nasm_str,color='BLUE',ext_out='.o') +def detect(conf): + nasm=conf.find_program('nasm',var='NASM') + if not nasm:nasm=conf.find_program('yasm',var='NASM') + if not nasm:conf.fatal('could not find nasm (or yasm), install it or set PATH env var') + +taskgen(apply_nasm_vars) +before('apply_link')(apply_nasm_vars) +extension(EXT_NASM)(nasm_file) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/sunc++.py @@ -0,0 +1,66 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,optparse +import Utils,Options,Configure +import ccroot,ar +from Configure import conftest +def find_sxx(conf): + v=conf.env + cc=None + if v['CXX']:cc=v['CXX'] + elif'CXX'in os.environ:cc=os.environ['CXX'] + if not cc:cc=conf.find_program('c++',var='CXX') + if not cc:conf.fatal('sunc++ was not found') + v['CXX']=cc + v['CXX_NAME']='sun' +def sxx_common_flags(conf): + v=conf.env + v['CXX_SRC_F']='' + v['CXX_TGT_F']='-c -o ' + v['CPPPATH_ST']='-I%s' + if not v['LINK_CXX']:v['LINK_CXX']=v['CXX'] + v['CXXLNK_SRC_F']='' + v['CXXLNK_TGT_F']='-o ' + v['LIB_ST']='-l%s' + v['LIBPATH_ST']='-L%s' + v['STATICLIB_ST']='-l%s' + v['STATICLIBPATH_ST']='-L%s' + v['CXXDEFINES_ST']='-D%s' + v['SHLIB_MARKER']='-Bdynamic' + v['STATICLIB_MARKER']='-Bstatic' + v['program_PATTERN']='%s' + v['shlib_CXXFLAGS']=['-Kpic','-DPIC'] + v['shlib_LINKFLAGS']=['-G'] + v['shlib_PATTERN']='lib%s.so' + v['staticlib_LINKFLAGS']=['-Bstatic'] + v['staticlib_PATTERN']='lib%s.a' +def sxx_modifier_debug(conf,kind='cpp'): + v=conf.env + v['CXXFLAGS']=[''] + if conf.check_flags('-O2',kind=kind): + v['CXXFLAGS_OPTIMIZED']=['-O2'] + v['CXXFLAGS_RELEASE']=['-O2'] + if conf.check_flags('-g -DDEBUG',kind=kind): + v['CXXFLAGS_DEBUG']=['-g','-DDEBUG'] + if conf.check_flags('-g3 -O0 -DDEBUG',kind=kind): + v['CXXFLAGS_ULTRADEBUG']=['-g3','-O0','-DDEBUG'] + try: + debug_level=Options.options.debug_level.upper() + except AttributeError: + debug_level=ccroot.DEBUG_LEVELS.CUSTOM + v.append_value('CXXFLAGS',v['CXXFLAGS_'+debug_level]) +detect=''' +find_sxx +find_cpp +find_ar +sxx_common_flags +cxx_load_tools +cxx_check_features +sxx_modifier_debug +cxx_add_flags +''' + +conftest(find_sxx) +conftest(sxx_common_flags) +conftest(sxx_modifier_debug) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/gnome.py @@ -0,0 +1,293 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,re +import TaskGen,Utils,Runner,Task,Build,Options,Logs +import cc +from Logs import error +from TaskGen import taskgen,before,after,feature +n1_regexp=re.compile('(.*)',re.M) +n2_regexp=re.compile('(.*)',re.M) +def postinstall_schemas(prog_name): + if Options.commands['install']: + dir=Build.bld.get_install_path('${PREFIX}/etc/gconf/schemas/%s.schemas'%prog_name) + if not Options.options.destdir: + Utils.pprint('YELLOW','Installing GConf schema') + command='gconftool-2 --install-schema-file=%s 1> /dev/null'%dir + ret=Runner.exec_command(command) + else: + Utils.pprint('YELLOW','GConf schema not installed. After install, run this:') + Utils.pprint('YELLOW','gconftool-2 --install-schema-file=%s'%dir) +def postinstall_icons(): + dir=Build.bld.get_install_path('${DATADIR}/icons/hicolor') + if Options.commands['install']: + if not Options.options.destdir: + Utils.pprint('YELLOW',"Updating Gtk icon cache.") + command='gtk-update-icon-cache -q -f -t %s'%dir + ret=Runner.exec_command(command) + else: + Utils.pprint('YELLOW','Icon cache not updated. 
After install, run this:') + Utils.pprint('YELLOW','gtk-update-icon-cache -q -f -t %s'%dir) +def postinstall_scrollkeeper(prog_name): + if Options.commands['install']: + if os.path.iswriteable('/var/log/scrollkeeper.log'): + dir1=Build.bld.get_install_path('${PREFIX}/var/scrollkeeper') + dir2=Build.bld.get_install_path('${DATADIR}/omf/%s'%prog_name) + command='scrollkeeper-update -q -p %s -o %s'%(dir1,dir2) + ret=Runner.exec_command(command) +def postinstall(prog_name='myapp',schemas=1,icons=1,scrollkeeper=1): + if schemas:postinstall_schemas(prog_name) + if icons:postinstall_icons() + if scrollkeeper:postinstall_scrollkeeper(prog_name) +class gnome_doc_taskgen(TaskGen.task_gen): + def __init__(self,*k): + TaskGen.task_gen.__init__(self,*k) + self.default_install_path='${PREFIX}/share' + def apply(self): + self.env['APPNAME']=self.doc_module + lst=self.to_list(self.doc_linguas) + for x in lst: + tsk=self.create_task('xml2po') + node=self.path.find_resource(x+'/'+x+'.po') + src=self.path.find_resource('C/%s.xml'%self.doc_module) + out=self.path.find_or_declare('%s/%s.xml'%(x,self.doc_module)) + tsk.set_inputs([node,src]) + tsk.set_outputs(out) + tsk2=self.create_task('xsltproc2po') + out2=self.path.find_or_declare('%s/%s-%s.omf'%(x,self.doc_module,x)) + tsk2.set_outputs(out2) + node=self.path.find_resource(self.doc_module+".omf.in") + tsk2.inputs=[node,out] + tsk2.run_after.append(tsk) + if Options.is_install: + path=self.install_path+'gnome/help/%s/%s'%(self.doc_module,x) + Build.bld.install_files(self.install_path+'omf',out2.abspath(self.env)) + for y in self.to_list(self.doc_figures): + try: + os.stat(self.path.abspath()+'/'+x+'/'+y) + Common.install_as(path+'/'+y,self.path.abspath()+'/'+x+'/'+y) + except: + Common.install_as(path+'/'+y,self.path.abspath()+'/C/'+y) + Common.install_as(path+'/%s.xml'%self.doc_module,out.abspath(self.env)) +class xml_to_taskgen(TaskGen.task_gen): + def __init__(self): + TaskGen.task_gen(self) + self.source='xmlfile' + self.xslt='xlsltfile' + self.target='hey' + self.default_install_path='${PREFIX}' + self.task_created=None + def apply(self): + self.env=self.env.copy() + tree=Build.bld + xmlfile=self.path.find_resource(self.source) + xsltfile=self.path.find_resource(self.xslt) + tsk=self.create_task('xmlto') + tsk.set_inputs([xmlfile,xsltfile]) + tsk.set_outputs(xmlfile.change_ext('html')) + tsk.install_path=self.install_path +def sgml_scan(self): + node=self.inputs[0] + env=self.env + variant=node.variant(env) + fi=open(node.abspath(env),'r') + content=fi.read() + fi.close() + name=n1_regexp.findall(content)[0] + num=n2_regexp.findall(content)[0] + doc_name=name+'.'+num + return([],[doc_name]) +def sig_implicit_deps(self): + def sgml_outputs(): + dps=Build.bld.raw_deps[self.unique_id()] + name=dps[0] + self.set_outputs(self.task_generator.path.find_or_declare(name)) + tree=Build.bld + key=self.unique_id() + prev_sigs=tree.task_sigs.get(key,()) + if prev_sigs and prev_sigs[2]==self.compute_sig_implicit_deps(): + sgml_outputs() + return prev_sigs[2] + (nodes,names)=self.scan() + if Logs.verbose and Logs.zones: + debug('deps: scanner for %s returned %s %s'%(str(self),str(nodes),str(names))) + tree=Build.bld + tree.node_deps[self.unique_id()]=nodes + tree.raw_deps[self.unique_id()]=names + sgml_outputs() + sig=self.compute_sig_implicit_deps() + return sig +class gnome_sgml2man_taskgen(TaskGen.task_gen): + def __init__(self,*k,**kw): + TaskGen.task_gen.__init__(self) + self.tasks=[] + self.appname=k[0] + def apply(self): + def install_result(task): + 
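# This nested install hook installs each man page produced by the sgml2man task:
# the man section is taken from the last character of the output file name, so a
# hypothetical foo.1 ends up under ${DATADIR}/man/man1/.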
out=task.outputs[0] + name=out.name + ext=name[-1] + env=task.env + Build.bld.install_files('DATADIR','man/man%s/'%ext,out.abspath(env),env) + tree=Build.bld + tree.rescan(self.path) + for name in Build.bld.cache_dir_contents[self.path.id]: + base,ext=os.path.splitext(name) + if ext!='.sgml':continue + task=self.create_task('sgml2man') + task.set_inputs(self.path.find_resource(name)) + task.task_generator=self + if Options.is_install:task.install=install_result + task.scan() +def add_marshal_file(self,filename,prefix,mode): + if not hasattr(self,'marshal_lst'):self.marshal_lst=[] + self.meths.append('process_marshal') + self.marshal_lst.append([filename,prefix,mode]) +def process_marshal(self): + for i in getattr(self,'marshal_lst',[]): + env=self.env.copy() + node=self.path.find_resource(i[0]) + if not node: + raise Utils.WafError('file not found on gnome obj '+i[0]) + if i[2]=='--header': + env['GGM_PREFIX']=i[1] + env['GGM_MODE']=i[2] + task=self.create_task('glib_genmarshal',env) + task.set_inputs(node) + task.set_outputs(node.change_ext('.h')) + elif i[2]=='--body': + env['GGM_PREFIX']=i[1] + env['GGM_MODE']=i[2] + outnode=node.change_ext('.c') + self.allnodes.append(outnode) + task=self.create_task('glib_genmarshal',env) + task.set_inputs(node) + task.set_outputs(node.change_ext('.c')) + else: + error("unknown type for marshal "+i[2]) +def add_dbus_file(self,filename,prefix,mode): + if not hasattr(self,'dbus_lst'):self.dbus_lst=[] + self.meths.append('process_dbus') + self.dbus_lst.append([filename,prefix,mode]) +def process_dbus(self): + for i in getattr(self,'dbus_lst',[]): + env=self.env.copy() + node=self.path.find_resource(i[0]) + if not node: + raise Utils.WafError('file not found on gnome obj '+i[0]) + env['DBT_PREFIX']=i[1] + env['DBT_MODE']=i[2] + task=self.create_task('dbus_binding_tool',env) + task.set_inputs(node) + task.set_outputs(node.change_ext('.h')) +def process_enums(self): + for x in getattr(self,'mk_enums',[]): + env=self.env.copy() + task=self.create_task('mk_enums',env) + inputs=[] + src_lst=self.to_list(x['source']) + if not src_lst: + raise Utils.WafError('missing source '+str(x)) + src_lst=[self.path.find_resource(k)for k in src_lst] + inputs+=src_lst + env['MK_SOURCE']=[k.abspath(env)for k in src_lst] + if not x['target']: + raise Utils.WafError('missing target '+str(x)) + tgt_node=self.path.find_or_declare(x['target']) + if tgt_node.name.endswith('.c'): + self.allnodes.append(tgt_node) + env['MK_TARGET']=tgt_node.abspath(env) + if x['template']: + template_node=self.path.find_resource(x['template']) + env['MK_TEMPLATE']='--template %s'%(template_node.abspath(env)) + inputs.append(template_node) + task.set_inputs(inputs) + task.set_outputs(tgt_node) +def add_glib_mkenum(self,source='',template='',target=''): + if not hasattr(self,'mk_enums'):self.mk_enums=[] + self.meths.append('process_enums') + self.mk_enums.append({'source':source,'template':template,'target':target}) +Task.simple_task_type('mk_enums','${GLIB_MKENUM} ${MK_TEMPLATE} ${MK_SOURCE} > ${MK_TARGET}','PINK',before='cc') +cls=Task.simple_task_type('sgml2man','${SGML2MAN} -o ${TGT[0].bld_dir(env)} ${SRC} > /dev/null',color='BLUE') +cls.scan=sgml_scan +cls.sig_implicit_deps=sig_implicit_deps +cls.quiet=1 +Task.simple_task_type('glib_genmarshal','${GGM} ${SRC} --prefix=${GGM_PREFIX} ${GGM_MODE} > ${TGT}',color='BLUE',before='cc') +Task.simple_task_type('dbus_binding_tool','${DBT} --prefix=${DBT_PREFIX} --mode=${DBT_MODE} --output=${TGT} ${SRC}',color='BLUE',before='cc') 
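# A rough usage sketch for the two helpers registered at the end of this tool via
# taskgen(), called on a C task generator 'obj' (file names, prefixes and the
# dbus-binding-tool mode are hypothetical):
#
#     obj.add_marshal_file('marshal.list', 'foo_marshal', '--header')    # glib-genmarshal header
#     obj.add_marshal_file('marshal.list', 'foo_marshal', '--body')      # glib-genmarshal body, compiled as C
#     obj.add_dbus_file('foo-service.xml', 'foo_service', 'glib-server') # dbus-binding-tool stubs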
+Task.simple_task_type('xmlto','${XMLTO} html -m ${SRC[1].abspath(env)} ${SRC[0].abspath(env)}') +Task.simple_task_type('xml2po','${XML2PO} ${XML2POFLAGS} ${SRC} > ${TGT}',color='BLUE') +xslt_magic="""${XSLTPROC2PO} -o ${TGT[0].abspath(env)} \ +--stringparam db2omf.basename ${APPNAME} \ +--stringparam db2omf.format docbook \ +--stringparam db2omf.lang C \ +--stringparam db2omf.dtd '-//OASIS//DTD DocBook XML V4.3//EN' \ +--stringparam db2omf.omf_dir ${PREFIX}/share/omf \ +--stringparam db2omf.help_dir ${PREFIX}/share/gnome/help \ +--stringparam db2omf.omf_in ${SRC[0].abspath(env)} \ +--stringparam db2omf.scrollkeeper_cl ${SCROLLKEEPER_DATADIR}/Templates/C/scrollkeeper_cl.xml \ +${DB2OMF} ${SRC[1].abspath(env)}""" +Task.simple_task_type('xsltproc2po',xslt_magic,color='BLUE') +def detect(conf): + conf.check_tool('checks') + sgml2man=conf.find_program('docbook2man',var='SGML2MAN') + glib_genmarshal=conf.find_program('glib-genmarshal',var='GGM') + dbus_binding_tool=conf.find_program('dbus-binding-tool',var='DBT') + mk_enums_tool=conf.find_program('glib-mkenums',var='GLIB_MKENUM') + def getstr(varname): + return getattr(Options.options,varname,'') + prefix=conf.env['PREFIX'] + datadir=getstr('datadir') + libdir=getstr('libdir') + sysconfdir=getstr('sysconfdir') + localstatedir=getstr('localstatedir') + if not datadir:datadir=os.path.join(prefix,'share') + if not libdir:libdir=os.path.join(prefix,'lib') + if not sysconfdir: + if os.path.normpath(prefix)=='/usr': + sysconfdir='/etc' + else: + sysconfdir=os.path.join(prefix,'etc') + if not localstatedir: + if os.path.normpath(prefix)=='/usr': + localstatedir='/var' + else: + localstatedir=os.path.join(prefix,'var') + conf.define('GNOMELOCALEDIR',os.path.join(datadir,'locale')) + conf.define('DATADIR',datadir) + conf.define('LIBDIR',libdir) + conf.define('SYSCONFDIR',sysconfdir) + conf.define('LOCALSTATEDIR',localstatedir) + xml2po=conf.find_program('xml2po',var='XML2PO') + xsltproc2po=conf.find_program('xsltproc',var='XSLTPROC2PO') + conf.env['XML2POFLAGS']='-e -p' + conf.env['SCROLLKEEPER_DATADIR']=Utils.cmd_output("scrollkeeper-config --pkgdatadir").strip() + conf.env['DB2OMF']=Utils.cmd_output("/usr/bin/pkg-config --variable db2omf gnome-doc-utils").strip() + conf.define('ENABLE_NLS',1) + conf.define('HAVE_BIND_TEXTDOMAIN_CODESET',1) + conf.define('HAVE_DCGETTEXT',1) + conf.check_header('dlfcn.h','HAVE_DLFCN_H') + conf.define('HAVE_GETTEXT',1) + conf.check_header('inttypes.h','HAVE_INTTYPES_H') + conf.check_header('locale.h','HAVE_LOCALE_H') + conf.check_header('memory.h','HAVE_MEMORY_H') + conf.check_header('stdint.h','HAVE_STDINT_H') + conf.check_header('stdlib.h','HAVE_STDLIB_H') + conf.check_header('strings.h','HAVE_STRINGS_H') + conf.check_header('string.h','HAVE_STRING_H') + conf.check_header('sys/stat.h','HAVE_SYS_STAT_H') + conf.check_header('sys/types.h','HAVE_SYS_TYPES_H') + conf.check_header('unistd.h','HAVE_UNISTD_H') +def set_options(opt): + opt.add_option('--want-rpath',type='int',default=1,dest='want_rpath',help='set rpath to 1 or 0 [Default 1]') + for i in"execprefix datadir libdir sysconfdir localstatedir".split(): + opt.add_option('--'+i,type='string',default='',dest=i) + +taskgen(add_marshal_file) +taskgen(process_marshal) +before('apply_core')(process_marshal) +taskgen(add_dbus_file) +taskgen(process_dbus) +before('apply_core')(process_dbus) +taskgen(process_enums) +before('apply_core')(process_enums) +taskgen(add_glib_mkenum) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/UnitTest.py @@ -0,0 +1,116 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,sys +import Build,TaskGen,Utils,Options,Logs +import pproc +class unit_test(object): + def __init__(self): + self.returncode_ok=0 + self.num_tests_ok=0 + self.num_tests_failed=0 + self.num_tests_err=0 + self.total_num_tests=0 + self.max_label_length=0 + self.unit_tests={} + self.unit_test_results={} + self.unit_test_erroneous={} + self.change_to_testfile_dir=False + self.want_to_see_test_output=False + self.want_to_see_test_error=False + self.run_if_waf_does='check' + def run(self): + self.num_tests_ok=0 + self.num_tests_failed=0 + self.num_tests_err=0 + self.total_num_tests=0 + self.max_label_length=0 + self.unit_tests={} + self.unit_test_results={} + self.unit_test_erroneous={} + if not Options.commands[self.run_if_waf_does]:return + for obj in Build.bld.all_task_gen: + if not hasattr(obj,'unit_test'):continue + unit_test=getattr(obj,'unit_test') + if not unit_test:continue + try: + if obj.type=='program': + output=obj.path + filename=os.path.join(output.abspath(obj.env),obj.target) + srcdir=output.abspath() + label=os.path.join(output.bldpath(obj.env),obj.target) + self.max_label_length=max(self.max_label_length,len(label)) + self.unit_tests[label]=(filename,srcdir) + except KeyError: + pass + self.total_num_tests=len(self.unit_tests) + Utils.pprint('GREEN','Running the unit tests') + count=0 + result=1 + for label,file_and_src in self.unit_tests.iteritems(): + filename=file_and_src[0] + srcdir=file_and_src[1] + count+=1 + line=Build.bld.progress_line(count,self.total_num_tests,Logs.colors.GREEN,Logs.colors.NORMAL) + if Options.options.progress_bar and line: + sys.stdout.write(line) + sys.stdout.flush() + try: + kwargs={} + if self.change_to_testfile_dir: + kwargs['cwd']=srcdir + if not self.want_to_see_test_output: + kwargs['stdout']=pproc.PIPE + if not self.want_to_see_test_error: + kwargs['stderr']=pproc.PIPE + pp=pproc.Popen(filename,**kwargs) + pp.wait() + result=int(pp.returncode==self.returncode_ok) + if result: + self.num_tests_ok+=1 + else: + self.num_tests_failed+=1 + self.unit_test_results[label]=result + self.unit_test_erroneous[label]=0 + except OSError: + self.unit_test_erroneous[label]=1 + self.num_tests_err+=1 + except KeyboardInterrupt: + pass + if Options.options.progress_bar:sys.stdout.write(Logs.colors.cursor_on) + def print_results(self): + if not Options.commands[self.run_if_waf_does]:return + p=Utils.pprint + if self.total_num_tests==0: + p('YELLOW','No unit tests present') + return + p('GREEN','Running unit tests') + print + for label,filename in self.unit_tests.iteritems(): + err=0 + result=0 + try:err=self.unit_test_erroneous[label] + except KeyError:pass + try:result=self.unit_test_results[label] + except KeyError:pass + n=self.max_label_length-len(label) + if err:n+=4 + elif result:n+=7 + else:n+=3 + line='%s %s'%(label,'.'*n) + print line, + if err:p('RED','ERROR') + elif result:p('GREEN','OK') + else:p('YELLOW','FAILED') + percentage_ok=float(self.num_tests_ok)/float(self.total_num_tests)*100.0 + percentage_failed=float(self.num_tests_failed)/float(self.total_num_tests)*100.0 + percentage_erroneous=float(self.num_tests_err)/float(self.total_num_tests)*100.0 + print''' +Successful tests: %i (%.1f%%) +Failed tests: %i (%.1f%%) +Erroneous tests: %i (%.1f%%) + +Total number of tests: %i +'''%(self.num_tests_ok,percentage_ok,self.num_tests_failed,percentage_failed,self.num_tests_err,percentage_erroneous,self.total_num_tests) + p('GREEN','Unit tests finished') + --- /dev/null +++ 
radare-1.5.2/wafadmin/Tools/DirWatch.py @@ -0,0 +1,332 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import select,errno,os,time +module=None +def check_support(): + try: + import gamin + except ImportError: + pass + else: + try: + test=gamin.WatchMonitor() + test.disconnect() + test=None + except: + pass + else: + module=gamin + try: + import _fam + except ImportError: + support=False + else: + try: + test=_fam.open() + test.close() + test=None + except: + pass + else: + module=_fam +g_daemonlock=0 +def call_back(idxName,pathName,event): + global g_daemonlock + if g_daemonlock:return + g_daemonlock=1 + try: + main() + except Utils.WafError,e: + error(e) + g_daemonlock=0 +def start_daemon(): + global g_dirwatch + if not g_dirwatch: + g_dirwatch=DirectoryWatcher() + dirs=[] + for nodeDir in Build.bld.srcnode.dirs(): + tmpstr="%s"%nodeDir + tmpstr="%s"%(tmpstr[6:]) + dirs.append(tmpstr) + g_dirwatch.add_watch("tmp Test",call_back,dirs) + g_dirwatch.loop() + g_dirwatch=None + else: + g_dirwatch.suspend_all_watch() + dirs=[] + for nodeDir in Build.bld.srcnode.dirs(): + tmpstr="%s"%nodeDir + tmpstr="%s"%(tmpstr[6:]) + dirs.append(tmpstr) + g_dirwatch.add_watch("tmp Test",call_back,dirs) +class WatchObject: + def __init__(self,idxName,namePath,isDir,callBackThis,handleEvents): + self._adaptor=None + self._fr=None + self._idxName=idxName + self._name=namePath + self._isDir=isDir + self._callBackThis=callBackThis + self._handleEvents=handleEvents + def __del__(self): + self.unwatch() + def watch(self,adaptor): + self._adaptor=adaptor + if self._fr!=None: + self.unwatch() + if self._isDir: + self._fr=self._adaptor.watch_directory(self._name,self._idxName) + else: + self._fr=self._adaptor.watch_file(self._name,self._idxName) + def unwatch(self): + if self._fr: + self._fr=self._adaptor.stop_watch(self._name) + def get_events(self): + return self._handleEvents + def get_callback(self): + return self._callBackThis + def get_fullpath(self,fileName): + return os.path.join(self._name,fileName) + def __str__(self): + if self._isDir: + return'DIR %s: '%self._name + else: + return'FILE %s: '%self._name +class DirectoryWatcher: + def __init__(self): + self._adaptor=None + self._watcher={} + self._loops=True + self.connect() + def __del__(self): + self.disconnect() + def _raise_disconnected(self): + raise"Already disconnected" + def disconnect(self): + if self._adaptor: + self.suspend_all_watch() + self._adaptor=None + def connect(self): + if self._adaptor: + self.disconnect() + global module + if not module: + self.check_support() + if module.__name__=="fam": + self._adaptor=FamAdaptor(self._processDirEvents) + elif module.__name__=="gamin": + self._adaptor=GaminAdaptor(self._processDirEvents) + else: + self._adaptor=FallbackAdaptor(self._processDirEvents) + def add_watch(self,idxName,callBackThis,dirList,handleEvents=['changed','deleted','created']): + self.remove_watch(idxName) + self._watcher[idxName]=[] + for directory in dirList: + watchObject=WatchObject(idxName,os.path.abspath(directory),1,callBackThis,handleEvents) + self._watcher[idxName].append(watchObject) + self.resume_watch(idxName) + def remove_watch(self,idxName): + if self._watcher.has_key(idxName): + self.suspend_watch(idxName) + del self._watcher[idxName] + def remove_all_watch(self): + self._watcher={} + def suspend_watch(self,idxName): + if self._watcher.has_key(idxName): + for watchObject in self._watcher[idxName]: + watchObject.unwatch() + def suspend_all_watch(self): + for idxName in self._watcher.keys(): + 
self.suspend_watch(idxName) + def resume_watch(self,idxName): + for watchObject in self._watcher[idxName]: + watchObject.watch(self._adaptor) + def resume_all_watch(self): + for idxName in self._watcher.keys(): + self.resume_watch(idxName) + def _processDirEvents(self,pathName,event,idxName): + if event in self._watcher[idxName][0].get_events(): + self.suspend_watch(idxName) + _watcher=self._watcher[idxName][0] + _watcher.get_callback()(idxName,_watcher.get_fullpath(pathName),event) + self.resume_watch(idxName) + def request_end_loop(self): + self._loops=False + def loop(self): + try: + self._loops=True + while self._loops and self._adaptor!=None: + self._adaptor.wait_for_event() + while self._adaptor.event_pending(): + self._adaptor.handle_events() + if not self._loops: + break + except KeyboardInterrupt: + self.request_end_loop() +class adaptor(object): + def __init__(self,event_handler): + self.event_handler=event_handler + self.watch_handler={} + def __del__(self): + if self.data: + for handle in self.watch_handler.keys(): + self.stop_watch(handle) + def event_pending(self): + self.check_init() + return self.data.event_pending() + def watch_directory(self,name,idxName): + self.check_init() + if self.watch_handler.has_key(name): + raise"dir already watched" + self.watch_handler[name]=self.do_watch_directory(name,idxName) + return self.watch_handler[name] + def watch_file(self,name,idxName): + self.check_init() + if self.watch_handler.has_key(name): + raise"file already watched" + self.watch_handler[name]=self.do_watch_file(name,idxName) + return self.watch_handler[name] + def stop_watch(self,name): + self.check_init() + if self.watch_handler.has_key(name): + self.do_stop_watch(name) + del self.watch_handler[name] + return None + def handle_events(self): + self.check_init() + self.data.handle_events() + def check_init(self): + if not self.data: + raise OSError,"Adapter not initialized" +class FamAdaptor(DirWatch.adaptor): + def __init__(self,event_handler): + DirWatch.adaptor.__init__(self,event_handler) + global module + self.data=module.open() + def __del__(self): + DirWatch.adaptor.__del__(self) + if self.data:self.data.close() + def do_add_watch_dir(self,name,idx_name): + return self.data.monitorDirectory(name,idxName) + def do_add_watch_file(self,name,idx_name): + return self.data.monitorFile(name,idxName) + def do_stop_watch(self,name): + self.watch_handler[name].cancelMonitor() + def wait_for_event(self): + self.check_init() + try: + select.select([self.data],[],[]) + except select.error,er: + errnumber,strerr=er + if errnumber!=errno.EINTR: + raise strerr + def handle_events(self): + self.check_init() + fe=self.data.nextEvent() + self._eventHandler(fe.filename,fe.code2str(),fe.userData) +class GaminAdaptor(adaptor): + def __init__(self,eventHandler): + adaptor.__init__(self,event_handler) + global module + self.data=module.WatchMonitor() + def __del__(self): + adaptor.__del__(self) + if self.data:self.data.disconnect() + def check_init(self): + if self._gamin==None: + raise"gamin not init" + def _code2str(self,event): + gaminCodes={1:"changed",2:"deleted",3:"StartExecuting",4:"StopExecuting",5:"created",6:"moved",7:"acknowledge",8:"exists",9:"endExist"} + try: + return gaminCodes[event] + except KeyError: + return"unknown" + def _eventhandler_helper(self,pathName,event,idxName): + self._eventHandler(pathName,self._code2str(event),idxName) + def do_add_watch_dir(self,name,idx_name): + return self.data.watch_directory(name,self._eventhandler_helper,idxName) + def 
do_add_watch_file(self,name,idx_name): + return self.data.watch_directory(name,self._eventhandler_helper,idxName) + def do_stop_watch(self,name): + self.data.stop_watch(name) + def wait_for_event(self): + self.check_init() + try: + select.select([self._gamin.get_fd()],[],[]) + except select.error,er: + errnumber,strerr=er + if errnumber!=errno.EINTR: + raise strerr +class Fallback: + class Helper: + def __init__(self,callBack,userdata): + self.currentFiles={} + self.oldFiles={} + self._firstRun=True + self.callBack=callBack + self.userdata=userdata + def isFirstRun(self): + if self._firstRun: + self._firstRun=False + return True + else: + return False + def __init__(self): + self._dirs={} + self._changeLog={} + def _traversal(self,dirName): + files=os.listdir(dirName) + firstRun=self._dirs[dirName].isFirstRun() + for filename in files: + path=os.path.join(dirName,filename) + try: + fileStat=os.stat(path) + except os.error: + continue + modifyTime=self._dirs[dirName].oldFiles.get(path) + if modifyTime is not None: + del self._dirs[dirName].oldFiles[path] + if fileStat.st_mtime>modifyTime: + self._changeLog[path]='changed' + else: + if firstRun: + self._changeLog[path]='exists' + else: + self._changeLog[path]='created' + self._dirs[dirName].currentFiles[path]=fileStat.st_mtime + def watch_directory(self,namePath,callBack,idxName): + self._dirs[namePath]=self.Helper(callBack,idxName) + return self + def unwatch_directory(self,namePath): + if self._dirs.get(namePath): + del self._dirs[namePath] + def event_pending(self): + for dirName in self._dirs.keys(): + self._dirs[dirName].oldFiles=self._dirs[dirName].currentFiles.copy() + self._dirs[dirName].currentFiles={} + self._traversal(dirName) + for deletedFile in self._dirs[dirName].oldFiles.keys(): + self._changeLog[deletedFile]='deleted' + del self._dirs[dirName].oldFiles[deletedFile] + return len(self._changeLog) + def handle_events(self): + pathName=self._changeLog.keys()[0] + event=self._changeLog[pathName] + dirName=os.path.dirname(pathName) + self._dirs[dirName].callBack(pathName,event,self._dirs[dirName].userdata) + del self._changeLog[pathName] +class FallbackAdaptor(DirWatch.adaptor): + def __init__(self,eventHandler): + DirWatch.adaptor.__init__(self,event_handler) + self.data=Fallback() + def do_add_watch_dir(self,name,idx_name): + return self.data.watch_directory(name,self._eventHandler,idxName) + def do_add_watch_file(self,name,idx_name): + return self.data.watch_directory(name,self._eventHandler,idxName) + def do_stop_watch(self,name): + self.data.unwatch_directory(name) + def wait_for_event(self): + self.check_init() + time.sleep(1) + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/gnu_dirs.py @@ -0,0 +1,70 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import re +import Utils,Options +_options=[x.split(', ')for x in''' +bindir, user executables, $(EXEC_PREFIX)/bin +sbindir, system admin executables, $(EXEC_PREFIX)/sbin +libexecdir, program executables, $(EXEC_PREFIX)/libexec +sysconfdir, read-only single-machine data, $(PREFIX)/etc +sharedstatedir, modifiable architecture-independent data, $(PREFIX)/com +localstatedir, modifiable single-machine data, $(PREFIX)/var +libdir, object code libraries, $(EXEC_PREFIX)/lib +includedir, C header files, $(PREFIX)/include +oldincludedir, C header files for non-gcc, /usr/include +datarootdir, read-only arch.-independent data root, $(PREFIX)/share +datadir, read-only architecture-independent data, $(DATAROOTDIR) +infodir, info documentation, $(DATAROOTDIR)/info +localedir, locale-dependent data, $(DATAROOTDIR)/locale +mandir, man documentation, $(DATAROOTDIR)/man +docdir, documentation root, $(DATAROOTDIR)/doc/$(PACKAGE) +htmldir, html documentation, $(DOCDIR) +dvidir, dvi documentation, $(DOCDIR) +pdfdir, pdf documentation, $(DOCDIR) +psdir, ps documentation, $(DOCDIR) +'''.split('\n')if x] +re_var=re.compile(r'\$\(([a-zA-Z0-9_]+)\)') +def subst_vars(foo,vars): + def repl(m): + s=m.group(1) + return s and''+vars[s]or'' + return re_var.sub(repl,foo) +def detect(conf): + def get_param(varname,default): + return getattr(Options.options,varname,'')or default + env=conf.env + env['EXEC_PREFIX']=get_param('EXEC_PREFIX',env['PREFIX']) + env['PACKAGE']=Utils.g_module.APPNAME or env['PACKAGE'] + complete=False + iter=0 + while not complete and iter(0,1,6): + cmd=[valac,'-C','--quiet',vala_flags] + else: + cmd=[valac,'-C',vala_flags] + if task.threading: + cmd.append('--thread') + if task.output_type in('shlib','staticlib'): + cmd.append('--library '+task.target) + cmd.append('--basedir '+top_src) + cmd.append('-d '+top_bld) + else: + output_dir=task.outputs[0].bld_dir(env) + cmd.append('-d %s'%output_dir) + for vapi_dir in task.vapi_dirs: + cmd.append('--vapidir=%s'%vapi_dir) + for package in task.packages: + cmd.append('--pkg %s'%package) + cmd.append(" ".join(inputs)) + result=Runner.exec_command(" ".join(cmd)) + if task.output_type in('shlib','staticlib'): + if task.packages: + filename=os.path.join(task.outputs[0].bld_dir(env),"%s.deps"%task.target) + deps=open(filename,'w') + for package in task.packages: + deps.write(package+'\n') + deps.close() + try: + src_vapi=os.path.join(top_bld,"..","%s.vapi"%task.target) + dst_vapi=task.outputs[0].bld_dir(env) + shutil.move(src_vapi,dst_vapi) + except IOError: + pass + try: + src_vapi=os.path.join(top_bld,"%s.vapi"%task.target) + dst_vapi=task.outputs[0].bld_dir(env) + shutil.move(src_vapi,dst_vapi) + except IOError: + pass + try: + src_gidl=os.path.join(top_bld,"%s.gidl"%task.target) + dst_gidl=task.outputs[0].bld_dir(env) + shutil.move(src_gidl,dst_gidl) + except IOError: + pass + return result +def vala_file(self,node): + valatask=getattr(self,"valatask",None) + if not valatask: + valatask=self.create_task('valac') + self.valatask=valatask + valatask.output_type=self.type + valatask.packages=[] + valatask.vapi_dirs=[] + valatask.target=self.target + valatask.threading=False + if hasattr(self,'packages'): + valatask.packages=Utils.to_list(self.packages) + if hasattr(self,'vapi_dirs'): + vapi_dirs=Utils.to_list(self.vapi_dirs) + for vapi_dir in vapi_dirs: + try: + valatask.vapi_dirs.append(self.path.find_dir(vapi_dir).abspath()) + valatask.vapi_dirs.append(self.path.find_dir(vapi_dir).abspath(self.env)) + except 
AttributeError: + Logs.warn("Unable to locate Vala API directory: '%s'"%vapi_dir) + if hasattr(self,'threading'): + valatask.threading=self.threading + self.uselib=self.to_list(self.uselib) + if not'GTHREAD'in self.uselib: + self.uselib.append('GTHREAD') + env=valatask.env + output_nodes=[] + output_nodes.append(node.change_ext('.c')) + output_nodes.append(node.change_ext('.h')) + if self.type!='program': + output_nodes.append(self.path.find_or_declare('%s.vapi'%self.target)) + if env['VALAC_VERSION']>(0,1,7): + output_nodes.append(self.path.find_or_declare('%s.gidl'%self.target)) + if valatask.packages: + output_nodes.append(self.path.find_or_declare('%s.deps'%self.target)) + valatask.inputs.append(node) + valatask.outputs.extend(output_nodes) + self.allnodes.append(node.change_ext('.c')) +def detect(conf): + min_version=(0,1,6) + min_version_str="%d.%d.%d"%min_version + valac=conf.find_program('valac',var='VALAC') + if not valac: + conf.fatal("valac not found") + return + if not conf.env["HAVE_GTHREAD"]: + conf.check_pkg('gthread-2.0',destvar='GTHREAD',mandatory=False) + try: + output=Popen([valac,"--version"],stdout=PIPE).communicate()[0] + version=output.split(' ',1)[-1].strip().split(".") + version=[int(x)for x in version] + valac_version=tuple(version) + except Exception: + valac_version=(0,0,0) + conf.check_message('program version','valac >= '+min_version_str,valac_version>=min_version,"%d.%d.%d"%valac_version) + if valac_versioninclude|import|bringin){(?P[^{}]*)}',re.M) +def scan(self): + node=self.inputs[0] + env=self.env + nodes=[] + names=[] + if not node:return(nodes,names) + fi=open(node.abspath(env),'r') + code=fi.read() + fi.close() + curdirnode=self.curdirnode + abs=curdirnode.abspath() + for match in re_tex.finditer(code): + path=match.group('file') + if path: + for k in['','.tex','.ltx']: + debug('tex: trying %s%s'%(path,k)) + try: + os.stat(abs+os.sep+path+k) + except OSError: + continue + found=path+k + node=curdirnode.find_resource(found) + if node: + nodes.append(node) + else: + debug('tex: could not find %s'%path) + names.append(path) + debug("tex: found the following : %s and names %s"%(nodes,names)) + return(nodes,names) +g_bibtex_re=re.compile('bibdata',re.M) +def tex_build(task,command='LATEX'): + env=task.env + com='%s %s'%(env[command],env.get_flat(command+'FLAGS')) + if not env['PROMPT_LATEX']:com="%s %s"%(com,'-interaction=batchmode') + node=task.inputs[0] + reldir=node.bld_dir(env) + srcfile=node.srcpath(env) + lst=[] + for c in Utils.split_path(reldir): + if c:lst.append('..') + sr=os.path.join(*(lst+[srcfile])) + sr2=os.path.join(*(lst+[node.parent.srcpath(env)])) + aux_node=node.change_ext('.aux') + idx_node=node.change_ext('.idx') + hash='' + old_hash='' + nm=aux_node.name + docuname=nm[:len(nm)-4] + latex_compile_cmd='cd %s && TEXINPUTS=%s:$TEXINPUTS %s %s'%(reldir,sr2,com,sr) + warn('first pass on %s'%command) + ret=Runner.exec_command(latex_compile_cmd) + if ret:return ret + try: + file=open(aux_node.abspath(env),'r') + ct=file.read() + file.close() + except(OSError,IOError): + error('erreur bibtex scan') + else: + fo=g_bibtex_re.findall(ct) + if fo: + bibtex_compile_cmd='cd %s && BIBINPUTS=%s:$BIBINPUTS %s %s'%(reldir,sr2,env['BIBTEX'],docuname) + warn('calling bibtex') + ret=Runner.exec_command(bibtex_compile_cmd) + if ret: + error('error when calling bibtex %s'%bibtex_compile_cmd) + return ret + try: + idx_path=idx_node.abspath(env) + os.stat(idx_path) + except OSError: + error('erreur file.idx scan') + else: + makeindex_compile_cmd='cd %s && %s 
%s'%(reldir,env['MAKEINDEX'],idx_path) + warn('calling makeindex') + ret=Runner.exec_command(makeindex_compile_cmd) + if ret: + error('error when calling makeindex %s'%makeindex_compile_cmd) + return ret + i=0 + while i<10: + i+=1 + old_hash=hash + try: + hash=Utils.h_file(aux_node.abspath(env)) + except KeyError: + error('could not read aux.h -> %s'%aux_node.abspath(env)) + pass + if hash and hash==old_hash:break + warn('calling %s'%command) + ret=Runner.exec_command(latex_compile_cmd) + if ret: + error('error when calling %s %s'%(command,latex_compile_cmd)) + return ret + return 0 +latex_vardeps=['LATEX','LATEXFLAGS'] +def latex_build(task): + return tex_build(task,'LATEX') +pdflatex_vardeps=['PDFLATEX','PDFLATEXFLAGS'] +def pdflatex_build(task): + return tex_build(task,'PDFLATEX') +g_texobjs=['latex','pdflatex'] +class tex_taskgen(TaskGen.task_gen): + def __init__(self,*k,**kw): + TaskGen.task_gen.__init__(self,*k) + global g_texobjs + self.type=kw['type'] + if not self.type in g_texobjs: + raise Utils.WafError('type %s not supported for texobj'%type) + self.outs='' + self.prompt=1 + self.deps='' + def apply(self): + tree=Build.bld + outs=self.outs.split() + self.env['PROMPT_LATEX']=self.prompt + deps_lst=[] + if self.deps: + deps=self.to_list(self.deps) + for filename in deps: + n=self.path.find_resource(filename) + if not n in deps_lst:deps_lst.append(n) + for filename in self.source.split(): + base,ext=os.path.splitext(filename) + node=self.path.find_resource(filename) + if not node:raise Utils.WafError('cannot find %s'%filename) + if self.type=='latex': + task=self.create_task('latex') + task.set_inputs(node) + task.set_outputs(node.change_ext('.dvi')) + elif self.type=='pdflatex': + task=self.create_task('pdflatex') + task.set_inputs(node) + task.set_outputs(node.change_ext('.pdf')) + else: + raise Utils.WafError('no type or invalid type given in tex object (should be latex or pdflatex)') + task.env=self.env + task.curdirnode=self.path + if deps_lst: + variant=node.variant(self.env) + try: + lst=tree.node_deps[task.unique_id()] + for n in deps_lst: + if not n in lst: + lst.append(n) + except KeyError: + tree.node_deps[task.unique_id()]=deps_lst + if self.type=='latex': + if'ps'in outs: + pstask=self.create_task('dvips') + pstask.set_inputs(task.outputs) + pstask.set_outputs(node.change_ext('.ps')) + if'pdf'in outs: + pdftask=self.create_task('dvipdf') + pdftask.set_inputs(task.outputs) + pdftask.set_outputs(node.change_ext('.pdf')) + elif self.type=='pdflatex': + if'ps'in outs: + pstask=self.create_task('pdf2ps') + pstask.set_inputs(task.outputs) + pstask.set_outputs(node.change_ext('.ps')) +def detect(conf): + v=conf.env + for p in'tex latex pdflatex bibtex dvips dvipdf ps2pdf makeindex pdf2ps'.split(): + conf.find_program(p,var=p.upper()) + v[p.upper()+'FLAGS']='' + v['DVIPSFLAGS']='-Ppdf' +b=Task.simple_task_type +b('tex','${TEX} ${TEXFLAGS} ${SRC}',color='BLUE') +b('bibtex','${BIBTEX} ${BIBTEXFLAGS} ${SRC}',color='BLUE') +b('dvips','${DVIPS} ${DVIPSFLAGS} ${SRC} -o ${TGT}',color='BLUE',after="latex pdflatex tex bibtex") +b('dvipdf','${DVIPDF} ${DVIPDFFLAGS} ${SRC} ${TGT}',color='BLUE',after="latex pdflatex tex bibtex") +b('pdf2ps','${PDF2PS} ${PDF2PSFLAGS} ${SRC} ${TGT}',color='BLUE',after="dvipdf pdflatex") +b=Task.task_type_from_func +cls=b('latex',latex_build,vars=latex_vardeps) +cls.scan=scan +cls=b('pdflatex',pdflatex_build,vars=pdflatex_vardeps) +cls.scan=scan + --- /dev/null +++ radare-1.5.2/wafadmin/Tools/suncc.py @@ -0,0 +1,71 @@ +#! 
/usr/bin/env python +# encoding: utf-8 + +import os,optparse +import Utils,Options,Configure +import ccroot,ar +from Configure import conftest +def find_scc(conf): + v=conf.env + cc=None + if v['CC']:cc=v['CC'] + elif'CC'in os.environ:cc=os.environ['CC'] + if not cc:cc=conf.find_program('cc',var='CC') + if not cc:conf.fatal('suncc was not found') + try: + if not Utils.cmd_output('%s -flags'%cc): + conf.fatal('suncc %r was not found'%cc) + except ValueError: + conf.fatal('suncc -flags could not be executed') + v['CC']=cc + v['CC_NAME']='sun' +def scc_common_flags(conf): + v=conf.env + v['CC_SRC_F']='' + v['CC_TGT_F']='-c -o ' + v['CPPPATH_ST']='-I%s' + if not v['LINK_CC']:v['LINK_CC']=v['CC'] + v['CCLNK_SRC_F']='' + v['CCLNK_TGT_F']='-o ' + v['LIB_ST']='-l%s' + v['LIBPATH_ST']='-L%s' + v['STATICLIB_ST']='-l%s' + v['STATICLIBPATH_ST']='-L%s' + v['CCDEFINES_ST']='-D%s' + v['SHLIB_MARKER']='-Bdynamic' + v['STATICLIB_MARKER']='-Bstatic' + v['program_PATTERN']='%s' + v['shlib_CCFLAGS']=['-Kpic','-DPIC'] + v['shlib_LINKFLAGS']=['-G'] + v['shlib_PATTERN']='lib%s.so' + v['staticlib_LINKFLAGS']=['-Bstatic'] + v['staticlib_PATTERN']='lib%s.a' +def scc_modifier_debug(conf): + v=conf.env + v['CCFLAGS']=['-O'] + if conf.check_flags('-O2'): + v['CCFLAGS_OPTIMIZED']=['-O2'] + v['CCFLAGS_RELEASE']=['-O2'] + if conf.check_flags('-g -DDEBUG'): + v['CCFLAGS_DEBUG']=['-g','-DDEBUG'] + if conf.check_flags('-g3 -O0 -DDEBUG'): + v['CCFLAGS_ULTRADEBUG']=['-g3','-O0','-DDEBUG'] + try: + debug_level=Options.options.debug_level.upper() + except AttributeError: + debug_level=ccroot.DEBUG_LEVELS.CUSTOM + v.append_value('CCFLAGS',v['CCFLAGS_'+debug_level]) +detect=''' +find_scc +find_cpp +find_ar +scc_common_flags +cc_load_tools +cc_check_features +gcc_modifier_debug +cc_add_flags +''' + +conftest(find_scc) +conftest(scc_common_flags) +conftest(scc_modifier_debug) --- /dev/null +++ radare-1.5.2/wafadmin/Tools/winres.py @@ -0,0 +1,38 @@ +#! /usr/bin/env python +# encoding: utf-8 + +import os,sys +import TaskGen,Task +from Utils import quote_whitespace +from TaskGen import extension +EXT_WINRC=['.rc'] +winrc_str='${WINRC} ${_CPPDEFFLAGS} ${_CXXDEFFLAGS} ${_CCDEFFLAGS} ${WINRCFLAGS} ${_CPPINCFLAGS} ${_CXXINCFLAGS} ${_CCINCFLAGS} ${WINRC_TGT_F}${TGT} ${WINRC_SRC_F}${SRC}' +def rc_file(self,node): + obj_ext='.rc.o' + if self.env['WINRC_TGT_F']=='/fo ':obj_ext='.res' + rctask=self.create_task('winrc') + rctask.set_inputs(node) + rctask.set_outputs(node.change_ext(obj_ext)) + self.compiled_tasks.append(rctask) +Task.simple_task_type('winrc',winrc_str,color='BLUE',before='cc cxx') +def detect(conf): + v=conf.env + cc=os.path.basename(''.join(v['CC']).lower()) + cxx=os.path.basename(''.join(v['CXX']).lower()) + if cc in['gcc','cc','g++','c++']: + winrc=conf.find_program('windres',var='WINRC') + v['WINRC_TGT_F']='-o ' + v['WINRC_SRC_F']='-i ' + elif cc=='cl.exe'or cxx=='cl.exe': + winrc=conf.find_program('RC',var='WINRC') + v['WINRC_TGT_F']='/fo ' + v['WINRC_SRC_F']=' ' + else: + return 0 + if not winrc: + conf.fatal('winrc was not found!!') + else: + v['WINRC']=quote_whitespace(winrc) + v['WINRCFLAGS']='' + +extension(EXT_WINRC)(rc_file) debian/patches/python-default-only.patch0000644000000000000000000000345111734462123015566 0ustar Description: Only install plugins and libraries for the current default Python version Debian-Specific. 
Author: Stefano Rivera Forwarded: not-needed --- a/src/plug/hack/Makefile +++ b/src/plug/hack/Makefile @@ -90,9 +90,9 @@ -[ -e scriptedit.${SO} ] && ${INSTALL_LIB} scriptedit.${SO} ${DESTDIR}/${LIBDIR}/radare -[ -e perl.${SO} ] && ${INSTALL_LIB} perl.${SO} ${DESTDIR}/${LIBDIR}/radare -[ -e ruby.${SO} ] && ${INSTALL_LIB} ruby.${SO} ${DESTDIR}/${LIBDIR}/radare - -[ -e python25.${SO} ] && ${INSTALL_LIB} python25.${SO} ${DESTDIR}/${LIBDIR}/radare - -[ -e python26.${SO} ] && ${INSTALL_LIB} python26.${SO} ${DESTDIR}/${LIBDIR}/radare - -[ -e python27.${SO} ] && ${INSTALL_LIB} python27.${SO} ${DESTDIR}/${LIBDIR}/radare + -(pyversions -vd | grep -Fqx 2.5) && ${INSTALL_LIB} python25.${SO} ${DESTDIR}/${LIBDIR}/radare + -(pyversions -vd | grep -Fqx 2.6) && ${INSTALL_LIB} python26.${SO} ${DESTDIR}/${LIBDIR}/radare + -(pyversions -vd | grep -Fqx 2.7) && ${INSTALL_LIB} python27.${SO} ${DESTDIR}/${LIBDIR}/radare clean: -rm -f *.${SO} *.o --- a/api/Makefile +++ b/api/Makefile @@ -8,12 +8,8 @@ -mkdir -p ${DESTDIR}${LIBDIR}/ruby/1.8 -cp -rf ruby/radare ${DESTDIR}${LIBDIR}/ruby/1.8 # python - -mkdir -p ${DESTDIR}${LIBDIR}/python2.5/site-packages - -mkdir -p ${DESTDIR}${LIBDIR}/python2.6/site-packages - -mkdir -p ${DESTDIR}${LIBDIR}/python2.7/site-packages - -cp -rf python/radare ${DESTDIR}${LIBDIR}/python2.5/site-packages - -cp -rf python/radare ${DESTDIR}${LIBDIR}/python2.6/site-packages - -cp -rf python/radare ${DESTDIR}${LIBDIR}/python2.7/site-packages + mkdir -p ${DESTDIR}${LIBDIR}/$(shell pyversions -d)/site-packages + cp -rf python/radare ${DESTDIR}${LIBDIR}/$(shell pyversions -d)/site-packages # XXX lua -cp -rf lua/radare/api.lua ${DESTDIR}${LIBDIR}/radare/radare.lua debian/patches/fix-make-clean.dpatch0000644000000000000000000000117311734462123014570 0ustar #! /bin/sh /usr/share/dpatch/dpatch-run ## fix-make-clean.dpatch by ## ## All lines beginning with `## DP:' are a description of the patch. 
## DP: make clean should remove all generated binaries @DPATCH@ diff -urNad radare-1.5.2~/src/rsc/Makefile radare-1.5.2/src/rsc/Makefile --- radare-1.5.2~/src/rsc/Makefile 2009-12-08 19:53:23.000000000 +0100 +++ radare-1.5.2/src/rsc/Makefile 2010-07-06 13:01:03.000000000 +0200 @@ -31,6 +31,8 @@ clean: -rm -f main.o + -rm -f pool/Display pool/EntryDialog pool/SeekTo pool/SetRegister + cd gtk && ${MAKE} clean list: @for a in pool/${FILES}; do echo $$a; done debian/patches/series0000644000000000000000000000034411734462123012036 0ustar compile-bindings.dpatch fix-build-without-debugger.dpatch fix-make-clean.dpatch fix-linking.dpatch fix-buffer-overflow.dpatch python-versions.dpatch python-2.7.patch python-default-only.patch extract-waf fix-glib-includes.patch debian/patches/fix-build-without-debugger.dpatch0000644000000000000000000000253011734462123017153 0ustar # HG changeset patch # User pancake # Date 1278330400 -7200 # Node ID f0e7883a495bd6ddbd7abd0f2d5eb64b9db3e131 # Parent 8705ecf0858151323661ee85057704edecd816ac * Fix build --without-debugger --- a/src/Makefile.acr Fri Jul 02 10:36:28 2010 +0200 +++ b/src/Makefile.acr Mon Jul 05 13:46:40 2010 +0200 @@ -22,6 +22,9 @@ OBJ+=plug/io/debug.o OBJ2+=${psOBJ} TARGETS+=libdbg +#GDBWRAP +CFLAGS+=-Iplug/io/libgdbwrap/include +crOBJ+=plug/io/gdbwrap.o plug/io/libgdbwrap/interface.o plug/io/libgdbwrap/gdbwrapper.o ##DEBUGGER## ##WII## --- a/src/objects.mk Fri Jul 02 10:36:28 2010 +0200 +++ b/src/objects.mk Mon Jul 05 13:46:40 2010 +0200 @@ -19,9 +19,6 @@ crOBJ+=plug/io/gdb.o plug/io/posix.o plug/io/gdbx.o plug/io/socket.o pas.o ranges.o crOBJ+=plug/io/shm.o plug/io/mmap.o plug/io/malloc.o plug/io/bfdbg.o plug/io/serial.o #plug/io/winegdb.o -#GDBWRAP -CFLAGS+=-Iplug/io/libgdbwrap/include -crOBJ+=plug/io/gdbwrap.o plug/io/libgdbwrap/interface.o plug/io/libgdbwrap/gdbwrapper.o # rahash --- a/src/plugin.c Fri Jul 02 10:36:28 2010 +0200 +++ b/src/plugin.c Mon Jul 05 13:46:40 2010 +0200 @@ -290,7 +290,9 @@ plugins[last++] = socket_plugin; plugins[last++] = gxemul_plugin; plugins[last++] = bfdbg_plugin; +#if DEBUGGER plugins[last++] = gdbwrap_plugin; +#endif plugins[last++] = gdb_plugin; plugins[last++] = gdbx_plugin; debian/patches/python-2.7.patch0000644000000000000000000001266011734462123013473 0ustar Description: Add support for python2.7 Modified version of 84107891680ff532189b9ddf9a9054633a69674c Author: @schrotthaufen Author: Stefano Rivera Origin: http://radare.org/cgi-bin/hg/radare/rev/84107891680f Bug-Debian: http://bugs.debian.org/629122 --- a/api/Makefile +++ b/api/Makefile @@ -10,8 +10,10 @@ # python -mkdir -p ${DESTDIR}${LIBDIR}/python2.5/site-packages -mkdir -p ${DESTDIR}${LIBDIR}/python2.6/site-packages + -mkdir -p ${DESTDIR}${LIBDIR}/python2.7/site-packages -cp -rf python/radare ${DESTDIR}${LIBDIR}/python2.5/site-packages -cp -rf python/radare ${DESTDIR}${LIBDIR}/python2.6/site-packages + -cp -rf python/radare ${DESTDIR}${LIBDIR}/python2.7/site-packages # XXX lua -cp -rf lua/radare/api.lua ${DESTDIR}${LIBDIR}/radare/radare.lua --- a/configure.acr +++ b/configure.acr @@ -104,6 +104,7 @@ CHKLIB python2.5 CHKLIB python2.6 +CHKLIB python2.7 ARG_WITH MAEMO maemo build hildon interface ; --- a/src/plug/hack/Makefile +++ b/src/plug/hack/Makefile @@ -8,6 +8,8 @@ PY_LIBS=`python2.5-config --libs` PY26_CFLAGS=`python2.6-config --cflags` PY26_LIBS=`python2.6-config --libs` +PY27_CFLAGS=`python2.7-config --cflags` +PY27_LIBS=`python2.7-config --libs` LUA_CFLAGS=`pkg-config --cflags lua5.1` LUA_LIBS=`pkg-config --libs lua5.1` 
RUBY_CFLAGS=-I/usr/lib/ruby/1.8/i386-linux @@ -17,7 +19,7 @@ CFLAGS+=-I../.. -g -all: hello.${SO} lua.${SO} gtk-hello.${SO} scriptedit.${SO} python25.${SO} ruby.${SO} gtk-prefs.${SO} gtk-topbar.${SO} perl.${SO} python26.${SO} +all: hello.${SO} lua.${SO} gtk-hello.${SO} scriptedit.${SO} python25.${SO} ruby.${SO} gtk-prefs.${SO} gtk-topbar.${SO} perl.${SO} python26.${SO} python27.${SO} hello.${SO}: ${CC} ${CFLAGS} ${SHARED_CFLAGS} hello.c -o hello.${SO} @@ -55,6 +57,11 @@ -${CC} python.c ${SHARED_CFLAGS} ${LDFLAGS} ${CFLAGS} ${PY26_CFLAGS} ${PY26_LIBS} -o python26.${SO} endif +python27.${SO}: +ifeq ($(HAVE_LIB_PYTHON2_7),1) + -${CC} python.c ${SHARED_CFLAGS} ${LDFLAGS} ${CFLAGS} ${PY27_CFLAGS} ${PY27_LIBS} -o python27.${SO} +endif + perl.${SO}: -${CC} perl.c ${SHARED_CFLAGS} ${CFLAGS} ${PERL_CFLAGS} ${PERL_LIBS} -o perl.${SO} @@ -65,7 +72,7 @@ # Try with -llua and -llua5.1 (stupid ubuntu) lua.${SO}: ifeq ($(HAVE_LANG_LUA),1) -ifneq (,$(filter 1,$(HAVE_LIB_PYTHON2_5) $(HAVE_LIB_PYTHON2_6))) +ifneq (,$(filter 1,$(HAVE_LIB_PYTHON2_5) $(HAVE_LIB_PYTHON2_6) $(HAVE_LIB_PYTHON2_7))) ifneq ($(LUA_LIBS),) -${CC} lua.c ${SHARED_CFLAGS} ${CFLAGS} ${LUA_CFLAGS} ${LUA_LIBS} -o lua.so endif @@ -85,6 +92,7 @@ -[ -e ruby.${SO} ] && ${INSTALL_LIB} ruby.${SO} ${DESTDIR}/${LIBDIR}/radare -[ -e python25.${SO} ] && ${INSTALL_LIB} python25.${SO} ${DESTDIR}/${LIBDIR}/radare -[ -e python26.${SO} ] && ${INSTALL_LIB} python26.${SO} ${DESTDIR}/${LIBDIR}/radare + -[ -e python27.${SO} ] && ${INSTALL_LIB} python27.${SO} ${DESTDIR}/${LIBDIR}/radare clean: -rm -f *.${SO} *.o --- a/configure +++ b/configure @@ -282,7 +282,7 @@ shift done -ENVWORDS="MANDIR INFODIR LIBDIR INCLUDEDIR LOCALSTATEDIR SYSCONFDIR DATADIR LIBEXECDIR SBINDIR BINDIR EPREFIX PREFIX SPREFIX TARGET HOST BUILD INSTALL INSTALL_LIB INSTALL_MAN INSTALL_PROGRAM INSTALL_DIR INSTALL_SCRIPT INSTALL_DATA HOST_OS HOST_CPU BUILD_OS BUILD_CPU TARGET_OS TARGET_CPU PKGNAME VPATH VERSION CONTACT CONTACT_NAME CONTACT_MAIL CC CFLAGS LDFLAGS HAVE_LANG_C CXX CXXFLAGS HAVE_LANG_CXX LIL_ENDIAN BIG_ENDIAN BYTEORDER HAVE_LIB_DL DL_LIBS SOLARIS DARWIN SHARED_EXT SHARED_CFLAGS WINDOWS READLINE HAVE_RL_COMPLETION_MATCHES RADARE_LIBS HAVE_LIB_READLINE HAVE_USB_H USBSNF NONFREE DEBUGGER JAVA WII W32 SYSPROXY RUBY RUBY_VERSION PYTHON HAVE_LANG_PYTHON HAVE_INSTALL VALA HAVE_VALAC VALAC HAVE_VALA_1_0_VERSION_0_5_0 _CFLAGS SIZEOF_OFF_T HAVE_LIB_EWF WANT_EWF HAVE_LIB_PYTHON2_5 HAVE_LIB_PYTHON2_6 MAEMO HAVE_GUI GTK_FLAGS GTK_LIBS HAVE_PKGCFG_GTK_2_0 VTE_FLAGS VTE_LIBS HAVE_PKGCFG_VTE HAVE_VALA HAVE_LUA_H LUA HAVE_LANG_LUA LUA_LIBS HAVE_LIB_LUA HAVE_LIB_LUA5_1" +ENVWORDS="MANDIR INFODIR LIBDIR INCLUDEDIR LOCALSTATEDIR SYSCONFDIR DATADIR LIBEXECDIR SBINDIR BINDIR EPREFIX PREFIX SPREFIX TARGET HOST BUILD INSTALL INSTALL_LIB INSTALL_MAN INSTALL_PROGRAM INSTALL_DIR INSTALL_SCRIPT INSTALL_DATA HOST_OS HOST_CPU BUILD_OS BUILD_CPU TARGET_OS TARGET_CPU PKGNAME VPATH VERSION CONTACT CONTACT_NAME CONTACT_MAIL CC CFLAGS LDFLAGS HAVE_LANG_C CXX CXXFLAGS HAVE_LANG_CXX LIL_ENDIAN BIG_ENDIAN BYTEORDER HAVE_LIB_DL DL_LIBS SOLARIS DARWIN SHARED_EXT SHARED_CFLAGS WINDOWS READLINE HAVE_RL_COMPLETION_MATCHES RADARE_LIBS HAVE_LIB_READLINE HAVE_USB_H USBSNF NONFREE DEBUGGER JAVA WII W32 SYSPROXY RUBY RUBY_VERSION PYTHON HAVE_LANG_PYTHON HAVE_INSTALL VALA HAVE_VALAC VALAC HAVE_VALA_1_0_VERSION_0_5_0 _CFLAGS SIZEOF_OFF_T HAVE_LIB_EWF WANT_EWF HAVE_LIB_PYTHON2_5 HAVE_LIB_PYTHON2_6 HAVE_LIB_PYTHON2_7 MAEMO HAVE_GUI GTK_FLAGS GTK_LIBS HAVE_PKGCFG_GTK_2_0 VTE_FLAGS VTE_LIBS HAVE_PKGCFG_VTE HAVE_VALA HAVE_LUA_H LUA 
HAVE_LANG_LUA LUA_LIBS HAVE_LIB_LUA HAVE_LIB_LUA5_1" create_environ @@ -515,6 +515,7 @@ HAVE_LIB_EWF="0"; fi check_library HAVE_LIB_PYTHON2_5 python2.5 0 check_library HAVE_LIB_PYTHON2_6 python2.6 0 +check_library HAVE_LIB_PYTHON2_7 python2.7 0 printf 'checking pkg-config flags for gtk+-2.0... ' tmp=`pkg-config --cflags gtk+-2.0 2>/dev/null` if [ $? = 1 ]; then echo no ; HAVE_PKGCFG_GTK_2_0=0; else --- a/config.mk.acr +++ b/config.mk.acr @@ -11,6 +11,7 @@ HAVE_LUA_H=@HAVE_LUA_H@ HAVE_LIB_PYTHON2_5=@HAVE_LIB_PYTHON2_5@ HAVE_LIB_PYTHON2_6=@HAVE_LIB_PYTHON2_6@ +HAVE_LIB_PYTHON2_7=@HAVE_LIB_PYTHON2_7@ W32=@W32@ DARWIN=@DARWIN@ debian/radare-gtk.docs0000644000000000000000000000003411734462123012061 0ustar AUTHORS README doc/fortunes debian/control0000644000000000000000000001012511734462123010573 0ustar Source: radare Section: devel Priority: optional Vcs-Git: git://git.debian.org/collab-maint/radare.git Vcs-Browser: http://git.debian.org/?p=collab-maint/radare.git Maintainer: Sebastian Reichel Build-Depends: autotools-dev, debhelper (>= 7), halibut, libgtk2.0-dev, liblua5.1-0-dev, libusb-dev, libperl-dev, libreadline-dev, libvala-0.14-dev, libvte-dev, lua5.1, python-dev (>= 2.6.6-3~), valac-0.14 Standards-Version: 3.9.3 Homepage: http://radare.org Package: radare Architecture: any Depends: ${shlibs:Depends}, ${misc:Depends}, radare-common Conflicts: radare-gtk Description: free advanced command line hexadecimal editor The project aims to create a complete, portable, multi-architecture, unix-like toolchain for reverse engineering. . It is composed by an hexadecimal editor (radare) with a wrapped IO layer supporting multiple backends for local/remote files, debugger (osx,bsd,linux,w32), stream analyzer, assembler/disassembler (rasm) for x86,arm,ppc,m68k,java,msil,sparc code analysis modules and scripting facilities. A bindiffer named radiff, base converter (rax), shellcode development helper (rasc), a binary information extractor supporting (pe, mach0, elf, class, ...) named rabin, and a block-based hash utility called rahash. . This package contains the non gtk version of radare. You won't be able to use the ag command for drawing graphs and gradare is missing. Package: radare-gtk Architecture: any Depends: ${shlibs:Depends}, ${misc:Depends}, radare-common Conflicts: radare Description: free advanced command line hexadecimal editor with graph functionality The project aims to create a complete, portable, multi-architecture, unix-like toolchain for reverse engineering. . It is composed by an hexadecimal editor (radare) with a wrapped IO layer supporting multiple backends for local/remote files, debugger (osx,bsd,linux,w32), stream analyzer, assembler/disassembler (rasm) for x86,arm,ppc,m68k,java,msil,sparc code analysis modules and scripting facilities. A bindiffer named radiff, base converter (rax), shellcode development helper (rasc), a binary information extractor supporting (pe, mach0, elf, class, ...) named rabin, and a block-based hash utility called rahash. . This package contains the gtk enabled edition of radare. Package: radare-common Architecture: any Depends: ${shlibs:Depends}, ${misc:Depends}, ${python:Depends} Description: utilities and scripts used by radare The project aims to create a complete, portable, multi-architecture, unix-like toolchain for reverse engineering. . 
It is composed by an hexadecimal editor (radare) with a wrapped IO layer supporting multiple backends for local/remote files, debugger (osx,bsd,linux,w32), stream analyzer, assembler/disassembler (rasm) for x86,arm,ppc,m68k,java,msil,sparc code analysis modules and scripting facilities. A bindiffer named radiff, base converter (rax), shellcode development helper (rasc), a binary information extractor supporting (pe, mach0, elf, class, ...) named rabin, and a block-based hash utility called rahash. . This package contains utilities and scripts used by the terminal and gtk edition of radare. Package: radare-doc Section: doc Architecture: all Depends: ${misc:Depends} Description: documentary for radare The project aims to create a complete, portable, multi-architecture, unix-like toolchain for reverse engineering. . It is composed by an hexadecimal editor (radare) with a wrapped IO layer supporting multiple backends for local/remote files, debugger (osx,bsd,linux,w32), stream analyzer, assembler/disassembler (rasm) for x86,arm,ppc,m68k,java,msil,sparc code analysis modules and scripting facilities. A bindiffer named radiff, base converter (rax), shellcode development helper (rasc), a binary information extractor supporting (pe, mach0, elf, class, ...) named rabin, and a block-based hash utility called rahash. . This package contains the radare book and some additional docs. debian/radare-doc.docs0000644000000000000000000000021011734462123012035 0ustar doc/html doc/xtra doc/csr doc/debug doc/disassembly doc/elf-tutorial doc/flags doc/gdb.scripts doc/map-struct doc/shell debian/cddl.txt debian/source.lintian-overrides0000644000000000000000000000011711734462123014050 0ustar # see debian/patches/extract-waf radare source: source-contains-waf-binary waf debian/changelog0000644000000000000000000001541311734462123011047 0ustar radare (1:1.5.2-6) unstable; urgency=low * fix glib includes for glib 2.32+ (Closes: #665604) -- Sebastian Reichel Wed, 28 Mar 2012 03:05:57 +0200 radare (1:1.5.2-5) unstable; urgency=low * Fix spelling-error-in-description: extracter -> extractor * Update Debian Standards Version to 3.9.3 * Update vala build dependency to 0.14 (Closes: #663298) -- Sebastian Reichel Sat, 17 Mar 2012 04:40:16 +0100 radare (1:1.5.2-4.1) unstable; urgency=low * Non-maintainer upload. * Bug fix: "Doesn't contain source for waf binary code", thanks to Gerfried Fuchs (Closes: #654497). -- Reinhard Tartler Fri, 20 Jan 2012 21:38:41 +0100 radare (1:1.5.2-4) unstable; urgency=low [ Stefano Rivera ] * Add support for Python 2.7 (Closes: #629122) - Apply python-2.7.patch from upstream repository. * python-default-only.patch: Only install extensions for the current python default version. [ Sebastian Reichel ] * Update my mail address * Remove DM-Upload-Allowed flag * Update Debian Standards Version to 3.9.2 * Switch from python-support to dh_python2 * debian/rules: differ between build-arch and build-indep target -- Sebastian Reichel Sun, 03 Jul 2011 21:04:36 +0200 radare (1:1.5.2-3.1) unstable; urgency=medium * Non-maintainer upload. * debian/control: - Build-depend on libvala-0.10-dev, optionally with libvala-dev, as the latter has been dropped from vala package (Closes: #624429). * debian/pyversions: - Support python2.6 only. * debian/rules: - Remove files created for python2.5. 
-- Luca Falavigna Sun, 01 May 2011 20:15:18 +0200 radare (1:1.5.2-3) unstable; urgency=low * Update Debian Standards Version to 3.9.1 * Add patch to always link against libdl, this fixes an FTBFS on kfreebsd and hurd * add Ubuntu patch to fix buffer overflow * add Ubuntu patch to fix building if only python2.6 is available -- Sebastian Reichel Sun, 22 Aug 2010 16:55:09 +0200 radare (1:1.5.2-2) unstable; urgency=low * cherry pick upstream patch to fix disabled debugger. This should fix build on arches not supported by radare debugger. * Add patch improving Makefile's clean target * Allow DM Upload -- Sebastian Reichel Mon, 05 Jul 2010 13:59:01 +0200 radare (1:1.5.2-1) unstable; urgency=low * New upstream release - drop gtkdialog (Closes: #583888, #560676) - lua plugin should build on all arches (Closes: #544829) - many upstream bug fixes (Closes: #534700) - bashism has been fixed upstream (Closes: #535884) * Build-depend on libreadline-dev (Closes: #553833) * byte-compile Python files (Closes: #566048) * Remove ruby, does not work currently * Add watch file for -free version (DFSG compatible source tarball) * Switch to DebSrc 3.0 -- Sebastian Reichel Tue, 29 Jun 2010 14:01:34 +0200 radare (1:1.4-1) unstable; urgency=low * New hg checkout (stable release 1.4) - Fix mipsel arch translation for debugger objects - Fix FreeBSD and NetBSD build - alias mips->mips64 - Keep cons_buffer contents in radare_cmd_str - Install ranal API - Fix rsc monitor - improved shellcode injection - Fixed syscall related stuff for x86_64 - Added cmd '=' to send cmd to remote radares - Install radapy API - Implement rap:// URI handling in remote IO plugin - Fixed Elf64_Xword/Elf32_word bug in dietelf - Fix register handling for arm-darwin (debugger finally working!) - Added 'armv5tejl' (armel) to the whitelist of supported debuggers * Use python-config for detecting python cflags and libs - This should fix Ubuntu build failure -- Sebastian Reichel Wed, 03 Jun 2009 17:38:00 +0200 radare (20090525-1) unstable; urgency=low * fixed cddl.txt encoding * new hg checkout - fixups in agd and analyzing - python2.6 support - 'q' command accepts a numeric argument (exit value) - Fix a radare_cmd_str() problem related to cons_grep - Initial import of the 'ranal' api - pX is now a new print format (FMT_HEXPAIRS) - new stuff for the analysis python api - remove bashism (Closes: #530178) - change ppc_disasm license (try to get it DSFSG compliant) - Added graph.traces to colorize traced nodes in graphviz and grava - Some more random words for the book - Documentate the /P command in the book - whitelist debugger support for armelv5tel-linux (armel) - fix ignorance of --without-debugger flag - fix mipsel arch translation for debugger objects - fix FreeBSD build - mips->mips64 * ppc disasm support stays disabled - ppc_disasm license is still not DFSG compliant -- Sebastian Reichel Mon, 25 May 2009 01:05:00 +0200 radare (20090522-1) unstable; urgency=low * new hg checkout - added 'cX' command to compare like 'cc' does but using two side hexdiff dump format - specify pointer & data size in pm with % - add graph.weight - fix 'c' command (missing radare_read(0)) - added /P search command that searches for proximity in bytelevel distance - added more 'ag' and 'gu' commands to readline autocompletion. 
- fix build of debugger for non-linux systems - fixed code analysis at startup - more work in graphs (graph.split) - fixups in x86 code analysis -- Sebastian Reichel Fri, 22 May 2009 19:01:00 +0200 radare (20090521-1) unstable; urgency=low * new hg checkout with lots of fixups * builds on architectures without debugger support (Closes: #529997) -- Sebastian Reichel Thu, 21 May 2009 16:26:00 +0200 radare (20090519-1) unstable; urgency=low * switch to hg snapshots * uses changeset 1045 (ed3ba99d8537) * remove ppc disasm source (for DFSG compability) * configure for DFSG compatible version (--without-nonfree) * fix multiple build -- Sebastian Reichel Wed, 20 May 2009 00:00:00 +0200 radare (1.3-4) unstable; urgency=low * added even more license stuff to debian/copyright * cddl is now distributed with radare-doc because of solaris opcodes -- Sebastian Reichel Tue, 28 Apr 2009 00:23:23 +0200 radare (1.3-3) unstable; urgency=low * python.so gets installed now * added more licenses to debian/copyright * some cleanups in debian/rules -- Sebastian Reichel Mon, 27 Apr 2009 18:07:40 +0200 radare (1.3-2) unstable; urgency=low * fixed linitian reports * initial release closes correct bug -- Sebastian Reichel Sat, 25 Apr 2009 23:12:04 +0200 radare (1.3-1) unstable; urgency=low * Initial release (Closes: #523338) -- Sebastian Reichel Wed, 08 Apr 2009 17:22:25 +0200 debian/watch0000644000000000000000000000010011734462123010211 0ustar version=3 http://radare.org/get/radare-([0-9\.]*)-free\.tar\.gz debian/radare.manpages0000644000000000000000000000004711734462123012145 0ustar debian/tmp/usr/share/man/man1/radare.1 debian/radare-gtk.install0000644000000000000000000000016111734462123012600 0ustar usr/share/radare/gradare usr/bin/gradare usr/bin/radare usr/lib/radare/gtk-hello.so usr/lib/radare/scriptedit.so debian/radare-common.install0000644000000000000000000000044711734462123013312 0ustar usr/bin/rabin usr/bin/rasm usr/bin/rax usr/bin/rsc usr/bin/radiff usr/bin/xrefs usr/bin/rasc usr/bin/rahash usr/bin/rfile usr/lib/radare/bin #usr/lib/ruby1.8 usr/lib/python* usr/lib/radare/hello.so usr/lib/radare/radare.lua usr/lib/radare/lua.so usr/lib/radare/python*.so usr/share/radare/magic debian/radare-common.manpages0000644000000000000000000000057311734462123013437 0ustar debian/tmp/usr/share/man/man1/radiff.1 debian/tmp/usr/share/man/man1/rfile.1 debian/tmp/usr/share/man/man1/rabin.1 debian/tmp/usr/share/man/man1/rahash.1 debian/tmp/usr/share/man/man1/rasm.1 debian/tmp/usr/share/man/man1/xrefs.1 debian/tmp/usr/share/man/man1/rsc.1 debian/tmp/usr/share/man/man1/rasc.1 debian/tmp/usr/share/man/man1/rax.1 debian/tmp/usr/share/man/man5/radarerc.5 debian/radare.docs0000644000000000000000000000003411734462123011276 0ustar AUTHORS README doc/fortunes debian/radare-gtk.menu0000644000000000000000000000015511734462123012101 0ustar ?package(radare-gtk):needs="X11" section="Applications/Editors"\ title="Radare" command="/usr/bin/gradare" debian/copyright0000644000000000000000000003112311734462123011124 0ustar This package was debianized by Sebastian Reichel on Wed, 08 Apr 2009 17:22:25 +0200. It was downloaded from http://radare.org Upstream Author: pancake Copyright: Copyright © 2006-2010 pancake Copyright © 2008 Alfredo Pesoli Copyright © 1995-2007 Free Software Foundation, Inc. Copyright © 1999 squeak Copyright © 1998-2002 Frank Wille Copyright © 2001 Oleh Yuschuk Copyright © 2003-2006 Todd Allen Copyright © 2002-2006 Vivek Mohan Copyright © 1999 Kevin F. Quinn Copyright © 1994 Christian E. 
Hopps Copyright © 2007-2008 The Android Open Source Project Copyright © 2008 Daniel Pistelli Copyright © 2007 Patrick Walton Copyright © 1992-2007 DOSEMU-Development-Team Copyright © 2008 Daniel Fernandez Copyright © 2003-2005 Colin Percival Copyright © 2006 Lluis Vilanova Copyright © 2005 c0ntex Copyright © 2005 BaCkSpAcE Copyright © 1997-1998 Andrew Tridgell Copyright © 2000-2001 Aaron D. Gifford Copyright © 1991-1992 RSA Data Security, Inc. Copyright © 1995-1999 Cryptography Research, Inc. License: Files: * Copyright: see above License: GPL-2+ These files are free software; you can redistribute it and/or modify it under the terms of the GNU General Public Licence under version 2 or (at your option) any later version. Files: src/arch/sparc/sparc-*.c: Copyright: Copyright © 1989-2007 Free Software Foundation, Inc. License: GPL-3+ These files is free software; you can redistribute it and/or modify it under the terms of the GNU General Public Licence under version 3 or (at your option) any later version. Files: src/arch/mips/mips*.c: Copyright: Copyright © 1989-2007 Free Software Foundation, Inc. License: GPL-3+ These files is free software; you can redistribute it and/or modify it under the terms of the GNU General Public Licence under version 3 or (at your option) any later version. Files: src/arch/m68k/m68k_disasm.*: Copyright: Copyright © 1999-2002 Frank Wille Copyright © 1994 Christian E. Hopps License: BSD (4 clause) Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. All advertising materials mentioning features or use of this software must display the following acknowledgement: This product includes software developed by Christian E. Hopps. 4. The name of the author may not be used to endorse or promote products derived from this software without specific prior written permission THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. Files: src/arch/dalvik/dexdump/* Copyright: Copyright © 2008 The Android Open Source Project License: Apache-2.0 Licensed under the Apache License, Version 2.0. Files: src/arch/x86/cpuid.c src/arch/x86/udis86/* Copyright: Copyright © 2003-2006 Todd Allen Copyright © 2002-2007 Vivek Mohan License: BSD like Permission to use, copy, modify, distribute, and sell this software and its documentation for any purpose is hereby granted without fee, provided that the above copyright notice appear in all copies and that both the copyright notice and this permission notice appear in supporting documentation. 
No representations are made about the suitability of this software for any purpose. It is provided ``as is'' without express or implied warranty, including but not limited to the warranties of merchantability, fitness for a particular purpose, and noninfringement. In no event shall Todd Allen be liable for any claim, damages, or other liability, whether in action of contract, tort, or otherwise, arising from, out of, or in connection with this software. Files: src/include/opintl.h src/include/elf-bfd.h src/include/sysdep.h Copyright: Copyright © 1992-2007 Free Software Foundation, Inc. License: GPL-3+ These files is free software; you can redistribute it and/or modify it under the terms of the GNU General Public Licence under version 3 or (at your option) any later version. Files: src/plug/io/wii/grecko.c src/plug/io/wii/usbdev.c src/plug/io/wii/test.c Copyright: Copyright © 2007-2008 pancake License: GPL-3+ These files is free software; you can redistribute it and/or modify it under the terms of the GNU General Public Licence under version 3 or (at your option) any later version. Files: vala/chart.* vala/default_layout.* vala/edge.* vala/graph.* vala/layout.* vala/node.* vala/renderer.* vala/widget.* Copyright: Copyright © 2007-2008 pancake License: GPL-3+ These files is free software; you can redistribute it and/or modify it under the terms of the GNU General Public Licence under version 3 or (at your option) any later version. Files: src/rabin/dietmach* Copyright: Copyright © 2008 Alfredo Pesoli License: BSD (3 clause) Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of the author nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHORS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. Files: src/radiff/bdiff/bdiff.* Copyright: Copyright © 2008 Daniel Fernandez License: LGPL-2 These files is free software; you can redistribute it and/or modify it under the terms of the GNU Library General Public Licence under version 2. Files: src/radiff/bsdiff-4.3/bspatch.c src/radiff/bsdiff-4.3/bsdiff.c Copyright: Copyright © 2003-2005 Colin Percival License: BSD (2 clause) Redistribution and use in source and binary forms, with or without modification, are permitted providing that the following conditions are met: 1. 
Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Files: src/rahash/sha2.* Copyright: Copyright © 2000-2001 Aaron D. Gifford License: BSD (2 clause) Redistribution and use in source and binary forms, with or without modification, are permitted providing that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
Files: src/rahash/md5* Copyright: Copyright © 1991-1992 RSA Data Security, Inc. License: custom License to copy and use this software is granted provided that it is identified as the "RSA Data Security, Inc. MD5 Message-Digest Algorithm" in all material mentioning or referencing this software or this function. License is also granted to make and use derivative works provided that such works are identified as "derived from the RSA Data Security, Inc. MD5 Message-Digest Algorithm" in all material mentioning or referencing the derived work. RSA Data Security, Inc. makes no representations concerning either the merchantability of this software or the suitability of this software for any particular purpose. It is provided "as is" without express or implied warranty of any kind. These notices must be retained in any copies of any part of this documentation and/or software.
Files: doc/xtra/solaris-sys-syscall.h Copyright: Copyright © 1984-1989 AT&T Copyright © 2008 Sun Microsystems, Inc.
License: CDDL please read CDDL.txt for more information On Debian systems, the complete text of these licenses can be found in: `/usr/share/common-licenses/GPL-2` - GNU General Public License version 2 `/usr/share/common-licenses/GPL-3` - GNU General Public License version 3 `/usr/share/common-licenses/Apache-2.0` - Apache 2.0 License The Debian packaging is © 2009, Sebastian Reichel and is licensed under the GPL, see above. debian/compat0000644000000000000000000000000211734462123010367 0ustar 7 debian/source/0000755000000000000000000000000011734462123010471 5ustar debian/source/format0000644000000000000000000000001411734462123011677 0ustar 3.0 (quilt) debian/source/include-binaries0000644000000000000000000000000411734462123013623 0ustar waf debian/radare.install0000644000000000000000000000002011734462123012007 0ustar radare /usr/bin
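A closing note on the wafadmin gnu_dirs tool added above by the extract-waf patch: it expands GNU-style directory placeholders such as $(EXEC_PREFIX)/bin against the configuration environment, repeating the substitution until no $(VAR) references remain (the loop condition in this dump appears garbled). The sketch below illustrates only that expansion idea; it is standalone rather than the tool's actual waf implementation, and the function name, the max_passes bound, and the sample values are hypothetical.

import re

# Matches GNU-style placeholders like $(PREFIX) or $(DATAROOTDIR).
_RE_VAR = re.compile(r'\$\(([a-zA-Z0-9_]+)\)')

def expand_dirs(dirs, max_passes=10):
    # Repeat substitution until a fixed point: nested references such as
    # BINDIR -> $(EXEC_PREFIX)/bin -> $(PREFIX)/bin need more than one pass.
    for _ in range(max_passes):
        changed = False
        for name, value in dirs.items():
            new = _RE_VAR.sub(lambda m: dirs.get(m.group(1), ''), value)
            if new != value:
                dirs[name] = new
                changed = True
        if not changed:
            break
    return dirs

if __name__ == '__main__':
    sample = {
        'PREFIX': '/usr/local',
        'EXEC_PREFIX': '$(PREFIX)',
        'BINDIR': '$(EXEC_PREFIX)/bin',
        'DATAROOTDIR': '$(PREFIX)/share',
        'MANDIR': '$(DATAROOTDIR)/man',
    }
    print(expand_dirs(sample))

Running the sketch resolves BINDIR to /usr/local/bin and MANDIR to /usr/local/share/man, mirroring how the table in the real tool derives bindir, mandir and the other GNU directories from PREFIX and EXEC_PREFIX.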