Merge branch 'waf-2.0' into master

Thomas Nagy 2017-09-15 21:25:45 +02:00
commit 934b02f967
No known key found for this signature in database
GPG Key ID: 49B4C67C05277AAA
130 changed files with 1913 additions and 1849 deletions

209
ChangeLog
View File

@ -1,192 +1,21 @@
NEW IN WAF 1.9.13
-----------------
* Fix a regression introduced by #1974 on Python2 with unicode characters in config.log
* Protobuf example update #2000
* Better detection for old msvc compilers #2002
* Better detection for old gcc compilers #2003
NEW IN WAF 1.9.12
-----------------
* Work around config.log encoding issues on windows/Python3.6/console #1974
* Handle spaces in python path detection on windows #1973
* Set a better default path for windows import libraries #1959
* Fix variable propagation for javac targets #1969
* Various cpplint enhancements #1961 #1963
* Various eclipse project generator enhancements #1967 #1968 #1970
* Various C# enhancements #1975 #1976 #1977
* Override resx namespaces #1978
NEW IN WAF 1.9.11
-----------------
* Detect Visual Studio 2017 Build Tools using vswhere.exe #1945
* Improve preprocessor performance on system paths in verbose mode #1950
* Better installation defaults for windows import libraries #1860
* cpplint enhancements: --cpplint-root #1953 #1956
* eclipse project generator enhancements #1957 #1958
NEW IN WAF 1.9.10
-----------------
* Detect -pie and -rdynamic in parse_flags #1936
* Fix Fortran module naming case
* Improve Visual Studio 2017 compilers detection (no vswhere/tools yet) #1925
* Prevent unnecessary Vala rebuilds due to vapi file installation
* Process .vapi files as input source files
* Reflect the current build group in bld.current_group
* Obtain NO_LOCK_IN_TOP/RUN/OUT from os.environ too
* Xcode9 generator fixes and example update #1939 #1943
NEW IN WAF 1.9.9
NEW IN WAF 2.0.0
----------------
* Fix QT5 detection on macOS #1912
* Fix Clang compilation database when no c compiler is found #1914
* Fix python 3 compatibility with cppcheck #1921
* Fix the thread index in parallel_debug
* Fix install task for mac_files #1911
* Fix msvc version detection #1907
* Add VS2017 support #1927
* Add newer fortran flag detection #1916
* Add unity builds settings per task generator
* Add run_build_cls parameter to conf.run_build/conf.check
* Add a warning for missing cflags/cxxflags in gccdeps
NEW IN WAF 1.9.8
----------------
* Keep winres flags in subsequent detections #1908
* New -qopenmp option #1900
* Enable Java compat attribute as numbers #1899
* Qt5 library detection enhancements
* Save compiler optimization flags from foo-config #1887
* Emscripten enhancements #1885
* Fix chmod with tuple rules #1884
* Include all vars in tuple rules #1883
* Compile glib2 schemas per directory #1881
* Boost libraries detection enhancements
* Work around an annoying Python 3.6 regression #1889
NEW IN WAF 1.9.7
----------------
* Conceal Python 2.6 subprocess errors on Unix-like systems
* Read variables from self.vars #1873
* Revert archive permissions to be world-readable #1875
NEW IN WAF 1.9.6
----------------
* Display @argfile contents in msvcdeps #1831
* Enable configurable preprocessor cache sizes #1833
* Allocate lru cache entries lazily
* Let unity builds use relative paths for msys #1834
* Added --dump-test-scripts to the unit test module #1852
* Added a warning on ant_glob '.' uses #1853
* LaTeX configuration tests output results to config.log
* Accept functools.partial() as task rules #1865
NEW IN WAF 1.9.5
----------------
* Fix the command-line option for the python executable (--python -> --pythondir) #1812
* Add support for older PyQt4 bindings #1813
* Expand properly `--xyz=${VAR}` when `shell=False` #1814
* Add qt5 5.7.x new libraries to qt5 tool #1815
* Add install_user and install_group to bld.install/bld.install_as/bld.symlink_as
* Reduce unnecessary build outputs verbosity #1819 #1828
* Prevent broken console encoding settings from causing additional failures #1823
* Let "waf --zones=argfile" display @argfile contents
NEW IN WAF 1.9.4
----------------
* Enable 'waf dist' in arbitrary paths #1806
* Handle subprocess timeouts in Python 3.3 #1807
* Set the configuration test build class on conf.run_build_cls
* Provide execution order constraints in parallel configuration tests
* Accept Task.cwd as str type
NEW IN WAF 1.9.3
----------------
* Improve the behaviour of parallel configuration tests (conf.multicheck) #1793
* Fix the Fortran Nag compiler detection #1797
* Detect Qt 5.7 (-std=c++11 flag) #1800
* Detect Boost python on Gentoo #1802
* Update the strip and netcache examples
NEW IN WAF 1.9.2
----------------
* Fix a Python 3 encoding error when displaying the file hash in 'waf dist' #1769
* Force qt5 static library detection with 'QT5_XCOMPILE=1 QT5_FORCE_STATIC=1 waf configure' #1588
* Force the parent os.environ value as default in pre-forked processes #1791
* Support flexflags as arguments #1782
* Update a few extensions: xcode, pytest, msvcdeps, cppcheck, win32_opts, codelite
* Store Task.last_cmd selectively, so that extensions can use them (clang_compilation_database)
* Set TaskBase.keep_last_cmd to keep the last command executed
* New pyqt5 extension #1790
* Remove all non-BSD code from the Waf archive file
NEW IN WAF 1.9.1
----------------
* Execute processes with run_regular_process when arguments are not serializable #1764
* Do not de-duplicate configuration flags assigned through uselib_store (conf.check tests)
* Enable SVG pictures in the documentation
NEW IN WAF 1.9.0
----------------
* General enhancements:
- Detect Clang first on many platforms, in particular on FreeBSD #1528
- Remove Node.cache_sig and Node.sig so that dependencies involve file contents by default #1580
- Change cflags in the beginning / cppflags at the end #1505
- Merge ${FOO}${BAR} flags in commands executed without a shell (no spaces inserted)
- Interpret empty command-line defines as integer values for dependency calculation #1704
- Waf tools are not cached on "waf configure" by default anymore; pass conf.load(.., cache=True)
- Enable a consistent progress bar output #1641
- Add ${VAR?X} constructs in script expressions to enable simple conditional outputs
- Enable 'waf dist' to package arbitrary symlinks in tarballs #1719
- Enable regexp objects in @extension besides strings for file extensions
- Match extensions in the order of last definition
- Task generators are now processed group-by-group, so the task generators of the next group are
processed only when all tasks in the previous group are complete; bld.post_mode=POST_LAZY
thus becomes the default (playground/dynamic_build/ examples)
- Process Qt5 files in the way suggested by the Qt documentation
- Installation methods install_files/install_as/symlink_as create regular task generators
and regular tasks so that installation outputs can be re-used easily
- Subclass waflib.Build.ConfiguredContext to enable configuration-dependent user commands
- Enable @argfile processing in Task.exec_command when argument limits are exceeded
- Apply optional tsk.env.PATH values in Task.exec_command
- Enable ut_str to process scriptlet expressions for C/C++ unit tests
- Minimize the amount of paths added to unit test environment variable
- Restore configuration values with Configure.autoconfig='clobber' #1758
- Rebuilds are applied on file contents so that update_outputs is no longer needed
* Performance highlights:
- Reduce the key size in bld.task_sigs by adding bld.node_sigs and bld.imp_sigs
- Remove __hash__ and __eq__ from Context, Node and Task #1629
- Detect visual studio versions lazily by default
- Remove the uses of run_once that can consume a lot of memory; add a proper LRU cache
- Enable pre-forked builds by default to achieve faster builds, up to 2x speedup on short-lived processes
- Enable faster consumers in Runner.py
- Add the tool 'nobuild.py' to help with performance troubleshooting
- Enable profiling with the --profile command-line option
* API changes:
- The minimum Python version required is Python 2.5
- Add Task.get_cwd()
- Remove the command called 'update'
- Remove unused variables and functions:
- TaskBase.attr()
- Build.POST_BOTH
- Options.platform
- Options.cmds
- Task.dep_vars (define Task.vars on instances if necessary)
- Utils.nogc
- Configure.err_handler
- All duplicate split() functions from Utils
- Remove the unused attribute 'mac_resources', use 'mac_files' instead (see demos/mac_app)
- Remove qt4 and kde4 from the default modules
- Refactor msvc.py
- Task.sig_vars, Task.sig_explicit_deps and Task.sig_implicit_deps return None
- Use relative paths in apply_incpaths by default (and absolute ones when paths cross drives)
- Modify Utils.run_once so that it accepts a list of *args
- Better consistency between check_cfg and check_cc variables
- task_gen.mapping and task_gen.prec are not defined by default on instances anymore; instances
can still define their own mappings to override the defaults, but in that case all
mappings/precedences must be present. These features were not used in Waf 1.8.
- Do not truncate _task suffixes from Task class names if present
* Provide a new priority system to improve scalability on complex builds
* Provide TaskGroup objects to improve scalability on complex builds
* Force new files into the build directory by default (use Node objects to bypass)
* Provide built-in support for building over UNC paths
* Simplify the Task class hierarchy; TaskBase is removed
* Display commands as string with "WAF_CMD_FORMAT=string waf build -v"
* Have ant_glob(..., generator=True) return a Python generator
* Accept nested lists and generators in bld(source=...)
* Sort TaskGen methods in alphabetical order by reversing TaskGen.prec order
* Remove 'ut_fun' from waf_unit_test.py
* Remove Node.sig and Node.cache_sig
* Remove the BuildContext.rule decorator
* Remove Task.update_outputs, Task.always_run
* Remove atleast-version, exact-version and max-version from conf.check_cfg
* Remove c_preproc.trimquotes
* Remove field_name, type_name, function_name from conf.check() tests
* Remove extras/mem_reducer.py as a better solution has been merged
* Remove Utils.ex_stack (use traceback.format_exc())

7
DEVEL
View File

@ -1,4 +1,4 @@
Waf 1.9 is on https://github.com/waf-project/waf
Waf 2.0 is on https://github.com/waf-project/waf
------------------------------------------------
waflib the core library
@ -37,8 +37,5 @@ for modules under waflib/Tools, under tests/ for platform-independent unit tests
or in playground/ for modules under waflib/extras.
The files under waflib/Tools/ are kept API-compatible for the duration
of a middle version (currently 1.9).
The development branch is called waf-2.0:
https://github.com/waf-project/waf/tree/waf-2.0
of a middle version (currently 2.0).

View File

@ -20,34 +20,35 @@ Download the project from our page on [waf.io](https://waf.io/) or from a mirror
## HOW TO CREATE THE WAF SCRIPT
Python >= 2.6 is required to generate the waf script, and the resulting file can then run on Python 2.5.
Just execute:
Just run:
```sh
$ ./waf-light configure build
```
Or, if you have several python versions installed:
Or, if several python versions are installed:
```sh
$ python3 ./waf-light configure build
```
The Waf tools in waflib/extras are not added to the waf script. To add
some of them, use the --tools switch:
some of them, use the --tools switch. An absolute path can be passed
if the module does not exist under the 'extras' folder:
```sh
$ ./waf-light --tools=compat15,swig
$ ./waf-light --tools=swig
```
To add a tool that does not exist in the folder extras, pass an absolute path, and
to customize the initialization, pass the parameter 'prelude'. Here is for example
To customize the initialization, pass the parameter 'prelude'. Here is for example
how to create a waf file using the compat15 module:
```sh
$ ./waf-light --tools=compat15 --prelude=$'\tfrom waflib.extras import compat15\n'
```
Any kind of initialization is possible, though one may prefer the build system kit (folder build\_system\_kit):
Although any kind of initialization is possible, using the build system kit
may be easier (folder build\_system\_kit):
```sh
$ ./waf-light --make-waf --tools=compat15,/comp/waf/aba.py --prelude=$'\tfrom waflib.extras import compat15\n\tprint("ok")'
```
Or, to avoid regenerating the waf file all the time, just set the `WAFDIR` environment variable to the directory containing "waflib".
To avoid regenerating the waf file all the time, just set the `WAFDIR` environment variable to the directory containing "waflib".
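For example, here is a minimal sketch (the checkout location /path/to/waf-checkout is only an assumption; any directory that contains "waflib" works):
```sh
# hypothetical location of a Waf checkout containing the waflib/ folder
$ export WAFDIR=/path/to/waf-checkout
$ ./waf configure build
```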
## HOW TO RUN THE EXAMPLES

10
TODO
View File

@ -1,10 +1,4 @@
Waf 1.9
Waf 2.1
-------
Documentation
Waf 2.0
-------
See the branch waf-2.0 for details
...

View File

@ -39,16 +39,16 @@ def dump(bld):
bld.targets = []
# store the command executed
old_exec = Task.TaskBase.exec_command
old_exec = Task.Task.exec_command
def exec_command(self, *k, **kw):
ret = old_exec(self, *k, **kw)
self.command_executed = k[0]
self.path = kw['cwd'] or self.generator.bld.cwd
return ret
Task.TaskBase.exec_command = exec_command
Task.Task.exec_command = exec_command
# perform a fake build, and accumulate the makefile bits
old_process = Task.TaskBase.process
old_process = Task.Task.process
def process(self):
old_process(self)
@ -67,7 +67,7 @@ def dump(bld):
else:
bld.commands.append(' '.join(lst))
bld.commands.append('\tcd %s && %s' % (self.path, self.command_executed))
Task.TaskBase.process = process
Task.Task.process = process
# write the makefile after the build is complete
def output_makefile(self):

View File

@ -35,9 +35,8 @@ def build(bld):
elif tp == 'objects':
features = 'c'
source = Options.options.source
app = Options.options.app
bld(features=features, source=source, target=app)
bld(features=features, source=Options.options.source, target=app)
def recurse_rep(x, y):
f = getattr(Context.g_module, x.cmd or x.fun, Utils.nada)

View File

@ -2,7 +2,7 @@
def write_header(tsk):
tsk.outputs[0].write('int abc = 423;\n')
bld(rule=write_header, target='b.h', ext_out=['.h'])
bld(features='use', rule=write_header, target='b.h', ext_out=['.h'], name='XYZ')
tg = bld.program(
features = 'aaa',
@ -11,7 +11,7 @@ tg = bld.program(
#cflags = ['-O3'], # for example
defines = ['foo=bar'],
target = 'myprogram',
use = 'M')
use = 'M XYZ')
# just for fun, make main.c depend on wscript_build
bld.add_manual_dependency('main.c', bld.path.find_resource('wscript_build'))

View File

@ -36,7 +36,7 @@ def build(bld):
# example for generated python files
target = bld.path.find_or_declare('abc.py')
bld(rule='touch ${TGT}', source='wscript', target=target)
bld(features='py', source=target, install_from=target.parent)
bld(features='py', source=[target], install_from=target.parent)
# then a c extension module
bld(

View File

@ -19,7 +19,7 @@ def build(bld):
features = 'subst', # the feature 'subst' overrides the source/target processing
source = 'foo.in', # list of string or nodes
target = 'foo.txt', # list of strings or nodes
encoding = 'ascii', # file encoding for python3, default is ISO8859-1
encoding = 'ascii', # file encoding for python3, default is latin-1
install_path = '/tmp/uff/', # installation path, optional
chmod = Utils.O755, # installation mode, optional
PREFIX = bld.env.PREFIX, # variables to use in the substitution

View File

@ -0,0 +1,5 @@
#!/usr/bin/env python
# encoding: utf-8
import sys
print('success')
sys.exit(0)

View File

@ -0,0 +1,11 @@
#! /usr/bin/env python
# encoding: utf-8
if bld.env['PYTHON']:
bld(
features = 'test_scripts',
test_scripts_source = 'test.py',
test_scripts_template = '${PYTHON} ${SCRIPT}'
)

View File

@ -0,0 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
import test_import
import sys
print('success from @NAME@')
sys.exit(@EXIT_STATUS@)

View File

@ -0,0 +1,3 @@
#!/usr/bin/env python
# encoding: utf-8
print('imported')

View File

@ -0,0 +1,29 @@
#! /usr/bin/env python
# encoding: utf-8
if bld.env['PYTHON']:
bld(
features = 'subst',
source = 'test.py.in',
target = 'test.1.py',
NAME = 'first test3',
EXIT_STATUS = '0',
)
bld(
features = 'subst',
source = 'test.py.in',
target = 'test.2.py',
NAME = 'second test3',
EXIT_STATUS = '0',
)
paths = {
'PYTHONPATH': bld.path.abspath()
}
bld(
features = 'test_scripts',
test_scripts_source = 'test.1.py test.2.py',
test_scripts_template = '${PYTHON} ${SCRIPT}',
test_scripts_paths = paths
)

View File

@ -2,7 +2,7 @@
# encoding: utf-8
def build(bld):
bld.recurse('test0 test1')
bld.recurse('test0 test1 test2 test3')
obj = bld(
features = 'cxx cxxstlib',

View File

@ -26,6 +26,9 @@ def configure(conf):
if 'dl' not in conf.env.LIB_CPPUNIT:
l = conf.check(lib='dl', uselib_store='CPPUNIT')
# the interpreted tests need python
conf.find_program('python', mandatory=False)
from waflib import Logs
def summary(bld):
lst = getattr(bld, 'utest_results', [])

View File

@ -87,7 +87,7 @@ def configure(conf):
conf.find_program('python'+x, var=x)
# unpacking the waf directory concurrently can lead to a race condition, we'll need to take care of this (thanks, build farm!)
conf.cmd_and_log(conf.env[x] + ['./waf', '--version'], env={})
except Exception as e:
except Errors.WafError as e:
pass
else:
conf.env.append_value('PYTHONS', x)

View File

@ -14,14 +14,14 @@ def build(bld):
for x in bld.path.ant_glob('*.svg'):
bld(
rule='${CONVERT} -density 600 ${SRC} ${TGT}',
source=x,
source=[x],
target=x.change_ext('.png'),
)
for x in bld.path.ant_glob('*.dia'):
bld(
rule='${DIA} -t png ${SRC} -e ${TGT}',
source=x,
source=[x],
target=x.change_ext('.png'),
)

View File

@ -98,8 +98,7 @@ def before(*k):
exclude_taskgen.append(func.__name__)
setattr(task_gen, func.__name__, func)
for fun_name in k:
if not func.__name__ in task_gen.prec[fun_name]:
task_gen.prec[fun_name].append(func.__name__)
task_gen.prec[func.__name__].add(fun_name)
fix_fun_doc(func)
append_doc(func, 'before', k)
return func
@ -112,8 +111,7 @@ def after(*k):
exclude_taskgen.append(func.__name__)
setattr(task_gen, func.__name__, func)
for fun_name in k:
if not fun_name in task_gen.prec[func.__name__]:
task_gen.prec[func.__name__].append(fun_name)
task_gen.prec[fun_name].add(func.__name__)
fix_fun_doc(func)
append_doc(func, 'after', k)
return func

View File

@ -30,14 +30,14 @@ Let's start with a new wscript file in the directory '/tmp/myproject'::
def build(bld):
print("build!")
We will also use a Waf binary file, for example waf-1.9.14, which we will copy in the project directory::
We will also use a Waf binary file, for example waf-2.0.0, which we will copy in the project directory::
$ cd /tmp/myproject
$ wget https://waf.io/waf-1.9.14
$ wget https://waf.io/waf-2.0.0
To execute the project, we will simply call the command as an argument to ``waf``::
$ ./waf-1.9.14 configure build
$ ./waf-2.0.0 configure build
configure!
build!

View File

@ -138,7 +138,7 @@ def build(bld):
bld(rule=try_compress, target=ini, always=True, kind=kind, frompath=node, files=rels)
# for the same reason, count_result will be executed each time
bld(rule=count_result, target=dist, source=ini, always=True)
bld(rule=count_result, target=dist, source=[ini], always=True)
bld(rule=write_template, target=plot, triplet=[png, kind, dist], always=True)
bld(rule='${GNUPLOT} < ${SRC[1].abspath()}', target=png, source=[dist, plot])

View File

@ -1,5 +1,5 @@
#!/usr/bin/env python
# encoding: ISO8859-1
# encoding: utf-8
# Thomas Nagy, 2010
top = '.'

View File

@ -1,5 +1,5 @@
#!/usr/bin/env python
# encoding: ISO8859-1
# encoding: utf-8
# Thomas Nagy, 2010
from waflib import Logs

View File

@ -47,26 +47,13 @@ def build(bld):
import random
rnd = random.randint(0, 25)
bld(
rule = "sleep 2 && (echo 'int num%d = %d;' > ${TGT})" % (rnd, rnd),
target = 'foo_%d.c' % rnd,
)
rule = "sleep 2 && (echo 'int num%d = %d;' > ${TGT})" % (rnd, rnd),
target = 'foo_%d.c' % rnd,
)
bld.add_group()
bld.program(source='main.c', target='app', dynamic_source='*.c')
# support for the "dynamic_source" attribute follows:
from waflib import Build, Utils, TaskGen
@TaskGen.feature('c')
@TaskGen.before('process_source', 'process_rule')
def dynamic_post(self):
"""
bld(dynamic_source='*.c', ..) will search for source files to add to the attribute 'source'
we could also call "eval" or whatever expression
"""
if not getattr(self, 'dynamic_source', None):
return
self.source = Utils.to_list(self.source)
self.source.extend(self.path.get_bld().ant_glob(self.dynamic_source, remove=False))
it = bld.path.get_bld().ant_glob('*.c', remove=False, quiet=True, generator=True)
src = ['main.c', it]
bld(features='c cprogram', source=src, target='app')

View File

@ -11,8 +11,8 @@ An advanced dynamic build simulating a call to an external system.
That external build system produces a library which is then used in the current build.
"""
import os, shutil, sys, subprocess
from waflib import Utils, Build, Logs
import os, shutil, sys
from waflib import Build, Errors, Logs
top = '.'
out = 'build'
@ -79,7 +79,7 @@ def some_fun(task):
try:
task.generator.bld.cmd_and_log(cmd, cwd=cwd, quiet=0, output=0)
except Exception as e:
except Errors.WafError as e:
try:
print(e.stderr)
except AttributeError:

View File

@ -56,7 +56,7 @@ def runnable_status(self):
tsk = mock_tasks[m_node]
except KeyError:
tsk = mock_tasks[m_node] = self.generator.create_task('mock', [h_node], [m_node])
bld.producer.outstanding.appendleft(tsk)
bld.producer.outstanding.append(tsk)
bld.producer.total += 1
# preprocessor cache :-/
@ -86,7 +86,7 @@ def runnable_status(self):
tsk = mock_tasks[x]
except KeyError:
tsk = mock_tasks[x] = self.generator.create_task('mock', [h_node], [x])
bld.producer.outstanding.appendleft(tsk)
bld.producer.outstanding.append(tsk)
bld.producer.total += 1
add = True

View File

@ -8,5 +8,5 @@ def build(bld):
node = bld.path.get_bld().make_node('test/bar/stuff')
bld(features='mkdir', target=node)
bld(rule='du ${SRC}', source=node)
bld(rule='du ${SRC}', source=[node])

View File

@ -1,18 +1,11 @@
#! /usr/bin/env python
# encoding: utf-8
def fun(task):
pass
# print task.generator.bld.name_to_obj('somelib').link_task.outputs[0].abspath(task.env)
# task.ut_exec.append('--help')
bld.program(
features = 'test',
source = 'AccumulatorTest.cpp',
target = 'unit_test_program',
use = 'unittestmain useless GTEST',
ut_cwd = bld.path.abspath(),
ut_fun = fun
)

View File

@ -95,9 +95,9 @@ def process(self):
except Exception, e:
print type(e), e
Task.TaskBase.process_bound_maxjobs = Task.TaskBase.process
Task.Task.process_bound_maxjobs = Task.Task.process
Task.Task.process = process
Task.TaskBase.lock_maxjob = lock_maxjob
Task.TaskBase.release_maxjob = release_maxjob
Task.TaskBase.wait_maxjob = wait_maxjob
Task.Task.lock_maxjob = lock_maxjob
Task.Task.release_maxjob = release_maxjob
Task.Task.wait_maxjob = wait_maxjob

View File

@ -65,7 +65,7 @@ def build_all_at_once(ctx):
f(self)
sem.release()
return f2
Task.TaskBase.process = with_sem(Task.TaskBase.process)
Task.Task.process = with_sem(Task.Task.process)
threads = []
for var in ctx.all_envs:

View File

@ -74,11 +74,8 @@ def build(bld):
vnum = '1.2.3',
use = 'mylib')
# Swig forces a dynamic build therefore a build group is necessary
from waflib import Build
bld.post_mode = Build.POST_LAZY
bld.add_group()
python_site_package = '${PREFIX}/lib/python%s/site-packages' % bld.env.PYTHON_VERSION
generated_py = bld.path.find_or_declare('extend/python/test_swig_waf.py')
bld(features='py', source=generated_py, install_path=python_site_package, install_from=bld.path.get_bld())

View File

@ -1,63 +0,0 @@
#! /usr/bin/env python
# encoding: utf-8
from waflib import Task, Runner
# amount of tasks to wait before trying to mark tasks as done
# see Runner.py for details
# setting the gap to a low value may seriously degrade performance
Runner.GAP = 1
# shrinking sets of dependencies provided to know quickly which yellow tasks are ready
reverse_map = {}
old = Task.set_file_constraints
def green_first(lst):
# weak order constraint based on lexicographic order:
# green before pink before yellow
lst.sort(cmp=lambda x, y: cmp(x.__class__.__name__, y.__class__.__name__))
# call the previous method to order the tasks by file dependencies
old(lst)
# shrinking sets preparation
for tsk in lst:
for k in tsk.run_after:
try:
reverse_map[k].add(tsk)
except KeyError:
reverse_map[k] = set([tsk])
Task.set_file_constraints = green_first
def get_out(self):
# this is called whenever a task has finished executing
tsk = self.prev_get_out()
for x in reverse_map.get(tsk, []):
# remove this task from the dependencies of other tasks
# this is a minor optimization in general
# but here we use this to know which tasks are ready to run
x.run_after.remove(tsk)
if tsk.__class__.__name__ == 'green':
# whenever a green task is done it may be time to put
# one or more yellow tasks in front
# some additional optimization can be performed if exactly one
# yellow task can be added to avoid whole list traversal
def extract_yellow(lst):
# return the yellow tasks that we can run immediately
front = []
lst.reverse()
for tsk in lst:
if tsk.__class__.__name__ == 'yellow' and not tsk.run_after:
#print("found one yellow task to run first")
lst.remove(tsk)
front.append(tsk)
lst.reverse()
return front
# this sets the order again
self.outstanding = extract_yellow(self.outstanding) + extract_yellow(self.frozen) + self.outstanding
return tsk
Runner.Parallel.prev_get_out = Runner.Parallel.get_out
Runner.Parallel.get_out = get_out

View File

@ -1,40 +0,0 @@
#! /usr/bin/env python
# encoding: utf-8
"""
Illustrates how to force yellow tasks to be executed very soon after the green ones
"""
import time
def options(ctx):
ctx.load('parallel_debug')
def configure(ctx):
ctx.load('parallel_debug')
def build(ctx):
ctx.load('greenfirst', tooldir='.')
ctx.jobs = 4
def touch(tsk):
tsk.outputs[0].write('')
if tsk.__class__.__name__ == 'green':
time.sleep(0.2)
elif tsk.__class__.__name__ == 'yellow':
time.sleep(3)
else:
time.sleep(0.3)
for x in range(40):
ctx(rule=touch, always=True, name='pink', color='PINK', target='pink%s.txt' % x)
for x in range(5):
lst = []
for y in range(60):
name = 'green%s_%s.txt' % (x, y)
lst.append(name)
ctx(rule=touch, always=True, target=name, name='green', color='GREEN')
ctx(rule=touch, always=True, source=lst, name='yellow', color='YELLOW', target='yellow%s.txt' % x)

View File

@ -3,12 +3,6 @@
import os, shutil
from waflib import Node, Build, Utils, Logs
def tt(msg, result, expected):
color = 'RED'
if result == expected:
color = 'GREEN'
Logs.pprint(color, msg.ljust(20) + " %r" % result)
def exists(path):
try:
os.stat(path)
@ -40,6 +34,15 @@ def configure(ctx):
def test(ctx):
bld = Build.BuildContext()
errors = []
def tt(msg, result, expected):
color = 'RED'
if result == expected:
color = 'GREEN'
else:
errors.append(result)
Logs.pprint(color, msg.ljust(20) + " %r" % result)
# 1. absdir is wrong, keep the drive letter
# 2. split should use os.sep
@ -119,7 +122,7 @@ def test(ctx):
nf.write("aha")
nf.get_bld_sig()
tt('find_resource src/abc', bld.srcnode.find_resource(['abc']), nf)
tt('find_or_declare src/abc', bld.srcnode.find_or_declare(['abc']), nf)
tt('find_or_declare src/abc', bld.srcnode.find_or_declare(['abc']), bld.bldnode.make_node(['abc']))
tt('src.get_bld()', bld.srcnode.get_bld(), bld.bldnode)
tt('bld.get_src()', bld.bldnode.get_src(), bld.srcnode)
@ -141,3 +144,35 @@ def test(ctx):
tt("ant_glob ->", len(bld.srcnode.ant_glob('*.txt', flat=False)), 1)
#print("ant_glob src ->", bld.srcnode.ant_glob('*.txt'))
def abspath(self):
try:
return self.cache_abspath
except AttributeError:
pass
if not self.parent:
val = ''
elif not self.parent.name:
val = self.name + '\\'
else:
val = self.parent.abspath().rstrip('\\') + '\\' + self.name
self.cache_abspath = val
return val
# the local class will be unused soon enough
old_abspath = bld.node_class.abspath
bld.node_class.abspath = abspath
unc1 = '\\\\computer\\share\\file'
lst = Utils.split_path_win32(unc1)
node = bld.root.make_node(lst)
tt('UNC head node', lst[0], '\\\\computer')
tt('UNC share path', node.abspath(), unc1)
unc2 = '\\\\?\\C:\\foo'
lst = Utils.split_path_win32(unc2)
node = bld.root.make_node(lst)
tt('UNC long path', node.abspath(), 'C:\\foo')
if errors:
bld.fatal('There are test failures ^^')

View File

@ -3,17 +3,24 @@
top = '.'
out = 'tmp_out'
import os
from waflib import ConfigSet, Context, Build, Configure
import os, random, time
from waflib import ConfigSet, Context, Build, Configure, Utils
Configure.autoconfig='clobber'
def options(opt):
opt.add_option('--based', action='store', default='foo', help='base directory', dest='based')
opt.add_option('--dumpf', action='store', default='foo', help='dump config to this file', dest='dumpf')
opt.add_option('--force-autoconfig', action='store', default=False, dest='force_autoconfig')
def configure(ctx):
pass
# force autoconfig to depend on an always-changing file - once
if ctx.options.force_autoconfig:
fname = ctx.options.dumpf + '_autoconfig'
s = "%f%d" % (time.time(), random.randint(0, 10**9))
Utils.writef(fname, s)
ctx.files.append(fname)
Configure.autoconfig = False
def build(ctx):
pass

View File

@ -84,6 +84,10 @@ def build(bld):
test_cmd([exe], proj_cwd, ' next build from top')
test_cmd([exe], proj_out_cwd, ' next build from out', launch_dir='tmp_out')
test_cmd([exe], proj_sub_cwd, ' next build from subfolder', launch_dir='sub')
test_cmd([exe, '--top=%s' % proj_cwd, '--out=foobar'], proj_cwd,
' next build from top, verify out_dir==lock_file.out_dir')
test_cmd([exe, '--top=%s' % proj_cwd, '--out=foobar'], proj_sub_cwd,
' next build from subfolder, verify out_dir==lock_file.out_dir', launch_dir='sub')
cleanup(bld)
test_cmd([exe, 'configure', '--top=%s' % proj_cwd, '--out=%s' % proj_out_cwd], up_cwd, 'configure with top/out from up cwd',
@ -115,6 +119,20 @@ def build(bld):
test_cmd([exe, '--top=%s' % proj_cwd], proj_cwd, 'autoconfig')
cleanup(bld)
test_cmd([wscript, 'configure', '--top=project', '--out=project/tmp_out'], up_cwd, 'wscript configure with relative top/out from up cwd',
launch_dir='..')
test_cmd([wscript], proj_cwd, ' next build from top')
test_cmd([wscript], proj_out_cwd, ' next build from out', launch_dir='tmp_out')
test_cmd([wscript], proj_sub_cwd, ' next build from subfolder', launch_dir='sub')
cleanup(bld)
test_cmd([exe, '--force-autoconfig', '--top=project'], up_cwd, 'autoconfig from up 1', launch_dir='..')
os.remove(dumpf_default + '_autoconfig')
test_cmd([exe, '--force-autoconfig', '--top=project'], up_cwd, 'autoconfig from up 2', launch_dir='..')
os.remove(dumpf_default + '_autoconfig')
test_cmd([exe, '--force-autoconfig', '--out=badout'], proj_cwd, 'autoconfig with clobber')
cleanup(bld)
if failures:
bld.fatal('there were errors')

66
tests/post/wscript Normal file
View File

@ -0,0 +1,66 @@
#! /usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2016-2017 (ita)
top = '.'
out = 'build'
import inspect
from waflib import Utils, Logs, TaskGen
@TaskGen.taskgen_method
def log(self):
fname = inspect.stack()[1][3]
try:
self.called.append(fname)
except AttributeError:
self.called = [fname]
@TaskGen.taskgen_method
def check(self):
self.post()
result = ''.join(self.called)
if result == self.expected:
color = 'GREEN'
else:
color = 'RED'
result = 'got %r but expected %r' % (result, self.expected)
self.bld.failure = 1
Logs.pprint(color, result)
@TaskGen.feature('test1')
@TaskGen.after('d')
def a(self):
self.log()
@TaskGen.feature('test1')
@TaskGen.after('c')
def b(self):
self.log()
@TaskGen.feature('test1')
def c(self):
self.log()
@TaskGen.feature('test1')
def d(self):
self.log()
@TaskGen.feature('test1')
@TaskGen.after('f')
def e(self):
self.log()
@TaskGen.feature('test1')
def f(self):
self.log()
def configure(conf):
pass
def build(bld):
bld.failure = 0
def stop_status(bld):
if bld.failure:
bld.fatal('One or several test failed, check the outputs above')
bld.add_post_fun(stop_status)
bld(features='test1', expected='cbdafe').check()

View File

@ -16,12 +16,6 @@ from waflib.Logs import pprint
def configure(conf):
pass
def trimquotes(s):
if not s: return ''
s = s.rstrip()
if s[0] == "'" and s[-1] == "'": return s[1:-1]
return s
def build(bld):
bld.failure = 0
@ -193,9 +187,9 @@ def build(bld):
disp(color, "%s\t\t%r" % (expected, gruik.nodes))
test_rec("", "a")
test_rec("FOO=1", "aca")
test_rec("BAR=1", "abca")
test_rec("FOO=1 BAR=1", "aca")
test_rec("FOO=1", "ac")
test_rec("BAR=1", "abc")
test_rec("FOO=1 BAR=1", "ac")
return
test("1?1,(0?5:9):3,4", 0) # <- invalid expression

39
tests/utils/wscript Normal file
View File

@ -0,0 +1,39 @@
#! /usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2012 (ita)
VERSION='0.0.1'
APPNAME='preproc_test'
top = '.'
out = 'build'
from waflib import Utils
from waflib.Logs import pprint
def configure(conf):
pass
def build(bld):
bld.failure = 0
def disp(color, result):
pprint(color, result)
if color == 'RED':
bld.failure=1
def stop_status(bld):
if bld.failure:
bld.fatal('One or several test failed, check the outputs above')
bld.add_post_fun(stop_status)
def test_shell(inp, expected):
ret = Utils.shell_escape(inp)
if ret == expected:
color = "GREEN"
else:
color = "RED"
disp(color, "%r -> %r\t\texpected: %r" % (inp, ret, expected))
test_shell("ls -l", "ls -l")
test_shell(['ls', '-l', 'a space'], "ls -l 'a space'")

View File

@ -1,10 +1,10 @@
#! /usr/bin/env python
# encoding: ISO8859-1
# encoding: utf-8
# Thomas Nagy, 2014-2015
"""
A simple file for verifying signatures in signed waf files
This script is meant for Python >= 2.6 and the encoding is bytes - ISO8859-1
This script is meant for Python >= 2.6 and the encoding is bytes - latin-1
Distributing detached signatures is boring
"""
@ -33,14 +33,14 @@ if __name__ == '__main__':
try:
txt = f.read()
lastline = txt.decode('ISO8859-1').splitlines()[-1] # just the last line
lastline = txt.decode('latin-1').splitlines()[-1] # just the last line
if not lastline.startswith('#-----BEGIN PGP SIGNATURE-----'):
print("ERROR: there is no signature to verify in %r :-/" % infile)
sys.exit(1)
sigtext = lastline.replace('\\n', '\n') # convert newlines
sigtext = sigtext[1:] # omit the '# character'
sigtext = sigtext.encode('ISO8859-1') # python3
sigtext = sigtext.encode('latin-1') # python3
f2.write(sigtext)
f1.write(txt[:-len(lastline) - 1]) # one newline character was eaten from splitlines()

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: ISO8859-1
# Thomas Nagy, 2005-2016
# encoding: latin-1
# Thomas Nagy, 2005-2017
#
"""
Redistribution and use in source and binary forms, with or without
@ -32,7 +32,7 @@ POSSIBILITY OF SUCH DAMAGE.
import os, sys, inspect
VERSION="1.9.13"
VERSION="2.0.0"
REVISION="x"
GIT="x"
INSTALL="x"

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"""
Classes related to the build phase (build, clean, install, step, etc)
@ -143,12 +143,6 @@ class BuildContext(Context.Context):
if not hasattr(self, v):
setattr(self, v, {})
def set_cur(self, cur):
self.current_group = cur
def get_cur(self):
return self.current_group
cur = property(get_cur, set_cur)
def get_variant_dir(self):
"""Getter for the variant_dir attribute"""
if not self.variant:
@ -181,30 +175,6 @@ class BuildContext(Context.Context):
self.add_to_group(ret, group=kw.get('group'))
return ret
def rule(self, *k, **kw):
"""
Wrapper for creating a task generator using the decorator notation. The following code::
@bld.rule(target="foo")
def _(tsk):
print("bar")
is equivalent to::
def bar(tsk):
print("bar")
bld(
target = "foo",
rule = bar,
)
"""
def f(rule):
ret = self(*k, **kw)
ret.rule = rule
return ret
return f
def __copy__(self):
"""
Build contexts cannot be copied
@ -374,15 +344,19 @@ class BuildContext(Context.Context):
try:
self.producer.start()
except KeyboardInterrupt:
self.store()
if self.is_dirty():
self.store()
raise
else:
if self.producer.dirty:
if self.is_dirty():
self.store()
if self.producer.error:
raise Errors.BuildError(self.producer.error)
def is_dirty(self):
return self.producer.dirty
def setup(self, tool, tooldir=None, funs=None):
"""
Import waf tools defined during the configuration::
@ -620,7 +594,7 @@ class BuildContext(Context.Context):
def add_to_group(self, tgen, group=None):
"""Adds a task or a task generator to the build; there is no attempt to remove it if it was already added."""
assert(isinstance(tgen, TaskGen.task_gen) or isinstance(tgen, Task.TaskBase))
assert(isinstance(tgen, TaskGen.task_gen) or isinstance(tgen, Task.Task))
tgen.bld = self
self.get_group(group).append(tgen)
@ -721,10 +695,16 @@ class BuildContext(Context.Context):
def get_targets(self):
"""
Returns the task generator corresponding to the 'targets' list; used internally
by :py:meth:`waflib.Build.BuildContext.get_build_iterator` to perform partial builds::
This method returns a pair containing the index of the last build group to post,
and the list of task generator objects corresponding to the target names.
This is used internally by :py:meth:`waflib.Build.BuildContext.get_build_iterator`
to perform partial builds::
$ waf --targets=myprogram,myshlib
:return: the minimum build group index, and list of task generators
:rtype: tuple
"""
to_post = []
min_grp = 0
@ -752,23 +732,21 @@ class BuildContext(Context.Context):
Post task generators from the group indexed by self.current_group; used internally
by :py:meth:`waflib.Build.BuildContext.get_build_iterator`
"""
def tgpost(tg):
try:
f = tg.post
except AttributeError:
pass
else:
f()
if self.targets == '*':
for tg in self.groups[self.current_group]:
try:
f = tg.post
except AttributeError:
pass
else:
f()
tgpost(tg)
elif self.targets:
if self.current_group < self._min_grp:
for tg in self.groups[self.current_group]:
try:
f = tg.post
except AttributeError:
pass
else:
f()
tgpost(tg)
else:
for tg in self._exact_tg:
tg.post()
@ -781,20 +759,15 @@ class BuildContext(Context.Context):
Logs.warn('CWD %s is not under %s, forcing --targets=* (run distclean?)', ln.abspath(), self.srcnode.abspath())
ln = self.srcnode
for tg in self.groups[self.current_group]:
try:
f = tg.post
except AttributeError:
pass
else:
if tg.path.is_child_of(ln):
f()
if tg.path.is_child_of(ln):
tgpost(tg)
def get_tasks_group(self, idx):
"""
Returns all task instances for the build group at position idx,
used internally by :py:meth:`waflib.Build.BuildContext.get_build_iterator`
:rtype: list of :py:class:`waflib.Task.TaskBase`
:rtype: list of :py:class:`waflib.Task.Task`
"""
tasks = []
for tg in self.groups[idx]:
@ -809,27 +782,23 @@ class BuildContext(Context.Context):
Creates a Python generator object that returns lists of tasks that may be processed in parallel.
:return: tasks which can be executed immediately
:rtype: generator returning lists of :py:class:`waflib.Task.TaskBase`
:rtype: generator returning lists of :py:class:`waflib.Task.Task`
"""
self.current_group = 0
if self.targets and self.targets != '*':
(self._min_grp, self._exact_tg) = self.get_targets()
global lazy_post
if self.post_mode != POST_LAZY:
while self.current_group < len(self.groups):
for self.current_group, _ in enumerate(self.groups):
self.post_group()
self.current_group += 1
self.current_group = 0
while self.current_group < len(self.groups):
for self.current_group, _ in enumerate(self.groups):
# first post the task generators for the group
if self.post_mode != POST_AT_ONCE:
self.post_group()
# then extract the tasks
tasks = self.get_tasks_group(self.current_group)
# if the constraints are set properly (ext_in/ext_out, before/after)
# the call to set_file_constraints may be removed (can be a 15% penalty on no-op rebuilds)
# (but leave set_file_constraints for the installation step)
@ -842,7 +811,6 @@ class BuildContext(Context.Context):
self.cur_tasks = tasks
if tasks:
yield tasks
self.current_group += 1
while 1:
# the build stops once there are no tasks to process
@ -1285,22 +1253,6 @@ class UninstallContext(InstallContext):
super(UninstallContext, self).__init__(**kw)
self.is_install = UNINSTALL
def execute(self):
"""
See :py:func:`waflib.Build.BuildContext.execute`.
"""
# TODO just mark the tasks as already run with hasrun=Task.SKIPPED?
try:
# do not execute any tasks
def runnable_status(self):
return Task.SKIP_ME
setattr(Task.Task, 'runnable_status_back', Task.Task.runnable_status)
setattr(Task.Task, 'runnable_status', runnable_status)
super(UninstallContext, self).execute()
finally:
setattr(Task.Task, 'runnable_status', Task.Task.runnable_status_back)
class CleanContext(BuildContext):
'''cleans the project'''
cmd = 'clean'
@ -1433,17 +1385,17 @@ class StepContext(BuildContext):
for pat in self.files.split(','):
matcher = self.get_matcher(pat)
for tg in g:
if isinstance(tg, Task.TaskBase):
if isinstance(tg, Task.Task):
lst = [tg]
else:
lst = tg.tasks
for tsk in lst:
do_exec = False
for node in getattr(tsk, 'inputs', []):
for node in tsk.inputs:
if matcher(node, output=False):
do_exec = True
break
for node in getattr(tsk, 'outputs', []):
for node in tsk.outputs:
if matcher(node, output=True):
do_exec = True
break
@ -1479,9 +1431,9 @@ class StepContext(BuildContext):
pattern = re.compile(pat)
def match(node, output):
if output == True and not out:
if output and not out:
return False
if output == False and not inn:
if not output and not inn:
return False
if anode:

6
waflib/ConfigSet.py Executable file → Normal file
View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"""
@ -88,13 +88,13 @@ class ConfigSet(object):
def __setitem__(self, key, value):
"""
Dictionary interface: set value for key
Dictionary interface: set value from key
"""
self.table[key] = value
def __delitem__(self, key):
"""
Dictionary interface: mark the key as missing
Dictionary interface: mark the value as missing
"""
self[key] = []

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"""
Configuration system
@ -12,7 +12,7 @@ A :py:class:`waflib.Configure.ConfigurationContext` instance is created when ``w
* hold configuration routines such as ``find_program``, etc
"""
import os, shlex, sys, time, re, shutil
import os, re, shlex, shutil, sys, time, traceback
from waflib import ConfigSet, Utils, Options, Logs, Context, Build, Errors
WAF_CONFIG_LOG = 'config.log'
@ -197,17 +197,17 @@ class ConfigurationContext(Context.Context):
"""
if not env.PREFIX:
if Options.options.prefix or Utils.is_win32:
env.PREFIX = Utils.sane_path(Options.options.prefix)
env.PREFIX = Options.options.prefix
else:
env.PREFIX = ''
env.PREFIX = '/'
if not env.BINDIR:
if Options.options.bindir:
env.BINDIR = Utils.sane_path(Options.options.bindir)
env.BINDIR = Options.options.bindir
else:
env.BINDIR = Utils.subst_vars('${PREFIX}/bin', env)
if not env.LIBDIR:
if Options.options.libdir:
env.LIBDIR = Utils.sane_path(Options.options.libdir)
env.LIBDIR = Options.options.libdir
else:
env.LIBDIR = Utils.subst_vars('${PREFIX}/lib%s' % Utils.lib64(), env)
@ -258,7 +258,7 @@ class ConfigurationContext(Context.Context):
self.fatal('Could not load the Waf tool %r from %r\n%s' % (tool, sys.path, e))
except Exception as e:
self.to_log('imp %r (%r & %r)' % (tool, tooldir, funs))
self.to_log(Utils.ex_stack())
self.to_log(traceback.format_exc())
raise
if funs is not None:
@ -428,6 +428,7 @@ def find_program(self, filename, **kw):
:type msg: string
:param interpreter: interpreter for the program
:type interpreter: ConfigSet variable key
:raises: :py:class:`waflib.Errors.ConfigurationError`
"""
exts = kw.get('exts', Utils.is_win32 and '.exe,.com,.bat,.cmd' or ',.sh,.pl,.py')
@ -586,7 +587,7 @@ def run_build(self, *k, **kw):
try:
bld.compile()
except Errors.WafError:
ret = 'Test does not build: %s' % Utils.ex_stack()
ret = 'Test does not build: %s' % traceback.format_exc()
self.fatal(ret)
else:
ret = getattr(bld, 'retval', 0)

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2010-2016 (ita)
# Thomas Nagy, 2010-2017 (ita)
"""
Classes and functions enabling the command system
@ -11,16 +11,16 @@ from waflib import Utils, Errors, Logs
import waflib.Node
# the following 3 constants are updated on each new release (do not touch)
HEXVERSION=0x1090d00
HEXVERSION=0x2000000
"""Constant updated on new releases"""
WAFVERSION="1.9.13"
WAFVERSION="2.0.0"
"""Constant updated on new releases"""
WAFREVISION="66f1d5bc9472557082b82aced1cbf6a9d6cd1a27"
WAFREVISION="3cee7b36f04b13bcd255ea314ad7797246c8725b"
"""Git revision when the waf version is updated"""
ABI = 99
ABI = 20
"""Version of the build data cache file format (used in :py:const:`waflib.Context.DBFILE`)"""
DBFILE = '.wafpickle-%s-%d-%d' % (sys.platform, sys.hexversion, ABI)
@ -81,7 +81,6 @@ def create_context(cmd_name, *k, **kw):
:return: Context object
:rtype: :py:class:`waflib.Context.Context`
"""
global classes
for x in classes:
if x.cmd == cmd_name:
return x(*k, **kw)
@ -110,7 +109,6 @@ class store_context(type):
if not getattr(cls, 'fun', None):
cls.fun = cls.cmd
global classes
classes.insert(0, cls)
ctx = store_context('ctx', (object,), {})
@ -150,7 +148,6 @@ class Context(ctx):
try:
rd = kw['run_dir']
except KeyError:
global run_dir
rd = run_dir
# binds the context to the nodes in use to avoid a context singleton
@ -201,7 +198,6 @@ class Context(ctx):
Here, it calls the function name in the top-level wscript file. Most subclasses
redefine this method to provide additional functionality.
"""
global g_module
self.recurse([os.path.dirname(g_module.root_path)])
def pre_recurse(self, node):
@ -296,6 +292,15 @@ class Context(ctx):
raise Errors.WafError('Cannot read the folder %r' % d)
raise Errors.WafError('No wscript file in directory %s' % d)
def log_command(self, cmd, kw):
if Logs.verbose:
fmt = os.environ.get('WAF_CMD_FORMAT')
if fmt == 'string':
if not isinstance(cmd, str):
cmd = Utils.shell_escape(cmd)
Logs.debug('runner: %r', cmd)
Logs.debug('runner_env: kw=%s', kw)
def exec_command(self, cmd, **kw):
"""
Runs an external process and returns the exit status::
@ -314,11 +319,12 @@ class Context(ctx):
:type kw: dict
:returns: process exit status
:rtype: integer
:raises: :py:class:`waflib.Errors.WafError` if an invalid executable is specified for a non-shell process
:raises: :py:class:`waflib.Errors.WafError` in case of execution failure
"""
subprocess = Utils.subprocess
kw['shell'] = isinstance(cmd, str)
Logs.debug('runner: %r', cmd)
Logs.debug('runner_env: kw=%s', kw)
self.log_command(cmd, kw)
if self.logger:
self.logger.info(cmd)
@ -355,14 +361,14 @@ class Context(ctx):
if out:
if not isinstance(out, str):
out = out.decode(sys.stdout.encoding or 'iso8859-1', errors='replace')
out = out.decode(sys.stdout.encoding or 'latin-1', errors='replace')
if self.logger:
self.logger.debug('out: %s', out)
else:
Logs.info(out, extra={'stream':sys.stdout, 'c1': ''})
if err:
if not isinstance(err, str):
err = err.decode(sys.stdout.encoding or 'iso8859-1', errors='replace')
err = err.decode(sys.stdout.encoding or 'latin-1', errors='replace')
if self.logger:
self.logger.error('err: %s' % err)
else:
@ -374,7 +380,7 @@ class Context(ctx):
"""
Executes a process and returns stdout/stderr if the execution is successful.
An exception is thrown when the exit status is non-0. In that case, both stderr and stdout
will be bound to the WafError object::
will be bound to the WafError object (configuration tests)::
def configure(conf):
out = conf.cmd_and_log(['echo', 'hello'], output=waflib.Context.STDOUT, quiet=waflib.Context.BOTH)
@ -382,7 +388,7 @@ class Context(ctx):
(out, err) = conf.cmd_and_log(cmd, input='\\n'.encode(), output=waflib.Context.STDOUT)
try:
conf.cmd_and_log(['which', 'someapp'], output=waflib.Context.BOTH)
except Exception as e:
except Errors.WafError as e:
print(e.stdout, e.stderr)
:param cmd: args for subprocess.Popen
@ -396,7 +402,7 @@ class Context(ctx):
"""
subprocess = Utils.subprocess
kw['shell'] = isinstance(cmd, str)
Logs.debug('runner: %r', cmd)
self.log_command(cmd, kw)
if 'quiet' in kw:
quiet = kw['quiet']
@ -440,9 +446,9 @@ class Context(ctx):
raise Errors.WafError('Execution failure: %s' % str(e), ex=e)
if not isinstance(out, str):
out = out.decode(sys.stdout.encoding or 'iso8859-1', errors='replace')
out = out.decode(sys.stdout.encoding or 'latin-1', errors='replace')
if not isinstance(err, str):
err = err.decode(sys.stdout.encoding or 'iso8859-1', errors='replace')
err = err.decode(sys.stdout.encoding or 'latin-1', errors='replace')
if out and quiet != STDOUT and quiet != BOTH:
self.to_log('out: %s' % out)
@ -583,9 +589,9 @@ class Context(ctx):
result = kw.get('result') or k[0]
defcolor = 'GREEN'
if result == True:
if result is True:
msg = 'ok'
elif result == False:
elif not result:
msg = 'not found'
defcolor = 'YELLOW'
else:
@ -614,7 +620,6 @@ class Context(ctx):
:param ban: list of exact file names to exclude
:type ban: list of string
"""
global waf_dir
if os.path.isdir(waf_dir):
lst = self.root.find_node(waf_dir).find_node('waflib/extras').ant_glob(var)
for x in lst:

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2010-2016 (ita)
# Thomas Nagy, 2010-2017 (ita)
"""
Exceptions used in the Waf code

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"""
logging, colors, terminal width and pretty-print
@ -139,7 +139,6 @@ class log_filter(logging.Filter):
:param rec: log entry
"""
global verbose
rec.zone = rec.module
if rec.levelno >= logging.INFO:
return True
@ -253,18 +252,15 @@ def debug(*k, **kw):
"""
Wraps logging.debug and discards messages if the verbosity level :py:attr:`waflib.Logs.verbose` ≤ 0
"""
global verbose
if verbose:
k = list(k)
k[0] = k[0].replace('\n', ' ')
global log
log.debug(*k, **kw)
def error(*k, **kw):
"""
Wrap logging.errors, adds the stack trace when the verbosity level :py:attr:`waflib.Logs.verbose` ≥ 2
"""
global log, verbose
log.error(*k, **kw)
if verbose > 2:
st = traceback.extract_stack()
@ -282,14 +278,12 @@ def warn(*k, **kw):
"""
Wraps logging.warn
"""
global log
log.warn(*k, **kw)
def info(*k, **kw):
"""
Wraps logging.info
"""
global log
log.info(*k, **kw)
def init_log():
@ -381,6 +375,5 @@ def pprint(col, msg, label='', sep='\n'):
:param sep: a string to append at the end (line separator)
:type sep: string
"""
global info
info('%s%s%s %s', colors(col), msg, colors.NORMAL, label, extra={'terminator':sep})

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"""
Node: filesystem structure
@ -122,7 +122,7 @@ class Node(object):
"""
raise Errors.WafError('nodes are not supposed to be copied')
def read(self, flags='r', encoding='ISO8859-1'):
def read(self, flags='r', encoding='latin-1'):
"""
Reads and returns the contents of the file represented by this node, see :py:func:`waflib.Utils.readf`::
@ -138,7 +138,7 @@ class Node(object):
"""
return Utils.readf(self.abspath(), flags, encoding)
def write(self, data, flags='w', encoding='ISO8859-1'):
def write(self, data, flags='w', encoding='latin-1'):
"""
Writes data to the file represented by this node, see :py:func:`waflib.Utils.writef`::
@ -341,6 +341,11 @@ class Node(object):
if isinstance(lst, str):
lst = [x for x in Utils.split_path(lst) if x and x != '.']
if lst and lst[0].startswith('\\\\') and not self.parent:
node = self.ctx.root.make_node(lst[0])
node.cache_isdir = True
return node.find_node(lst[1:])
cur = self
for x in lst:
if x == '..':
@ -565,9 +570,8 @@ class Node(object):
if isdir:
if dir:
yield node
else:
if src:
yield node
elif src:
yield node
if isdir:
node.cache_isdir = True
@ -611,26 +615,25 @@ class Node(object):
:param ignorecase: ignore case while matching (False by default)
:type ignorecase: bool
:returns: The corresponding Nodes
:rtype: list of :py:class:`waflib.Node.Node` instances
:type generator: bool
:returns: Whether to evaluate the Nodes lazily, alters the type of the returned value
:rtype: by default, list of :py:class:`waflib.Node.Node` instances
"""
src = kw.get('src', True)
dir = kw.get('dir', False)
dir = kw.get('dir')
excl = kw.get('excl', exclude_regs)
incl = k and k[0] or kw.get('incl', '**')
reflags = kw.get('ignorecase', 0) and re.I
reflags = kw.get('ignorecase', re.I)
def to_pat(s):
lst = Utils.to_list(s)
ret = []
for x in lst:
for x in Utils.to_list(s):
x = x.replace('\\', '/').replace('//', '/')
if x.endswith('/'):
x += '**'
lst2 = x.split('/')
accu = []
for k in lst2:
for k in x.split('/'):
if k == '**':
accu.append(k)
else:
@ -638,9 +641,11 @@ class Node(object):
k = '^%s$' % k
try:
#print "pattern", k
accu.append(re.compile(k, flags=reflags))
exp = re.compile(k, flags=reflags)
except Exception as e:
raise Errors.WafError('Invalid pattern: %s' % k, e)
else:
accu.append(exp)
ret.append(accu)
return ret
@ -667,16 +672,17 @@ class Node(object):
nacc = []
return [nacc, nrej]
ret = [x for x in self.ant_iter(accept=accept, pats=[to_pat(incl), to_pat(excl)], maxdepth=kw.get('maxdepth', 25), dir=dir, src=src, remove=kw.get('remove', True))]
if kw.get('flat', False):
return ' '.join([x.path_from(self) for x in ret])
it = self.ant_iter(accept=accept, pats=[to_pat(incl), to_pat(excl)], maxdepth=kw.get('maxdepth', 25), dir=dir, src=src, remove=kw.get('remove', True))
if kw.get('flat'):
# returns relative paths as a space-delimited string
# prefer Node objects whenever possible
return ' '.join(x.path_from(self) for x in it)
elif kw.get('generator'):
return it
return list(it)
return ret
# --------------------------------------------------------------------------------
# the following methods require the source/build folders (bld.srcnode/bld.bldnode)
# using a subclass is a possibility, but is that really necessary?
# --------------------------------------------------------------------------------
# ----------------------------------------------------------------------------
# the methods below require the source/build folders (bld.srcnode/bld.bldnode)
def is_src(self):
"""
@ -781,29 +787,19 @@ class Node(object):
def find_or_declare(self, lst):
"""
Use this method in the build phase to declare output files.
Use this method in the build phase to declare output files which
are meant to be written in the build directory.
If 'self' is in build directory, it first tries to return an existing node object.
If no Node is found, it tries to find one in the source directory.
If no Node is found, a new Node object is created in the build directory, and the
intermediate folders are added.
This method creates the Node object and its parent folder
as needed.
:param lst: relative path
:type lst: string or list of string
"""
if isinstance(lst, str):
lst = [x for x in Utils.split_path(lst) if x and x != '.']
node = self.get_bld().search_node(lst)
if node:
if not os.path.isfile(node.abspath()):
node.parent.mkdir()
return node
self = self.get_src()
node = self.find_node(lst)
if node:
return node
node = self.get_bld().make_node(lst)
if isinstance(lst, str) and os.path.isabs(lst):
node = self.ctx.root.make_node(lst)
else:
node = self.get_bld().make_node(lst)
node.parent.mkdir()
return node
@ -920,19 +916,6 @@ class Node(object):
raise
return ret
# --------------------------------------------
# TODO waf 2.0, remove the sig and cache_sig attributes
def get_sig(self):
return self.h_file()
def set_sig(self, val):
# clear the cache, so that past implementation should still work
try:
del self.get_bld_sig.__cache__[(self,)]
except (AttributeError, KeyError):
pass
sig = property(get_sig, set_sig)
cache_sig = property(get_sig, set_sig)
pickle_lock = Utils.threading.Lock()
"""Lock mandatory for thread-safe node serialization"""

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python
# encoding: utf-8
# Scott Newton, 2005 (scottn)
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
"""
Support for waf command-line options
@ -13,7 +13,7 @@ that reads the ``options`` wscript function.
import os, tempfile, optparse, sys, re
from waflib import Logs, Utils, Context, Errors
options = {}
options = optparse.Values()
"""
A global dictionary representing user-provided command-line options::
@ -42,11 +42,25 @@ class opt_parser(optparse.OptionParser):
"""
Command-line options parser.
"""
def __init__(self, ctx):
optparse.OptionParser.__init__(self, conflict_handler="resolve",
def __init__(self, ctx, allow_unknown=False):
optparse.OptionParser.__init__(self, conflict_handler='resolve', add_help_option=False,
version='waf %s (%s)' % (Context.WAFVERSION, Context.WAFREVISION))
self.formatter.width = Logs.get_term_cols()
self.ctx = ctx
self.allow_unknown = allow_unknown
def _process_args(self, largs, rargs, values):
"""
Custom _process_args to allow unknown options according to the allow_unknown status
"""
while rargs:
try:
optparse.OptionParser._process_args(self,largs,rargs,values)
except (optparse.BadOptionError, optparse.AmbiguousOptionError) as e:
if self.allow_unknown:
largs.append(e.opt_str)
else:
self.error(str(e))
def print_usage(self, file=None):
return self.print_help(file)
@ -118,6 +132,7 @@ class OptionsContext(Context.Context):
p('-v', '--verbose', dest='verbose', default=0, action='count', help='verbosity level -v -vv or -vvv [default: 0]')
p('--zones', dest='zones', default='', action='store', help='debugging zones (task_gen, deps, tasks, etc)')
p('--profile', dest='profile', default='', action='store_true', help=optparse.SUPPRESS_HELP)
p('-h', '--help', dest='whelp', default=0, action='store_true', help="show this help message and exit")
gr = self.add_option_group('Configuration options')
self.option_groups['configure options'] = gr
@ -243,31 +258,78 @@ class OptionsContext(Context.Context):
return group
return None
def parse_args(self, _args=None):
"""
Parses arguments from a list which is not necessarily the command-line.
def sanitize_path(self, path, cwd=None):
if not cwd:
cwd = Context.launch_dir
p = os.path.expanduser(path)
p = os.path.join(cwd, p)
p = os.path.normpath(p)
p = os.path.abspath(p)
return p
:param _args: arguments
:type _args: list of strings
def parse_cmd_args(self, _args=None, cwd=None, allow_unknown=False):
"""
global options, commands, envvars
Parses the arguments and returns (options, commands, envvars) without touching the module-level state
"""
self.parser.allow_unknown = allow_unknown
(options, leftover_args) = self.parser.parse_args(args=_args)
envvars = []
commands = []
for arg in leftover_args:
if '=' in arg:
envvars.append(arg)
else:
elif arg != 'options':
commands.append(arg)
if options.destdir:
options.destdir = Utils.sane_path(options.destdir)
for name in 'top out destdir prefix bindir libdir'.split():
# those paths are usually expanded from Context.launch_dir
if getattr(options, name, None):
path = self.sanitize_path(getattr(options, name), cwd)
setattr(options, name, path)
return options, commands, envvars
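# Plain-Python illustration (hypothetical argv) of the splitting above:
# leftover words containing '=' become environment-variable assignments,
# everything else becomes a command.
_leftover = ['configure', 'CC=clang', 'build']
_envvars = [x for x in _leftover if '=' in x]                          # ['CC=clang']
_commands = [x for x in _leftover if '=' not in x and x != 'options']  # ['configure', 'build']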
def init_module_vars(self, arg_options, arg_commands, arg_envvars):
options.__dict__.clear()
del commands[:]
del envvars[:]
options.__dict__.update(arg_options.__dict__)
commands.extend(arg_commands)
envvars.extend(arg_envvars)
for var in envvars:
(name, value) = var.split('=', 1)
os.environ[name.strip()] = value
def init_logs(self, options, commands, envvars):
if options.verbose >= 1:
self.load('errcheck')
colors = {'yes' : 2, 'auto' : 1, 'no' : 0}[options.colors]
Logs.enable_colors(colors)
if options.zones:
Logs.zones = options.zones.split(',')
if not Logs.verbose:
Logs.verbose = 1
elif Logs.verbose > 0:
Logs.zones = ['runner']
if Logs.verbose > 2:
Logs.zones = ['*']
def parse_args(self, _args=None):
"""
Parses arguments from a list which is not necessarily the command-line.
Initializes the module variables options, commands and envvars
If help is requested, prints it and exits the application
:param _args: arguments
:type _args: list of strings
"""
options, commands, envvars = self.parse_cmd_args()
self.init_logs(options, commands, envvars)
self.init_module_vars(options, commands, envvars)
def execute(self):
"""
See :py:func:`waflib.Context.Context.execute`

View File

@ -1,12 +1,12 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"""
Runner.py: Task scheduling and execution
"""
import random
import heapq, traceback
try:
from queue import Queue
except ImportError:
@ -18,6 +18,32 @@ GAP = 20
Wait for at least ``GAP * njobs`` before trying to enqueue more tasks to run
"""
class PriorityTasks(object):
def __init__(self):
self.lst = []
def __len__(self):
return len(self.lst)
def __iter__(self):
return iter(self.lst)
def clear(self):
self.lst = []
def append(self, task):
heapq.heappush(self.lst, task)
def appendleft(self, task):
heapq.heappush(self.lst, task)
def pop(self):
return heapq.heappop(self.lst)
def extend(self, lst):
if self.lst:
for x in lst:
self.append(x)
else:
if isinstance(lst, list):
self.lst = lst
heapq.heapify(lst)
else:
self.lst = lst.lst
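# Standalone sketch (not waf code) of the heap ordering used by PriorityTasks:
# heapq pops the "smallest" item first, and Task.__lt__ (defined in Task.py)
# reverses the comparison so that higher-priority tasks are popped first.
import heapq

class _Item(object):
	def __init__(self, prio):
		self.prio = prio
	def __lt__(self, other):
		return self.prio > other.prio  # reversed on purpose

_heap = []
for _p in (1, 5, 3):
	heapq.heappush(_heap, _Item(_p))
# popping yields priorities 5, 3, 1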
class Consumer(Utils.threading.Thread):
"""
Daemon thread object that executes a task. It shares a semaphore with
@ -38,7 +64,7 @@ class Consumer(Utils.threading.Thread):
"""
try:
if not self.spawner.master.stop:
self.task.process()
self.spawner.master.process_task(self.task)
finally:
self.spawner.sem.release()
self.spawner.master.out.put(self.task)
@ -49,7 +75,7 @@ class Spawner(Utils.threading.Thread):
"""
Daemon thread that consumes tasks from :py:class:`waflib.Runner.Parallel` producer and
spawns a consuming thread :py:class:`waflib.Runner.Consumer` for each
:py:class:`waflib.Task.TaskBase` instance.
:py:class:`waflib.Task.Task` instance.
"""
def __init__(self, master):
Utils.threading.Thread.__init__(self)
@ -102,22 +128,25 @@ class Parallel(object):
Instance of :py:class:`waflib.Build.BuildContext`
"""
self.outstanding = Utils.deque()
"""List of :py:class:`waflib.Task.TaskBase` that may be ready to be executed"""
self.outstanding = PriorityTasks()
"""Heap of :py:class:`waflib.Task.Task` that may be ready to be executed"""
self.frozen = Utils.deque()
"""List of :py:class:`waflib.Task.TaskBase` that are not ready yet"""
self.postponed = PriorityTasks()
"""Heap of :py:class:`waflib.Task.Task` which are not ready to run for non-DAG reasons"""
self.incomplete = set()
"""List of :py:class:`waflib.Task.Task` waiting for dependent tasks to complete (DAG)"""
self.ready = Queue(0)
"""List of :py:class:`waflib.Task.TaskBase` ready to be executed by consumers"""
"""List of :py:class:`waflib.Task.Task` ready to be executed by consumers"""
self.out = Queue(0)
"""List of :py:class:`waflib.Task.TaskBase` returned by the task consumers"""
"""List of :py:class:`waflib.Task.Task` returned by the task consumers"""
self.count = 0
"""Amount of tasks that may be processed by :py:class:`waflib.Runner.TaskConsumer`"""
self.processed = 1
self.processed = 0
"""Amount of tasks processed"""
self.stop = False
@ -134,6 +163,11 @@ class Parallel(object):
Flag that indicates that the build cache must be saved when a task was executed
(calls :py:meth:`waflib.Build.BuildContext.store`)"""
self.revdeps = Utils.defaultdict(set)
"""
The reverse dependency graph of dependencies obtained from Task.run_after
"""
self.spawner = Spawner(self)
"""
Coordinating daemon thread that spawns thread consumers
@ -143,28 +177,26 @@ class Parallel(object):
"""
Obtains the next Task instance to run
:rtype: :py:class:`waflib.Task.TaskBase`
:rtype: :py:class:`waflib.Task.Task`
"""
if not self.outstanding:
return None
return self.outstanding.popleft()
return self.outstanding.pop()
def postpone(self, tsk):
"""
Adds the task to the list :py:attr:`waflib.Runner.Parallel.frozen`.
Adds the task to the list :py:attr:`waflib.Runner.Parallel.postponed`.
The order is scrambled so as to consume as many tasks in parallel as possible.
:param tsk: task instance
:type tsk: :py:class:`waflib.Task.TaskBase`
:type tsk: :py:class:`waflib.Task.Task`
"""
if random.randint(0, 1):
self.frozen.appendleft(tsk)
else:
self.frozen.append(tsk)
self.postponed.append(tsk)
def refill_task_list(self):
"""
Adds the next group of tasks to execute in :py:attr:`waflib.Runner.Parallel.outstanding`.
Pulls the next group of tasks to execute into :py:attr:`waflib.Runner.Parallel.outstanding`.
Ensures that all tasks in the current build group are complete before processing the next one.
"""
while self.count > self.numjobs * GAP:
self.get_out()
@ -172,7 +204,9 @@ class Parallel(object):
while not self.outstanding:
if self.count:
self.get_out()
elif self.frozen:
if self.outstanding:
break
elif self.postponed:
try:
cond = self.deadlock == self.processed
except AttributeError:
@ -180,46 +214,97 @@ class Parallel(object):
else:
if cond:
msg = 'check the build order for the tasks'
for tsk in self.frozen:
for tsk in self.postponed:
if not tsk.run_after:
msg = 'check the methods runnable_status'
break
lst = []
for tsk in self.frozen:
for tsk in self.postponed:
lst.append('%s\t-> %r' % (repr(tsk), [id(x) for x in tsk.run_after]))
raise Errors.WafError('Deadlock detected: %s%s' % (msg, ''.join(lst)))
self.deadlock = self.processed
if self.frozen:
self.outstanding.extend(self.frozen)
self.frozen.clear()
if self.postponed:
self.outstanding.extend(self.postponed)
self.postponed.clear()
elif not self.count:
self.outstanding.extend(next(self.biter))
self.total = self.bld.total()
break
if self.incomplete:
for x in self.incomplete:
for k in x.run_after:
if not k.hasrun:
break
else:
# dependency added after the build started without updating revdeps
self.incomplete.remove(x)
self.outstanding.append(x)
break
else:
raise Errors.WafError('Broken revdeps detected on %r' % self.incomplete)
else:
tasks = next(self.biter)
ready, waiting = self.prio_and_split(tasks)
self.outstanding.extend(ready)
self.incomplete.update(waiting)
self.total = self.bld.total()
break
def add_more_tasks(self, tsk):
"""
If a task provides :py:attr:`waflib.Task.TaskBase.more_tasks`, then the tasks contained
If a task provides :py:attr:`waflib.Task.Task.more_tasks`, then the tasks contained
in that list are added to the current build and will be processed before the next build group.
The priorities for dependent tasks are not re-calculated globally
:param tsk: task instance
:type tsk: :py:attr:`waflib.Task.TaskBase`
:type tsk: :py:attr:`waflib.Task.Task`
"""
if getattr(tsk, 'more_tasks', None):
self.outstanding.extend(tsk.more_tasks)
# TODO recompute priorities globally?
ready, waiting = self.prio_and_split(tsk.more_tasks)
self.outstanding.extend(ready)
self.incomplete.update(waiting)
self.total += len(tsk.more_tasks)
def mark_finished(self, tsk):
def try_unfreeze(x):
# DAG ancestors are likely to be in the incomplete set
if x in self.incomplete:
# TODO remove dependencies to free some memory?
# x.run_after.remove(tsk)
for k in x.run_after:
if not k.hasrun:
break
else:
self.incomplete.remove(x)
self.outstanding.append(x)
if tsk in self.revdeps:
for x in self.revdeps[tsk]:
if isinstance(x, Task.TaskGroup):
x.prev.remove(tsk)
if not x.prev:
for k in x.next:
# TODO necessary optimization?
k.run_after.remove(x)
try_unfreeze(k)
# TODO necessary optimization?
x.next = []
else:
try_unfreeze(x)
del self.revdeps[tsk]
def get_out(self):
"""
Waits for a Task that task consumers add to :py:attr:`waflib.Runner.Parallel.out` after execution.
Adds more Tasks if necessary through :py:attr:`waflib.Runner.Parallel.add_more_tasks`.
:rtype: :py:attr:`waflib.Task.TaskBase`
:rtype: :py:attr:`waflib.Task.Task`
"""
tsk = self.out.get()
if not self.stop:
self.add_more_tasks(tsk)
self.mark_finished(tsk)
self.count -= 1
self.dirty = True
return tsk
@ -229,32 +314,42 @@ class Parallel(object):
Enqueue a Task to :py:attr:`waflib.Runner.Parallel.ready` so that consumers can run them.
:param tsk: task instance
:type tsk: :py:attr:`waflib.Task.TaskBase`
:type tsk: :py:attr:`waflib.Task.Task`
"""
self.ready.put(tsk)
def process_task(self, tsk):
"""
Processes a task and attempts to stop the build in case of errors
"""
tsk.process()
if tsk.hasrun != Task.SUCCESS:
self.error_handler(tsk)
def skip(self, tsk):
"""
Mark a task as skipped/up-to-date
"""
tsk.hasrun = Task.SKIPPED
self.mark_finished(tsk)
def cancel(self, tsk):
"""
Mark a task as failed because of unsatisfiable dependencies
"""
tsk.hasrun = Task.CANCELED
self.mark_finished(tsk)
def error_handler(self, tsk):
"""
Called when a task cannot be executed. The flag :py:attr:`waflib.Runner.Parallel.stop` is set, unless
the build is executed with::
Called when a task cannot be executed. The flag :py:attr:`waflib.Runner.Parallel.stop` is set,
unless the build is executed with::
$ waf build -k
:param tsk: task instance
:type tsk: :py:attr:`waflib.Task.TaskBase`
:type tsk: :py:attr:`waflib.Task.Task`
"""
if hasattr(tsk, 'scan') and hasattr(tsk, 'uid'):
# TODO waf 2.0 - this breaks encapsulation
try:
del self.bld.imp_sigs[tsk.uid()]
except KeyError:
pass
if not self.bld.keep:
self.stop = True
self.error.append(tsk)
@ -270,11 +365,11 @@ class Parallel(object):
return tsk.runnable_status()
except Exception:
self.processed += 1
tsk.err_msg = Utils.ex_stack()
tsk.err_msg = traceback.format_exc()
if not self.stop and self.bld.keep:
self.skip(tsk)
if self.bld.keep == 1:
# if -k stop at the first exception, if -kk try to go as far as possible
# if -k stop on the first exception, if -kk try to go as far as possible
if Logs.verbose > 1 or not self.error:
self.error.append(tsk)
self.stop = True
@ -282,9 +377,10 @@ class Parallel(object):
if Logs.verbose > 1:
self.error.append(tsk)
return Task.EXCEPTION
tsk.hasrun = Task.EXCEPTION
tsk.hasrun = Task.EXCEPTION
self.error_handler(tsk)
return Task.EXCEPTION
def start(self):
@ -316,10 +412,9 @@ class Parallel(object):
self.processed += 1
continue
if self.stop: # stop immediately after a failure was detected
if self.stop: # stop immediately after a failure is detected
break
st = self.task_status(tsk)
if st == Task.RUN_ME:
self.count += 1
@ -328,17 +423,24 @@ class Parallel(object):
if self.numjobs == 1:
tsk.log_display(tsk.generator.bld)
try:
tsk.process()
self.process_task(tsk)
finally:
self.out.put(tsk)
else:
self.add_task(tsk)
if st == Task.ASK_LATER:
elif st == Task.ASK_LATER:
self.postpone(tsk)
elif st == Task.SKIP_ME:
self.processed += 1
self.skip(tsk)
self.add_more_tasks(tsk)
elif st == Task.CANCEL_ME:
# A dependency problem has occurred, and the
# build is most likely run with `waf -k`
if Logs.verbose > 1:
self.error.append(tsk)
self.processed += 1
self.cancel(tsk)
# self.count represents the tasks that have been made available to the consumer threads
# collect all the tasks after an error else the message may be incomplete
@ -346,5 +448,110 @@ class Parallel(object):
self.get_out()
self.ready.put(None)
assert (self.count == 0 or self.stop)
if not self.stop:
assert not self.count
assert not self.postponed
assert not self.incomplete
def prio_and_split(self, tasks):
"""
Label input tasks with priority values, and return a pair containing
the tasks that are ready to run and the tasks that are necessarily
waiting for other tasks to complete.
The priority system is really meant as an optional layer for optimization:
dependency cycles are found quickly, and builds should be more efficient.
A high priority number means that a task is processed first.
This method can be overridden to disable the priority system::
def prio_and_split(self, tasks):
return tasks, []
:return: A pair of task lists
:rtype: tuple
"""
# to disable:
#return tasks, []
for x in tasks:
x.visited = 0
reverse = self.revdeps
for x in tasks:
for k in x.run_after:
if isinstance(k, Task.TaskGroup):
if k.done:
pass
else:
k.done = True
for j in k.prev:
reverse[j].add(k)
else:
reverse[k].add(x)
# the priority number is not the tree depth
def visit(n):
if isinstance(n, Task.TaskGroup):
return sum(visit(k) for k in n.next)
if n.visited == 0:
n.visited = 1
if n in reverse:
rev = reverse[n]
n.prio_order = n.tree_weight + len(rev) + sum(visit(k) for k in rev)
else:
n.prio_order = n.tree_weight
n.visited = 2
elif n.visited == 1:
raise Errors.WafError('Dependency cycle found!')
return n.prio_order
for x in tasks:
if x.visited != 0:
# must visit all to detect cycles
continue
try:
visit(x)
except Errors.WafError:
self.debug_cycles(tasks, reverse)
ready = []
waiting = []
for x in tasks:
for k in x.run_after:
if not k.hasrun:
waiting.append(x)
break
else:
ready.append(x)
return (ready, waiting)
def debug_cycles(self, tasks, reverse):
tmp = {}
for x in tasks:
tmp[x] = 0
def visit(n, acc):
if isinstance(n, Task.TaskGroup):
for k in n.next:
visit(k, acc)
return
if tmp[n] == 0:
tmp[n] = 1
for k in reverse.get(n, []):
visit(k, [n] + acc)
tmp[n] = 2
elif tmp[n] == 1:
lst = []
for tsk in acc:
lst.append(repr(tsk))
if tsk is n:
# exclude prior nodes, we want the minimum cycle
break
raise Errors.WafError('Task dependency cycle in "run_after" constraints: %s' % ''.join(lst))
for x in tasks:
visit(x, [])

View File

@ -1,9 +1,11 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"Module called for configuring, compiling and installing targets"
from __future__ import with_statement
import os, shlex, shutil, traceback, errno, sys, stat
from waflib import Utils, Configure, Logs, Options, ConfigSet, Context, Errors, Build, Node
@ -24,59 +26,49 @@ def waf_entry_point(current_directory, version, wafdir):
:param wafdir: absolute path representing the directory of the waf library
:type wafdir: string
"""
Logs.init_log()
if Context.WAFVERSION != version:
Logs.error('Waf script %r and library %r do not match (directory %r)', version, Context.WAFVERSION, wafdir)
sys.exit(1)
if '--version' in sys.argv:
Context.run_dir = current_directory
ctx = Context.create_context('options')
ctx.curdir = current_directory
ctx.parse_args()
sys.exit(0)
# Store current directory before any chdir
Context.waf_dir = wafdir
Context.launch_dir = current_directory
Context.run_dir = Context.launch_dir = current_directory
start_dir = current_directory
no_climb = os.environ.get('NOCLIMB')
if len(sys.argv) > 1:
# os.path.join handles absolute paths in sys.argv[1] accordingly (it discards the previous ones)
# os.path.join handles absolute paths
# if sys.argv[1] is not an absolute path, then it is relative to the current working directory
potential_wscript = os.path.join(current_directory, sys.argv[1])
# maybe check if the file is executable
# perhaps extract 'wscript' as a constant
if os.path.basename(potential_wscript) == 'wscript' and os.path.isfile(potential_wscript):
if os.path.basename(potential_wscript) == Context.WSCRIPT_FILE and os.path.isfile(potential_wscript):
# need to explicitly normalize the path, as it may contain extra '/.'
# TODO abspath?
current_directory = os.path.normpath(os.path.dirname(potential_wscript))
path = os.path.normpath(os.path.dirname(potential_wscript))
start_dir = os.path.abspath(path)
no_climb = True
sys.argv.pop(1)
ctx = Context.create_context('options')
(options, commands, env) = ctx.parse_cmd_args(allow_unknown=True)
if options.top:
start_dir = Context.run_dir = Context.top_dir = options.top
no_climb = True
if options.out:
Context.out_dir = options.out
# if 'configure' is in the commands, do not search any further
no_climb = os.environ.get('NOCLIMB')
if not no_climb:
for k in no_climb_commands:
for y in sys.argv:
for y in commands:
if y.startswith(k):
no_climb = True
break
# if --top is provided assume the build started in the top directory
for i, x in enumerate(sys.argv):
# WARNING: this modifies sys.argv
if x.startswith('--top='):
Context.run_dir = Context.top_dir = Utils.sane_path(x[6:])
sys.argv[i] = '--top=' + Context.run_dir
if x.startswith('--out='):
Context.out_dir = Utils.sane_path(x[6:])
sys.argv[i] = '--out=' + Context.out_dir
# try to find a lock file (if the project was configured)
# at the same time, store the first wscript file seen
cur = current_directory
while cur and not Context.top_dir:
cur = start_dir
while cur:
try:
lst = os.listdir(cur)
except OSError:
@ -131,14 +123,11 @@ def waf_entry_point(current_directory, version, wafdir):
break
if not Context.run_dir:
if '-h' in sys.argv or '--help' in sys.argv:
Logs.warn('No wscript file found: the help message may be incomplete')
Context.run_dir = current_directory
ctx = Context.create_context('options')
ctx.curdir = current_directory
ctx.parse_args()
if options.whelp:
Logs.warn('These are the generic options (no wscript/project found)')
ctx.parser.print_help()
sys.exit(0)
Logs.error('Waf: Run from a directory containing a file named %r', Context.WSCRIPT_FILE)
Logs.error('Waf: Run from a folder containing a %r file (or try -h for the generic options)', Context.WSCRIPT_FILE)
sys.exit(1)
try:
@ -158,7 +147,7 @@ def waf_entry_point(current_directory, version, wafdir):
traceback.print_exc(file=sys.stdout)
sys.exit(2)
if '--profile' in sys.argv:
if options.profile:
import cProfile, pstats
cProfile.runctx('from waflib import Scripting; Scripting.run_commands()', {}, {}, 'profi.txt')
p = pstats.Stats('profi.txt')
@ -214,29 +203,13 @@ def parse_options():
Parses the command-line options and initializes the logging system.
Called by :py:func:`waflib.Scripting.waf_entry_point` during the initialization.
"""
Context.create_context('options').execute()
for var in Options.envvars:
(name, value) = var.split('=', 1)
os.environ[name.strip()] = value
ctx = Context.create_context('options')
ctx.execute()
if not Options.commands:
Options.commands = [default_cmd]
Options.commands = [x for x in Options.commands if x != 'options'] # issue 1076
# process some internal Waf options
Logs.verbose = Options.options.verbose
#Logs.init_log()
if Options.options.zones:
Logs.zones = Options.options.zones.split(',')
if not Logs.verbose:
Logs.verbose = 1
elif Logs.verbose > 0:
Logs.zones = ['runner']
if Logs.verbose > 2:
Logs.zones = ['*']
Options.commands.append(default_cmd)
if Options.options.whelp:
ctx.parser.print_help()
sys.exit(0)
def run_command(cmd_name):
"""
@ -302,7 +275,7 @@ def distclean_dir(dirname):
pass
def distclean(ctx):
'''removes the build directory'''
'''removes build folders and data'''
def remove_and_log(k, fun):
try:
@ -314,7 +287,7 @@ def distclean(ctx):
# remove waf cache folders on the top-level
if not Options.commands:
for k in os.listdir('.'):
for x in '.waf-1. waf-1. .waf3-1. waf3-1.'.split():
for x in '.waf-2 waf-2 .waf3-2 waf3-2'.split():
if k.startswith(x):
remove_and_log(k, shutil.rmtree)
@ -350,7 +323,6 @@ def distclean(ctx):
p = os.path.join(k, Options.lockfile)
remove_and_log(p, os.remove)
class Dist(Context.Context):
'''creates an archive containing the project source code'''
cmd = 'dist'
@ -404,11 +376,11 @@ class Dist(Context.Context):
self.fatal('Valid algo types are tar.bz2, tar.gz, tar.xz or zip')
try:
from hashlib import sha1
from hashlib import sha256
except ImportError:
digest = ''
else:
digest = ' (sha=%r)' % sha1(node.read(flags='rb')).hexdigest()
digest = ' (sha256=%r)' % sha256(node.read(flags='rb')).hexdigest()
Logs.info('New archive created: %s%s', self.arch_name, digest)
@ -437,11 +409,8 @@ class Dist(Context.Context):
tinfo.gname = 'root'
if os.path.isfile(p):
fu = open(p, 'rb')
try:
tar.addfile(tinfo, fileobj=fu)
finally:
fu.close()
with open(p, 'rb') as f:
tar.addfile(tinfo, fileobj=f)
else:
tar.addfile(tinfo)
@ -503,7 +472,7 @@ class Dist(Context.Context):
try:
return self.excl
except AttributeError:
self.excl = Node.exclude_regs + ' **/waf-1.* **/.waf-1.* **/waf3-1.* **/.waf3-1.* **/*~ **/*.rej **/*.orig **/*.pyc **/*.pyo **/*.bak **/*.swp **/.lock-w*'
self.excl = Node.exclude_regs + ' **/waf-2.* **/.waf-2.* **/waf3-2.* **/.waf3-2.* **/*~ **/*.rej **/*.orig **/*.pyc **/*.pyo **/*.bak **/*.swp **/.lock-w*'
if Context.out_dir:
nd = self.root.find_node(Context.out_dir)
if nd:
@ -536,11 +505,7 @@ def dist(ctx):
pass
class DistCheck(Dist):
"""
Creates an archive of the project, then attempts to build the project in a temporary directory::
$ waf distcheck
"""
"""creates an archive with dist, then tries to build it"""
fun = 'distcheck'
cmd = 'distcheck'
@ -567,12 +532,9 @@ class DistCheck(Dist):
"""
import tempfile, tarfile
try:
t = tarfile.open(self.get_arch_name())
with tarfile.open(self.get_arch_name()) as t:
for x in t:
t.extract(x)
finally:
t.close()
instdir = tempfile.mkdtemp('.inst', self.get_base_name())
cmd = self.make_distcheck_cmd(instdir)

View File

@ -1,12 +1,12 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"""
Tasks represent atomic operations such as processes.
"""
import os, re, sys, tempfile
import os, re, sys, tempfile, traceback
from waflib import Utils, Logs, Errors
# task states
@ -22,6 +22,9 @@ CRASHED = 2
EXCEPTION = 3
"""An exception occurred in the task execution"""
CANCELED = 4
"""A dependency for the task is missing so it was cancelled"""
SKIPPED = 8
"""The task did not have to be executed"""
@ -37,6 +40,9 @@ SKIP_ME = -2
RUN_ME = -3
"""The task must be executed"""
CANCEL_ME = -4
"""The task cannot be executed because of a dependency problem"""
COMPILE_TEMPLATE_SHELL = '''
def f(tsk):
env = tsk.env
@ -86,8 +92,7 @@ class store_task_type(type):
super(store_task_type, cls).__init__(name, bases, dict)
name = cls.__name__
if name != 'evil' and name != 'TaskBase':
global classes
if name != 'evil' and name != 'Task':
if getattr(cls, 'run_str', None):
# if a string is provided, convert it to a method
(f, dvars) = compile_fun(cls.run_str, cls.shell)
@ -108,20 +113,21 @@ class store_task_type(type):
evil = store_task_type('evil', (object,), {})
"Base class provided to avoid writing a metaclass, so the code can run in python 2.6 and 3.x unmodified"
class TaskBase(evil):
class Task(evil):
"""
Base class for all Waf tasks, which should be seen as an interface.
For illustration purposes, instances of this class will execute the attribute
'fun' in :py:meth:`waflib.Task.TaskBase.run`. When in doubt, create
subclasses of :py:class:`waflib.Task.Task` instead.
Subclasses must override these methods:
#. __str__: string to display to the user
#. runnable_status: ask the task if it should be run, skipped, or if we have to ask later
#. run: what to do to execute the task
#. post_run: what to do after the task has been executed
This class deals with the filesystem (:py:class:`waflib.Node.Node`). The method :py:class:`waflib.Task.Task.runnable_status`
uses a hash value (from :py:class:`waflib.Task.Task.signature`) which is persistent from build to build. When the value changes,
the task has to be executed. The method :py:class:`waflib.Task.Task.post_run` will assign the task signature to the output
nodes (if present).
"""
vars = []
"""ConfigSet variables that should trigger a rebuild (class attribute used for :py:meth:`waflib.Task.Task.sig_vars`)"""
always_run = False
"""Specify whether task instances must always be executed or not (class attribute)"""
shell = False
"""Execute the command with the shell (class attribute)"""
color = 'GREEN'
"""Color for the console display, see :py:const:`waflib.Logs.colors_lst`"""
@ -138,7 +144,7 @@ class TaskBase(evil):
after = []
"""List of task class names to execute after instances of this class"""
hcode = ''
hcode = Utils.SIG_NIL
"""String representing an additional hash for the class representation"""
keep_last_cmd = False
@ -146,32 +152,51 @@ class TaskBase(evil):
This may be useful for certain extensions but it can use a lot of memory.
"""
__slots__ = ('hasrun', 'generator')
weight = 0
"""Optional weight to tune the priority for task instances.
The higher, the earlier. The weight only applies to single task objects."""
tree_weight = 0
"""Optional weight to tune the priority of task instances and whole subtrees.
The higher, the earlier."""
prio_order = 0
"""Priority order set by the scheduler on instances during the build phase.
You most likely do not need to set it.
"""
__slots__ = ('hasrun', 'generator', 'env', 'inputs', 'outputs', 'dep_nodes', 'run_after')
def __init__(self, *k, **kw):
"""
The base task class requires a task generator (set to *self* if missing)
"""
self.hasrun = NOT_RUN
try:
self.generator = kw['generator']
except KeyError:
self.generator = self
def __repr__(self):
return '\n\t{task %r: %s %s}' % (self.__class__.__name__, id(self), str(getattr(self, 'fun', '')))
self.env = kw['env']
""":py:class:`waflib.ConfigSet.ConfigSet` object (make sure to provide one)"""
def __str__(self):
"String to display to the user"
if hasattr(self, 'fun'):
return self.fun.__name__
return self.__class__.__name__
self.inputs = []
"""List of input nodes, which represent the files used by the task instance"""
def keyword(self):
"Display keyword used to prettify the console outputs"
if hasattr(self, 'fun'):
return 'Function'
return 'Processing'
self.outputs = []
"""List of output nodes, which represent the files created by the task instance"""
self.dep_nodes = []
"""List of additional nodes to depend on"""
self.run_after = set()
"""Set of tasks that must be executed before this one"""
def __lt__(self, other):
return self.priority() > other.priority()
def __le__(self, other):
return self.priority() >= other.priority()
def __gt__(self, other):
return self.priority() < other.priority()
def __ge__(self, other):
return self.priority() <= other.priority()
def get_cwd(self):
"""
@ -205,6 +230,15 @@ class TaskBase(evil):
x = '"%s"' % x
return x
def priority(self):
"""
Priority of execution; the higher, the earlier
:return: the priority value
:rtype: a tuple of numeric values
"""
return (self.weight + self.prio_order, - getattr(self.generator, 'tg_idx_count', 0))
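# Illustration only (plain tuples, made-up values): the priority tuple sorts
# primarily on weight + prio_order, then prefers tasks from earlier task
# generators (tg_idx_count is negated).
_prios = [(7, -1), (10, -3), (10, -1)]
_prios.sort(reverse=True)  # [(10, -1), (10, -3), (7, -1)] -> highest priority first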
def split_argfile(self, cmd):
"""
Splits a list of process commands into the executable part and its list of arguments
@ -256,36 +290,16 @@ class TaskBase(evil):
else:
return self.generator.bld.exec_command(cmd, **kw)
def runnable_status(self):
"""
Returns the Task status
:return: a task state in :py:const:`waflib.Task.RUN_ME`, :py:const:`waflib.Task.SKIP_ME` or :py:const:`waflib.Task.ASK_LATER`.
:rtype: int
"""
return RUN_ME
def uid(self):
"""
Computes a unique identifier for the task
:rtype: string or bytes
"""
return Utils.SIG_NIL
def process(self):
"""
Assume that the task has had a ``master`` which is an instance of :py:class:`waflib.Runner.Parallel`.
Execute the task and then put it back in the queue :py:attr:`waflib.Runner.Parallel.out` (may be replaced by subclassing).
Runs the task and handles errors
:return: 0 or None if everything is fine
:rtype: integer
"""
# remove the task signature immediately before it is executed
# in case of failure the task will be executed again
m = self.generator.bld.producer
# so that the task will be executed again in case of failure
try:
# TODO another place for this?
del self.generator.bld.task_sigs[self.uid()]
except KeyError:
pass
@ -293,44 +307,29 @@ class TaskBase(evil):
try:
ret = self.run()
except Exception:
self.err_msg = Utils.ex_stack()
self.err_msg = traceback.format_exc()
self.hasrun = EXCEPTION
# TODO cleanup
m.error_handler(self)
return
if ret:
self.err_code = ret
self.hasrun = CRASHED
else:
try:
self.post_run()
except Errors.WafError:
pass
except Exception:
self.err_msg = Utils.ex_stack()
self.hasrun = EXCEPTION
if ret:
self.err_code = ret
self.hasrun = CRASHED
else:
self.hasrun = SUCCESS
if self.hasrun != SUCCESS:
m.error_handler(self)
try:
self.post_run()
except Errors.WafError:
pass
except Exception:
self.err_msg = traceback.format_exc()
self.hasrun = EXCEPTION
else:
self.hasrun = SUCCESS
def run(self):
"""
Called by threads to execute the tasks. The default is empty and meant to be overridden in subclasses.
.. warning:: It is a bad idea to create nodes in this method, so avoid :py:meth:`waflib.Node.Node.ant_glob`
:rtype: int
"""
if hasattr(self, 'fun'):
return self.fun(self)
return 0
def post_run(self):
"Update build data after successful Task execution. Override in subclasses."
pass
if self.hasrun != SUCCESS and self.scan:
# rescan dependencies on next run
try:
del self.generator.bld.imp_sigs[self.uid()]
except KeyError:
pass
def log_display(self, bld):
"Writes the execution status on the context logger"
@ -402,9 +401,7 @@ class TaskBase(evil):
:return: a hash value
:rtype: string
"""
cls = self.__class__
tup = (str(cls.before), str(cls.after), str(cls.ext_in), str(cls.ext_out), cls.__name__, cls.hcode)
return hash(tup)
return (tuple(self.before), tuple(self.after), tuple(self.ext_in), tuple(self.ext_out), self.__class__.__name__, self.hcode)
def format_error(self):
"""
@ -428,6 +425,8 @@ class TaskBase(evil):
return ' -> task in %r failed%s' % (name, msg)
elif self.hasrun == MISSING:
return ' -> missing files in %r%s' % (name, msg)
elif self.hasrun == CANCELED:
return ' -> %r canceled because of missing dependencies' % name
else:
return 'invalid status for task in %r: %r' % (name, self.hasrun)
@ -464,40 +463,6 @@ class TaskBase(evil):
lst.append(y)
return lst
class Task(TaskBase):
"""
This class deals with the filesystem (:py:class:`waflib.Node.Node`). The method :py:class:`waflib.Task.Task.runnable_status`
uses a hash value (from :py:class:`waflib.Task.Task.signature`) which is persistent from build to build. When the value changes,
the task has to be executed. The method :py:class:`waflib.Task.Task.post_run` will assign the task signature to the output
nodes (if present).
"""
vars = []
"""ConfigSet variables that should trigger a rebuild (class attribute used for :py:meth:`waflib.Task.Task.sig_vars`)"""
always_run = False
"""Specify whether task instances must always be executed or not (class attribute)"""
shell = False
"""Execute the command with the shell (class attribute)"""
def __init__(self, *k, **kw):
TaskBase.__init__(self, *k, **kw)
self.env = kw['env']
""":py:class:`waflib.ConfigSet.ConfigSet` object (make sure to provide one)"""
self.inputs = []
"""List of input nodes, which represent the files used by the task instance"""
self.outputs = []
"""List of output nodes, which represent the files created by the task instance"""
self.dep_nodes = []
"""List of additional nodes to depend on"""
self.run_after = set()
"""Set of tasks that must be executed before this one"""
def __str__(self):
"string to display to the user"
name = self.__class__.__name__
@ -520,9 +485,7 @@ class Task(TaskBase):
return '%s: %s%s%s' % (self.__class__.__name__, src_str, sep, tgt_str)
def keyword(self):
"""
See :py:meth:`waflib.Task.TaskBase`
"""
"Display keyword used to prettify the console outputs"
name = self.__class__.__name__
if name.endswith(('lib', 'program')):
return 'Linking'
@ -603,7 +566,7 @@ class Task(TaskBase):
:param task: task
:type task: :py:class:`waflib.Task.Task`
"""
assert isinstance(task, TaskBase)
assert isinstance(task, Task)
self.run_after.add(task)
def signature(self):
@ -652,13 +615,22 @@ class Task(TaskBase):
def runnable_status(self):
"""
See :py:meth:`waflib.Task.TaskBase.runnable_status`
Returns the Task status
:return: a task state in :py:const:`waflib.Task.RUN_ME`,
:py:const:`waflib.Task.SKIP_ME`, :py:const:`waflib.Task.CANCEL_ME` or :py:const:`waflib.Task.ASK_LATER`.
:rtype: int
"""
#return 0 # benchmarking
bld = self.generator.bld
if bld.is_install < 0:
return SKIP_ME
for t in self.run_after:
if not t.hasrun:
return ASK_LATER
elif t.hasrun < SKIPPED:
# a dependency has an error
return CANCEL_ME
# first compute the signature
try:
@ -667,7 +639,6 @@ class Task(TaskBase):
return ASK_LATER
# compare the signature to a signature computed previously
bld = self.generator.bld
key = self.uid()
try:
prev_sig = bld.task_sigs[key]
@ -872,10 +843,10 @@ if sys.hexversion > 0x3000000:
try:
return self.uid_
except AttributeError:
m = Utils.md5(self.__class__.__name__.encode('iso8859-1', 'xmlcharrefreplace'))
m = Utils.md5(self.__class__.__name__.encode('latin-1', 'xmlcharrefreplace'))
up = m.update
for x in self.inputs + self.outputs:
up(x.abspath().encode('iso8859-1', 'xmlcharrefreplace'))
up(x.abspath().encode('latin-1', 'xmlcharrefreplace'))
self.uid_ = m.digest()
return self.uid_
uid.__doc__ = Task.uid.__doc__
@ -892,9 +863,9 @@ def is_before(t1, t2):
waflib.Task.is_before(t1, t2) # True
:param t1: Task object
:type t1: :py:class:`waflib.Task.TaskBase`
:type t1: :py:class:`waflib.Task.Task`
:param t2: Task object
:type t2: :py:class:`waflib.Task.TaskBase`
:type t2: :py:class:`waflib.Task.Task`
"""
to_list = Utils.to_list
for k in to_list(t2.ext_in):
@ -914,27 +885,50 @@ def set_file_constraints(tasks):
Updates the ``run_after`` attribute of all tasks based on the task inputs and outputs
:param tasks: tasks
:type tasks: list of :py:class:`waflib.Task.TaskBase`
:type tasks: list of :py:class:`waflib.Task.Task`
"""
ins = Utils.defaultdict(set)
outs = Utils.defaultdict(set)
for x in tasks:
for a in getattr(x, 'inputs', []) + getattr(x, 'dep_nodes', []):
ins[id(a)].add(x)
for a in getattr(x, 'outputs', []):
outs[id(a)].add(x)
for a in x.inputs:
ins[a].add(x)
for a in x.dep_nodes:
ins[a].add(x)
for a in x.outputs:
outs[a].add(x)
links = set(ins.keys()).intersection(outs.keys())
for k in links:
for a in ins[k]:
a.run_after.update(outs[k])
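# Hedged sketch with stand-in objects (not real waf tasks or nodes) of the
# matching above: a task whose input is another task's output ends up with
# the producer in its run_after set.
class _Fake(object):
	def __init__(self, inputs, outputs):
		self.inputs, self.outputs = inputs, outputs
		self.dep_nodes, self.run_after = [], set()

_node = object()               # shared "node"
_producer = _Fake([], [_node])
_consumer = _Fake([_node], [])
set_file_constraints([_producer, _consumer])
# _producer in _consumer.run_after -> True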
class TaskGroup(object):
"""
Wrap n x m task order constraints into a single object
to prevent the creation of large list/set objects.
This is an optimization.
"""
def __init__(self, prev, next):
self.prev = prev
self.next = next
self.done = False
def get_hasrun(self):
for k in self.prev:
if not k.hasrun:
return NOT_RUN
return SUCCESS
hasrun = property(get_hasrun, None)
def set_precedence_constraints(tasks):
"""
Updates the ``run_after`` attribute of all tasks based on the after/before/ext_out/ext_in attributes
:param tasks: tasks
:type tasks: list of :py:class:`waflib.Task.TaskBase`
:type tasks: list of :py:class:`waflib.Task.Task`
"""
cstr_groups = Utils.defaultdict(list)
for x in tasks:
@ -960,9 +954,16 @@ def set_precedence_constraints(tasks):
else:
continue
aval = set(cstr_groups[keys[a]])
for x in cstr_groups[keys[b]]:
x.run_after.update(aval)
a = cstr_groups[keys[a]]
b = cstr_groups[keys[b]]
if len(a) < 2 or len(b) < 2:
for x in b:
x.run_after.update(a)
else:
group = TaskGroup(set(a), set(b))
for x in b:
x.run_after.add(group)
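# Hedged sketch (stand-in objects) of the n x m reduction above: when both
# constraint groups are large, a single shared TaskGroup replaces the
# len(a) * len(b) individual run_after links.
class _T(object):
	def __init__(self):
		self.hasrun = NOT_RUN
		self.run_after = set()

_a = [_T() for _ in range(3)]
_b = [_T() for _ in range(4)]
_group = TaskGroup(set(_a), set(_b))
for _x in _b:
	_x.run_after.add(_group)
# _group.hasrun stays NOT_RUN until every task in _a has run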
def funex(c):
"""
@ -1145,7 +1146,7 @@ def compile_fun(line, shell=False):
"""
Parses a string expression such as '${CC} ${SRC} -o ${TGT}' and returns a pair containing:
* The function created (compiled) for use as :py:meth:`waflib.Task.TaskBase.run`
* The function created (compiled) for use as :py:meth:`waflib.Task.Task.run`
* The list of variables that must cause rebuilds when *env* data is modified
for example::
@ -1217,7 +1218,6 @@ def task_factory(name, func=None, vars=None, color='GREEN', ext_in=[], ext_out=[
params['run'] = func
cls = type(Task)(name, (Task,), params)
global classes
classes[name] = cls
if ext_in:
@ -1231,22 +1231,6 @@ def task_factory(name, func=None, vars=None, color='GREEN', ext_in=[], ext_out=[
return cls
def always_run(cls):
"""
Deprecated Task class decorator (to be removed in waf 2.0)
Set all task instances of this class to be executed whenever a build is started.
The task signature is calculated, but the result of the comparison between
task signatures is bypassed
"""
Logs.warn('This decorator is deprecated, set always_run on the task class instead!')
cls.always_run = True
return cls
def update_outputs(cls):
"""
Obsolete, to be removed in waf 2.0
"""
return cls
TaskBase = Task
"Provided for compatibility reasons, TaskBase should not be used"

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"""
Task generators
@ -20,7 +20,7 @@ HEADER_EXTS = ['.h', '.hpp', '.hxx', '.hh']
class task_gen(object):
"""
Instances of this class create :py:class:`waflib.Task.TaskBase` when
Instances of this class create :py:class:`waflib.Task.Task` when
calling the method :py:meth:`waflib.TaskGen.task_gen.post` from the main thread.
A few notes:
@ -34,7 +34,7 @@ class task_gen(object):
mappings = Utils.ordered_iter_dict()
"""Mappings are global file extension mappings that are retrieved in the order of definition"""
prec = Utils.defaultdict(list)
prec = Utils.defaultdict(set)
"""Dict that holds the precedence execution rules for task generator methods"""
def __init__(self, *k, **kw):
@ -48,7 +48,7 @@ class task_gen(object):
The extra key/value elements passed in ``kw`` are set as attributes
"""
self.source = ''
self.source = []
self.target = ''
self.meths = []
@ -76,12 +76,20 @@ class task_gen(object):
self.env = self.bld.env.derive()
self.path = self.bld.path # emulate chdir when reading scripts
# provide a unique id
# Provide a unique index per folder
# This is part of a measure to prevent output file name collisions
path = self.path.abspath()
try:
self.idx = self.bld.idx[self.path] = self.bld.idx.get(self.path, 0) + 1
self.idx = self.bld.idx[path] = self.bld.idx.get(path, 0) + 1
except AttributeError:
self.bld.idx = {}
self.idx = self.bld.idx[self.path] = 1
self.idx = self.bld.idx[path] = 1
# Record the global task generator count
try:
self.tg_idx_count = self.bld.tg_idx_count = self.bld.tg_idx_count + 1
except AttributeError:
self.tg_idx_count = self.bld.tg_idx_count = 1
for key, val in kw.items():
setattr(self, key, val)
@ -191,7 +199,7 @@ class task_gen(object):
else:
tmp.append(a)
tmp.sort()
tmp.sort(reverse=True)
# topological sort
out = []
@ -211,13 +219,13 @@ class task_gen(object):
break
else:
tmp.append(x)
tmp.sort(reverse=True)
if prec:
buf = ['Cycle detected in the method execution:']
for k, v in prec.items():
buf.append('- %s after %s' % (k, [x for x in v if x in prec]))
raise Errors.WafError('\n'.join(buf))
out.reverse()
self.meths = out
# then we run the methods in order
@ -265,7 +273,7 @@ class task_gen(object):
:param tgt: output nodes
:type tgt: list of :py:class:`waflib.Tools.Node.Node`
:return: A task object
:rtype: :py:class:`waflib.Task.TaskBase`
:rtype: :py:class:`waflib.Task.Task`
"""
task = Task.classes[name](env=self.env.derive(), generator=self)
if src:
@ -431,9 +439,7 @@ def before_method(*k):
def deco(func):
setattr(task_gen, func.__name__, func)
for fun_name in k:
if not func.__name__ in task_gen.prec[fun_name]:
task_gen.prec[fun_name].append(func.__name__)
#task_gen.prec[fun_name].sort()
task_gen.prec[func.__name__].add(fun_name)
return func
return deco
before = before_method
@ -460,9 +466,7 @@ def after_method(*k):
def deco(func):
setattr(task_gen, func.__name__, func)
for fun_name in k:
if not fun_name in task_gen.prec[func.__name__]:
task_gen.prec[func.__name__].append(fun_name)
#task_gen.prec[func.__name__].sort()
task_gen.prec[fun_name].add(func.__name__)
return func
return deco
after = after_method
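# Hedged usage sketch (illustration only): ordering a hypothetical task
# generator method with the decorators above; 'process_source' is a real
# method name, 'add_extra_cflags' is made up.
from waflib.TaskGen import feature, before_method

@feature('c')
@before_method('process_source')
def add_extra_cflags(self):
	self.env.append_value('CFLAGS', ['-Wall'])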
@ -495,7 +499,8 @@ def extension(*k):
@taskgen_method
def to_nodes(self, lst, path=None):
"""
Converts the input list into a list of nodes.
Flattens the input list of strings, nodes and nested lists into a list of nodes.
It is used by :py:func:`waflib.TaskGen.process_source` and :py:func:`waflib.TaskGen.process_rule`.
It is designed for source files, for folders, see :py:func:`waflib.Tools.ccroot.to_incnodes`:
@ -512,14 +517,16 @@ def to_nodes(self, lst, path=None):
if isinstance(lst, Node.Node):
lst = [lst]
# either a list or a string, convert to a list of nodes
for x in Utils.to_list(lst):
if isinstance(x, str):
node = find(x)
else:
elif hasattr(x, 'name'):
node = x
else:
tmp.extend(self.to_nodes(x))
continue
if not node:
raise Errors.WafError("source not found: %r in %r" % (x, self))
raise Errors.WafError('source not found: %r in %r' % (x, self))
tmp.append(node)
return tmp
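# Hedged wscript sketch (made-up file names) of the flattening added above:
# nested lists in 'source' are now accepted and flattened into nodes.
def build(bld):
	common = ['a.c', 'b.c']
	bld.program(source=[common, 'main.c'], target='app')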
@ -614,17 +621,15 @@ def process_rule(self):
return [nodes, []]
cls.scan = scan
# TODO use these values in the cache key if provided
# (may cause excessive caching)
for x in ('after', 'before', 'ext_in', 'ext_out'):
setattr(cls, x, getattr(self, x, []))
if use_cache:
cache[key] = cls
# now create one instance
tsk = self.create_task(name)
for x in ('after', 'before', 'ext_in', 'ext_out'):
setattr(tsk, x, getattr(self, x, []))
if getattr(self, 'timeout', None):
tsk.timeout = self.timeout
@ -718,6 +723,8 @@ class subst_pc(Task.Task):
if getattr(self.generator, 'is_copy', None):
for i, x in enumerate(self.outputs):
x.write(self.inputs[i].read('rb'), 'wb')
stat = os.stat(self.inputs[i].abspath()) # Preserve mtime of the copy
os.utime(self.outputs[i].abspath(), (stat.st_atime, stat.st_mtime))
self.force_permissions()
return None
@ -727,11 +734,11 @@ class subst_pc(Task.Task):
self.force_permissions()
return ret
code = self.inputs[0].read(encoding=getattr(self.generator, 'encoding', 'ISO8859-1'))
code = self.inputs[0].read(encoding=getattr(self.generator, 'encoding', 'latin-1'))
if getattr(self.generator, 'subst_fun', None):
code = self.generator.subst_fun(self, code)
if code is not None:
self.outputs[0].write(code, encoding=getattr(self.generator, 'encoding', 'ISO8859-1'))
self.outputs[0].write(code, encoding=getattr(self.generator, 'encoding', 'latin-1'))
self.force_permissions()
return None
@ -746,7 +753,6 @@ class subst_pc(Task.Task):
lst.append(g(1))
return "%%(%s)s" % g(1)
return ''
global re_m4
code = getattr(self.generator, 're_m4', re_m4).sub(repl, code)
try:
@ -762,7 +768,7 @@ class subst_pc(Task.Task):
d[x] = tmp
code = code % d
self.outputs[0].write(code, encoding=getattr(self.generator, 'encoding', 'ISO8859-1'))
self.outputs[0].write(code, encoding=getattr(self.generator, 'encoding', 'latin-1'))
self.generator.bld.raw_deps[self.uid()] = lst
# make sure the signature is updated
@ -866,21 +872,17 @@ def process_subst(self):
if not a:
raise Errors.WafError('could not find %r for %r' % (x, self))
has_constraints = False
tsk = self.create_task('subst', a, b)
for k in ('after', 'before', 'ext_in', 'ext_out'):
val = getattr(self, k, None)
if val:
has_constraints = True
setattr(tsk, k, val)
# paranoid safety measure for the general case foo.in->foo.h with ambiguous dependencies
if not has_constraints:
global HEADER_EXTS
for xt in HEADER_EXTS:
if b.name.endswith(xt):
tsk.before = [k for k in ('c', 'cxx') if k in Task.classes]
break
for xt in HEADER_EXTS:
if b.name.endswith(xt):
tsk.ext_in = tsk.ext_in + ['.h']
break
inst_to = getattr(self, 'install_path', None)
if inst_to:

View File

@ -1,3 +1,3 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
# Ralf Habacker, 2006 (rh)
"""

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2008-2016 (ita)
# Thomas Nagy, 2008-2017 (ita)
"""
Assembly support, used by tools such as gas and nasm

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python
# encoding: utf-8
# John O'Meara, 2006
# Thomas Nagy 2009-2016 (ita)
# Thomas Nagy 2009-2017 (ita)
"""
The **bison** program is a code generator which creates C or C++ files.

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
"Base for c programs/libraries"

188
waflib/Tools/c_config.py Executable file → Normal file
View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"""
C/C++/D configuration helpers
@ -19,32 +19,6 @@ WAF_CONFIG_H = 'config.h'
DEFKEYS = 'define_key'
INCKEYS = 'include_key'
cfg_ver = {
'atleast-version': '>=',
'exact-version': '==',
'max-version': '<=',
}
SNIP_FUNCTION = '''
int main(int argc, char **argv) {
void (*p)();
(void)argc; (void)argv;
p=(void(*)())(%s);
return !p;
}
'''
"""Code template for checking for functions"""
SNIP_TYPE = '''
int main(int argc, char **argv) {
(void)argc; (void)argv;
if ((%(type_name)s *) 0) return 0;
if (sizeof (%(type_name)s)) return 0;
return 1;
}
'''
"""Code template for checking for types"""
SNIP_EMPTY_PROGRAM = '''
int main(int argc, char **argv) {
(void)argc; (void)argv;
@ -52,15 +26,6 @@ int main(int argc, char **argv) {
}
'''
SNIP_FIELD = '''
int main(int argc, char **argv) {
char *off;
(void)argc; (void)argv;
off = (char*) &((%(type_name)s*)0)->%(field_name)s;
return (size_t) off < sizeof(%(type_name)s);
}
'''
MACRO_TO_DESTOS = {
'__linux__' : 'linux',
'__GNU__' : 'gnu', # hurd
@ -201,7 +166,8 @@ def parse_flags(self, line, uselib_store, env=None, force_static=False, posix=No
static = False
elif x.startswith('-Wl') or x in ('-rdynamic', '-pie'):
app('LINKFLAGS', x)
elif x.startswith(('-m', '-f', '-dynamic', '-O')):
elif x.startswith(('-m', '-f', '-dynamic', '-O', '-g')):
# Adding the -W option breaks Python builds on OpenIndiana
app('CFLAGS', x)
app('CXXFLAGS', x)
elif x.startswith('-bundle'):
@ -239,50 +205,37 @@ def validate_cfg(self, kw):
self.find_program('pkg-config', var='PKGCONFIG')
kw['path'] = self.env.PKGCONFIG
# pkg-config version
if 'atleast_pkgconfig_version' in kw:
if not 'msg' in kw:
# verify that exactly one action is requested
s = ('atleast_pkgconfig_version' in kw) + ('modversion' in kw) + ('package' in kw)
if s != 1:
raise ValueError('exactly one of atleast_pkgconfig_version, modversion and package must be set')
if not 'msg' in kw:
if 'atleast_pkgconfig_version' in kw:
kw['msg'] = 'Checking for pkg-config version >= %r' % kw['atleast_pkgconfig_version']
return
elif 'modversion' in kw:
kw['msg'] = 'Checking for %r version' % kw['modversion']
else:
kw['msg'] = 'Checking for %r' %(kw['package'])
if not 'okmsg' in kw:
# let the modversion check set the okmsg to the detected version
if not 'okmsg' in kw and not 'modversion' in kw:
kw['okmsg'] = 'yes'
if not 'errmsg' in kw:
kw['errmsg'] = 'not found'
if 'modversion' in kw:
if not 'msg' in kw:
kw['msg'] = 'Checking for %r version' % kw['modversion']
# pkg-config version
if 'atleast_pkgconfig_version' in kw:
pass
elif 'modversion' in kw:
if not 'uselib_store' in kw:
kw['uselib_store'] = kw['modversion']
if not 'define_name' in kw:
kw['define_name'] = '%s_VERSION' % Utils.quote_define_name(kw['uselib_store'])
return
if not 'package' in kw:
raise ValueError('a package name is required')
if not 'uselib_store' in kw:
kw['uselib_store'] = kw['package'].upper()
if not 'define_name' in kw:
kw['define_name'] = self.have_define(kw['uselib_store'])
if not 'msg' in kw:
kw['msg'] = 'Checking for %r' % (kw['package'] or kw['path'])
for x in cfg_ver:
# Gotcha: only one predicate is allowed at a time
# TODO remove in waf 2.0
y = x.replace('-', '_')
if y in kw:
package = kw['package']
if Logs.verbose:
Logs.warn('Passing %r to conf.check_cfg() is obsolete, pass parameters directly, eg:', y)
Logs.warn(" conf.check_cfg(package='%s', args=['--libs', '--cflags', '%s >= 1.6'])", package, package)
if not 'msg' in kw:
kw['msg'] = 'Checking for %r %s %s' % (package, cfg_ver[x], kw[y])
break
else:
if not 'uselib_store' in kw:
kw['uselib_store'] = Utils.to_list(kw['package'])[0].upper()
if not 'define_name' in kw:
kw['define_name'] = self.have_define(kw['uselib_store'])
@conf
def exec_cfg(self, kw):
@ -331,23 +284,13 @@ def exec_cfg(self, kw):
if 'atleast_pkgconfig_version' in kw:
cmd = path + ['--atleast-pkgconfig-version=%s' % kw['atleast_pkgconfig_version']]
self.cmd_and_log(cmd, env=env)
if not 'okmsg' in kw:
kw['okmsg'] = 'yes'
return
for x in cfg_ver:
# TODO remove in waf 2.0
y = x.replace('-', '_')
if y in kw:
self.cmd_and_log(path + ['--%s=%s' % (x, kw[y]), kw['package']], env=env)
if not 'okmsg' in kw:
kw['okmsg'] = 'yes'
define_it()
break
# single version for a module
if 'modversion' in kw:
version = self.cmd_and_log(path + ['--modversion', kw['modversion']], env=env).strip()
if not 'okmsg' in kw:
kw['okmsg'] = version
self.define(kw['define_name'], version)
return version
@ -377,14 +320,10 @@ def exec_cfg(self, kw):
val = self.cmd_and_log(lst + ['--variable=' + v], env=env).strip()
var = '%s_%s' % (kw['uselib_store'], v)
v_env[var] = val
if not 'okmsg' in kw:
kw['okmsg'] = 'yes'
return
# so we assume the command-line will output flags to be parsed afterwards
ret = self.cmd_and_log(lst, env=env)
if not 'okmsg' in kw:
kw['okmsg'] = 'yes'
define_it()
self.parse_flags(ret, kw['uselib_store'], kw.get('env', self.env), force_static=static, posix=kw.get('posix'))
@ -401,8 +340,6 @@ def check_cfg(self, *k, **kw):
def configure(conf):
conf.load('compiler_c')
conf.check_cfg(package='glib-2.0', args='--libs --cflags')
conf.check_cfg(package='glib-2.0', uselib_store='GLIB', atleast_version='2.10.0',
args='--cflags --libs')
conf.check_cfg(package='pango')
conf.check_cfg(package='pango', uselib_store='MYPANGO', args=['--cflags', '--libs'])
conf.check_cfg(package='pango',
@ -415,11 +352,6 @@ def check_cfg(self, *k, **kw):
conf.check_cfg(package='gtk+-2.0', variables=['includedir', 'prefix'], uselib_store='FOO')
print(conf.env.FOO_includedir)
"""
if k:
lst = k[0].split()
kw['package'] = lst[0]
kw['args'] = ' '.join(lst[1:])
self.validate_cfg(kw)
if 'msg' in kw:
self.start_msg(kw['msg'], **kw)
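# Hedged configure sketch: the removed atleast-version/exact-version/max-version
# keywords are expressed through pkg-config arguments instead; the package name
# and version below are illustrative.
def configure(conf):
	conf.check_cfg(package='glib-2.0', uselib_store='GLIB',
		args=['--cflags', '--libs', 'glib-2.0 >= 2.10.0'])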
@ -486,6 +418,9 @@ def validate_c(self, kw):
:param auto_add_header_name: if header_name was set, add the headers in env.INCKEYS so the next tests will include these headers
:type auto_add_header_name: bool
"""
for x in ('type_name', 'field_name', 'function_name'):
if x in kw:
Logs.warn('Invalid argument %r in test' % x)
if not 'build_fun' in kw:
kw['build_fun'] = build_fun
@ -506,7 +441,7 @@ def validate_c(self, kw):
if not 'compile_mode' in kw:
kw['compile_mode'] = 'c'
if 'cxx' in Utils.to_list(kw.get('features',[])) or kw.get('compiler', '') == 'cxx':
if 'cxx' in Utils.to_list(kw.get('features', [])) or kw.get('compiler') == 'cxx':
kw['compile_mode'] = 'cxx'
if not 'type' in kw:
@ -529,50 +464,19 @@ def validate_c(self, kw):
return ''.join(['#include <%s>\n' % x for x in dct])
return ''
#OSX
if 'framework_name' in kw:
# OSX, not sure this is used anywhere
fwkname = kw['framework_name']
if not 'uselib_store' in kw:
kw['uselib_store'] = fwkname.upper()
if not kw.get('no_header', False):
if not 'header_name' in kw:
kw['header_name'] = []
if not kw.get('no_header'):
fwk = '%s/%s.h' % (fwkname, fwkname)
if kw.get('remove_dot_h'):
fwk = fwk[:-2]
kw['header_name'] = Utils.to_list(kw['header_name']) + [fwk]
val = kw.get('header_name', [])
kw['header_name'] = Utils.to_list(val) + [fwk]
kw['msg'] = 'Checking for framework %s' % fwkname
kw['framework'] = fwkname
#kw['frameworkpath'] = set it yourself
if 'function_name' in kw:
fu = kw['function_name']
if not 'msg' in kw:
kw['msg'] = 'Checking for function %s' % fu
kw['code'] = to_header(kw) + SNIP_FUNCTION % fu
if not 'uselib_store' in kw:
kw['uselib_store'] = fu.upper()
if not 'define_name' in kw:
kw['define_name'] = self.have_define(fu)
elif 'type_name' in kw:
tu = kw['type_name']
if not 'header_name' in kw:
kw['header_name'] = 'stdint.h'
if 'field_name' in kw:
field = kw['field_name']
kw['code'] = to_header(kw) + SNIP_FIELD % {'type_name' : tu, 'field_name' : field}
if not 'msg' in kw:
kw['msg'] = 'Checking for field %s in %s' % (field, tu)
if not 'define_name' in kw:
kw['define_name'] = self.have_define((tu + '_' + field).upper())
else:
kw['code'] = to_header(kw) + SNIP_TYPE % {'type_name' : tu}
if not 'msg' in kw:
kw['msg'] = 'Checking for type %s' % tu
if not 'define_name' in kw:
kw['define_name'] = self.have_define(tu.upper())
elif 'header_name' in kw:
if not 'msg' in kw:
@ -635,11 +539,12 @@ def validate_c(self, kw):
kw['code'] = '\n'.join(['#include <%s>' % x for x in self.env[INCKEYS]]) + '\n' + kw['code']
# in case defines lead to very long command-lines
if kw.get('merge_config_header', False) or env.merge_config_header:
if kw.get('merge_config_header') or env.merge_config_header:
kw['code'] = '%s\n\n%s' % (self.get_config_header(), kw['code'])
env.DEFINES = [] # modify the copy
if not kw.get('success'): kw['success'] = None
if not kw.get('success'):
kw['success'] = None
if 'define_name' in kw:
self.undefine(kw['define_name'])
@ -655,7 +560,7 @@ def post_check(self, *k, **kw):
is_success = 0
if kw['execute']:
if kw['success'] is not None:
if kw.get('define_ret', False):
if kw.get('define_ret'):
is_success = kw['success']
else:
is_success = (kw['success'] == 0)
@ -663,7 +568,6 @@ def post_check(self, *k, **kw):
is_success = (kw['success'] == 0)
if kw.get('define_name'):
# TODO this is still way too complicated
comment = kw.get('comment', '')
define_name = kw['define_name']
if kw['execute'] and kw.get('define_ret') and isinstance(is_success, str):
@ -694,7 +598,7 @@ def post_check(self, *k, **kw):
self.env[define_name] = int(is_success)
if 'header_name' in kw:
if kw.get('auto_add_header_name', False):
if kw.get('auto_add_header_name'):
self.env.append_value(INCKEYS, Utils.to_list(kw['header_name']))
if is_success and 'uselib_store' in kw:
@ -1108,7 +1012,7 @@ def get_cc_version(conf, cc, gcc=False, icc=False, clang=False):
env = conf.env.env or None
try:
out, err = conf.cmd_and_log(cmd, output=0, input='\n'.encode(), env=env)
except Exception:
except Errors.WafError:
conf.fatal('Could not determine the compiler version %r' % cmd)
if gcc:
@ -1251,14 +1155,14 @@ def add_as_needed(self):
# ============ parallel configuration
class cfgtask(Task.TaskBase):
class cfgtask(Task.Task):
"""
A task that executes build configuration tests (calls conf.check)
Make sure to use locks if concurrent access to the same conf.env data is necessary.
"""
def __init__(self, *k, **kw):
Task.TaskBase.__init__(self, *k, **kw)
Task.Task.__init__(self, *k, **kw)
self.run_after = set()
def display(self):
@ -1273,6 +1177,9 @@ class cfgtask(Task.TaskBase):
def uid(self):
return Utils.SIG_NIL
def signature(self):
return Utils.SIG_NIL
def run(self):
conf = self.conf
bld = Build.BuildContext(top_dir=conf.srcnode.abspath(), out_dir=conf.bldnode.abspath())
@ -1300,7 +1207,7 @@ class cfgtask(Task.TaskBase):
return 1
def process(self):
Task.TaskBase.process(self)
Task.Task.process(self)
if 'msg' in self.args:
with self.generator.bld.multicheck_lock:
self.conf.start_msg(self.args['msg'])
@ -1356,11 +1263,12 @@ def multicheck(self, *k, **kw):
bld = par()
bld.keep = kw.get('run_all_tests', True)
bld.imp_sigs = {}
tasks = []
id_to_task = {}
for dct in k:
x = Task.classes['cfgtask'](bld=bld)
x = Task.classes['cfgtask'](bld=bld, env=None)
tasks.append(x)
x.args = dct
x.bld = bld

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy 2008-2016 (ita)
# Thomas Nagy 2008-2017 (ita)
"""
MacOSX related tools

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
"""
C/C++ preprocessor for finding dependencies
@ -155,22 +155,6 @@ for x, syms in enumerate(ops):
for u in syms.split():
prec[u] = x
def trimquotes(s):
"""
Remove the single quotes around an expression::
trimquotes("'test'") == "test"
:param s: expression to transform
:type s: string
:rtype: string
"""
# TODO remove in waf 2.0
if not s: return ''
s = s.rstrip()
if s[0] == "'" and s[-1] == "'": return s[1:-1]
return s
def reduce_nums(val_1, val_2, val_op):
"""
Apply arithmetic rules to compute a result
@ -540,7 +524,6 @@ def reduce_tokens(lst, defs, ban=[]):
accu.append((p2, v2))
accu.extend(toks)
elif to_add[j+1][0] == IDENT and to_add[j+1][1] == '__VA_ARGS__':
# TODO not sure
# first collect the tokens
va_toks = []
st = len(macro_def[0])
@ -763,17 +746,14 @@ def tokenize_private(s):
v = m(name)
if v:
if name == IDENT:
try:
g_optrans[v]
if v in g_optrans:
name = OP
except KeyError:
# c++ specific
if v.lower() == "true":
v = 1
name = NUM
elif v.lower() == "false":
v = 0
name = NUM
elif v.lower() == "true":
v = 1
name = NUM
elif v.lower() == "false":
v = 0
name = NUM
elif name == NUM:
if m('oct'):
v = int(v, 8)
@ -847,6 +827,9 @@ class c_parser(object):
self.ban_includes = set()
"""Includes that must not be read (#pragma once)"""
self.listed = set()
"""Include nodes/names already listed to avoid duplicates in self.nodes/self.names"""
def cached_find_resource(self, node, filename):
"""
Find a file from the input directory
@ -861,7 +844,6 @@ class c_parser(object):
try:
cache = node.ctx.preproc_cache_node
except AttributeError:
global FILE_CACHE_SIZE
cache = node.ctx.preproc_cache_node = Utils.lru_cache(FILE_CACHE_SIZE)
key = (node, filename)
@ -893,7 +875,7 @@ class c_parser(object):
"""
if filename.endswith('.moc'):
# we could let the qt4 module use a subclass, but then the function "scan" below must be duplicated
# in the qt4 and in the qt5 classes. So we have two lines here and it is sufficient. TODO waf 1.9
# in the qt4 and in the qt5 classes. So we have two lines here and it is sufficient.
self.names.append(filename)
return None
@ -907,12 +889,15 @@ class c_parser(object):
break
found = self.cached_find_resource(n, filename)
listed = self.listed
if found and not found in self.ban_includes:
# TODO duplicates do not increase the no-op build times too much, but they may be worth removing
self.nodes.append(found)
if found not in listed:
listed.add(found)
self.nodes.append(found)
self.addlines(found)
else:
if not filename in self.names:
if filename not in listed:
listed.add(filename)
self.names.append(filename)
return found
@ -937,7 +922,6 @@ class c_parser(object):
try:
cache = node.ctx.preproc_cache_lines
except AttributeError:
global LINE_CACHE_SIZE
cache = node.ctx.preproc_cache_lines = Utils.lru_cache(LINE_CACHE_SIZE)
try:
return cache[node]
@ -970,8 +954,7 @@ class c_parser(object):
raise PreprocError('could not read the file %r' % node)
except Exception:
if Logs.verbose > 0:
Logs.error('parsing %r failed', node)
traceback.print_exc()
Logs.error('parsing %r failed %s', node, traceback.format_exc())
else:
self.lines.extend(lines)
@ -1067,7 +1050,7 @@ class c_parser(object):
self.ban_includes.add(self.current_file)
except Exception as e:
if Logs.verbose:
Logs.debug('preproc: line parsing failed (%s): %s %s', e, line, Utils.ex_stack())
Logs.debug('preproc: line parsing failed (%s): %s %s', e, line, traceback.format_exc())
def define_name(self, line):
"""
@ -1086,9 +1069,6 @@ def scan(task):
This function is bound as a task method on :py:class:`waflib.Tools.c.c` and :py:class:`waflib.Tools.cxx.cxx` for example
"""
global go_absolute
try:
incn = task.generator.includes_nodes
except AttributeError:

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2016 (ita)
# Thomas Nagy, 2016-2017 (ita)
"""
Various configuration tests.
@ -199,7 +199,7 @@ class grep_for_endianness(Task.Task):
"""
color = 'PINK'
def run(self):
txt = self.inputs[0].read(flags='rb').decode('iso8859-1')
txt = self.inputs[0].read(flags='rb').decode('latin-1')
if txt.find('LiTTleEnDian') > -1:
self.generator.tmp.append('little')
elif txt.find('BIGenDianSyS') > -1:
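The task above greps the compiled test object for marker strings placed there by the C fragment. A hedged configure() sketch, assuming check_endianness() from this module is available once the tool is loaded:

def configure(conf):
	conf.load('compiler_c')
	conf.load('c_tests')                   # provides check_endianness()
	endianness = conf.check_endianness()   # 'little' or 'big', collected via generator.tmp
	if endianness == 'big':
		conf.define('WORDS_BIGENDIAN', 1)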

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"""
Classes and methods shared by tools providing support for C-like language such
@ -131,6 +131,9 @@ class link_task(Task.Task):
"""
color = 'YELLOW'
weight = 3
"""Try to process link tasks as early as possible"""
inst_to = None
"""Default installation path for the link task outputs, or None to disable"""
@ -270,7 +273,7 @@ def apply_link(self):
try:
inst_to = self.install_path
except AttributeError:
inst_to = self.link_task.__class__.inst_to
inst_to = self.link_task.inst_to
if inst_to:
# install a copy of the node list we have at this moment (implib not added)
self.install_task = self.add_install_files(
@ -615,7 +618,7 @@ def apply_vnum(self):
try:
inst_to = self.install_path
except AttributeError:
inst_to = self.link_task.__class__.inst_to
inst_to = self.link_task.inst_to
if inst_to:
p = Utils.subst_vars(inst_to, self.env)
path = os.path.join(p, name2)
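As a usage note, inst_to only supplies the default installation path; a task generator can still override it with install_path, for example (paths and names are illustrative):

def build(bld):
	bld.shlib(source='lib.c', target='foo', vnum='1.2.3',
		install_path='${LIBDIR}/myapp')                     # overrides link_task.inst_to
	bld.program(source='main.c', target='app', use='foo')   # keeps the default ${BINDIR}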

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy 2009-2016 (ita)
# Thomas Nagy 2009-2017 (ita)
"""
Detect the Clang++ C++ compiler

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python
# encoding: utf-8
# Carlos Rafael Giani, 2007 (dv)
# Thomas Nagy, 2016 (ita)
# Thomas Nagy, 2016-2017 (ita)
"""
Try to detect a D compiler from the list of supported compilers::

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
"""
C# support. A simple example::

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2005-2016 (ita)
# Thomas Nagy, 2005-2017 (ita)
"Base for c++ programs and libraries"

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python
# encoding: utf-8
# Carlos Rafael Giani, 2007 (dv)
# Thomas Nagy, 2007-2016 (ita)
# Thomas Nagy, 2007-2017 (ita)
from waflib import Utils, Task, Errors
from waflib.TaskGen import taskgen_method, feature, extension

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2016 (ita)
# Thomas Nagy, 2016-2017 (ita)
from waflib import Utils
from waflib.Configure import conf

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2016 (ita)
# Thomas Nagy, 2016-2017 (ita)
"""
Provide a scanner for finding dependencies on d files

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python
# encoding: utf-8
# Carlos Rafael Giani, 2007 (dv)
# Thomas Nagy, 2008-2016 (ita)
# Thomas Nagy, 2008-2017 (ita)
import sys
from waflib.Tools import ar, d

View File

@ -136,7 +136,7 @@ def enhance_lib():
Logs.error("In ant_glob pattern %r: '.' means 'one dot', not 'current directory'", k[0])
if kw.get('remove', True):
try:
if self.is_child_of(self.ctx.bldnode) and not kw.get('quiet', False):
if self.is_child_of(self.ctx.bldnode) and not kw.get('quiet'):
Logs.error('Using ant_glob on the build folder (%r) is dangerous (quiet=True to disable this warning)', self)
except AttributeError:
pass
@ -173,7 +173,7 @@ def enhance_lib():
else:
for x in ('before', 'after'):
for y in self.to_list(getattr(self, x, [])):
if not Task.classes.get(y, None):
if not Task.classes.get(y):
Logs.error('Erroneous order constraint %s=%r on %r (no such class)', x, y, self)
TaskGen.feature('*')(check_err_order)
@ -215,7 +215,7 @@ def enhance_lib():
elif name == 'prepend':
raise Errors.WafError('env.prepend does not exist: use env.prepend_value')
if name in self.__slots__:
return object.__getattr__(self, name, default)
return super(ConfigSet.ConfigSet, self).__getattr__(name, default)
else:
return self[name]
ConfigSet.ConfigSet.__getattr__ = _getattr

View File

@ -1,13 +1,13 @@
#! /usr/bin/env python
# encoding: utf-8
# DC 2008
# Thomas Nagy 2016 (ita)
# Thomas Nagy 2016-2017 (ita)
"""
Fortran support
"""
from waflib import Utils, Task
from waflib import Utils, Task, Errors
from waflib.Tools import ccroot, fc_config, fc_scan
from waflib.TaskGen import extension
from waflib.Configure import conf
@ -46,7 +46,7 @@ def get_fortran_tasks(tsk):
class fc(Task.Task):
"""
Fortran tasks can only run when all fortran tasks in the current group are ready to be executed
Fortran tasks can only run when all fortran tasks in a current task group are ready to be executed
This may cause a deadlock if some fortran task is waiting for something that cannot happen (circular dependency)
Should this ever happen, set the 'nomod=True' on those tasks instances to break the loop
"""
@ -85,12 +85,11 @@ class fc(Task.Task):
ret = tsk.runnable_status()
if ret == Task.ASK_LATER:
# we have to wait for one of the other fortran tasks to be ready
# this may deadlock if there are dependencies between the fortran tasks
# this may deadlock if there are dependencies between fortran tasks
# but this should not happen (we are setting them here!)
for x in lst:
x.mod_fortran_done = None
# TODO sort the list of tasks in bld.producer.outstanding to put all fortran tasks at the end
return Task.ASK_LATER
ins = Utils.defaultdict(set)
@ -104,7 +103,7 @@ class fc(Task.Task):
name = bld.modfile(x.replace('MOD@', ''))
node = bld.srcnode.find_or_declare(name)
tsk.set_outputs(node)
outs[id(node)].add(tsk)
outs[node].add(tsk)
# the .mod files to use
for tsk in lst:
@ -116,7 +115,7 @@ class fc(Task.Task):
if node and node not in tsk.outputs:
if not node in bld.node_deps[key]:
bld.node_deps[key].append(node)
ins[id(node)].add(tsk)
ins[node].add(tsk)
# if the intersection matches, set the order
for k in ins.keys():
@ -178,7 +177,7 @@ class fcprogram_test(fcprogram):
kw['output'] = 0
try:
(bld.out, bld.err) = bld.cmd_and_log(cmd, **kw)
except Exception:
except Errors.WafError:
return -1
if bld.out:
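For context, a minimal Fortran wscript sketch driving these task classes; the file names are placeholders, and the module ordering between sources is resolved by the MOD@/USE@ logic above:

def configure(conf):
	conf.load('compiler_fc')

def build(bld):
	bld(features='fc fcprogram', source='module_a.f90 main.f90', target='app')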

View File

@ -1,7 +1,7 @@
#! /usr/bin/env python
# encoding: utf-8
# DC 2008
# Thomas Nagy 2016 (ita)
# Thomas Nagy 2016-2017 (ita)
"""
Fortran configuration helpers
@ -117,7 +117,7 @@ def fortran_modifier_win32(conf):
v.fcprogram_PATTERN = v.fcprogram_test_PATTERN = '%s.exe'
v.fcshlib_PATTERN = '%s.dll'
v.implib_PATTERN = 'lib%s.dll.a'
v.implib_PATTERN = '%s.dll.a'
v.IMPLIB_ST = '-Wl,--out-implib,%s'
v.FCFLAGS_fcshlib = []
@ -456,7 +456,7 @@ def detect_openmp(self):
"""
Detects openmp flags and sets the OPENMP ``FCFLAGS``/``LINKFLAGS``
"""
for x in ('-qopenmp', '-fopenmp','-openmp','-mp','-xopenmp','-omp','-qsmp=omp'):
for x in ('-fopenmp','-openmp','-mp','-xopenmp','-omp','-qsmp=omp'):
try:
self.check_fc(
msg = 'Checking for OpenMP flag %s' % x,
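A hedged usage sketch for the flag detection above, assuming the successful flag is stored under the OPENMP use variable as the docstring indicates:

def configure(conf):
	conf.load('compiler_fc')
	conf.detect_openmp()   # sets FCFLAGS_OPENMP / LINKFLAGS_OPENMP on success

def build(bld):
	bld(features='fc fcprogram', source='omp.f90', target='omp_app', use='OPENMP')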

View File

@ -1,7 +1,7 @@
#! /usr/bin/env python
# encoding: utf-8
# DC 2008
# Thomas Nagy 2016 (ita)
# Thomas Nagy 2016-2017 (ita)
import re

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python
# encoding: utf-8
# John O'Meara, 2006
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
"""
The **flex** program is a code generator which creates C or C++ files.

View File

@ -1,7 +1,7 @@
#! /usr/bin/env python
# encoding: utf-8
# KWS 2010
# Thomas Nagy 2016 (ita)
# Thomas Nagy 2016-2017 (ita)
import re
from waflib import Utils

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2008-2016 (ita)
# Thomas Nagy, 2008-2017 (ita)
"Detect as/gas/gcc for compiling assembly files"

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
# Ralf Habacker, 2006 (rh)
# Yinon Ehrlich, 2009
@ -68,7 +68,7 @@ def gcc_modifier_win32(conf):
v.cprogram_PATTERN = '%s.exe'
v.cshlib_PATTERN = '%s.dll'
v.implib_PATTERN = 'lib%s.dll.a'
v.implib_PATTERN = '%s.dll.a'
v.IMPLIB_ST = '-Wl,--out-implib,%s'
v.CFLAGS_cshlib = []

View File

@ -1,7 +1,7 @@
#! /usr/bin/env python
# encoding: utf-8
# DC 2008
# Thomas Nagy 2016 (ita)
# Thomas Nagy 2016-2017 (ita)
import re
from waflib import Utils

View File

@ -1,6 +1,6 @@
#! /usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
"""
Support for GLib2 tools:
@ -452,7 +452,6 @@ def find_glib_compile_schemas(conf):
def getstr(varname):
return getattr(Options.options, varname, getattr(conf.env,varname, ''))
# TODO make this dependent on the gnu_dirs tool?
gsettingsschemadir = getstr('GSETTINGSSCHEMADIR')
if not gsettingsschemadir:
datadir = getstr('DATADIR')

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
# Ralf Habacker, 2006 (rh)
# Yinon Ehrlich, 2009
@ -68,7 +68,7 @@ def gxx_modifier_win32(conf):
v.cxxprogram_PATTERN = '%s.exe'
v.cxxshlib_PATTERN = '%s.dll'
v.implib_PATTERN = 'lib%s.dll.a'
v.implib_PATTERN = '%s.dll.a'
v.IMPLIB_ST = '-Wl,--out-implib,%s'
v.CXXFLAGS_cxxshlib = []

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python
# encoding: utf-8
# Stian Selnes 2008
# Thomas Nagy 2009-2016 (ita)
# Thomas Nagy 2009-2017 (ita)
"""
Detects the Intel C compiler

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy 2009-2016 (ita)
# Thomas Nagy 2009-2017 (ita)
"""
Detects the Intel C++ compiler

View File

@ -1,9 +1,9 @@
#! /usr/bin/env python
# encoding: utf-8
# DC 2008
# Thomas Nagy 2016 (ita)
# Thomas Nagy 2016-2017 (ita)
import os, re
import os, re, traceback
from waflib import Utils, Logs, Errors
from waflib.Tools import fc, fc_config, fc_scan, ar, ccroot
from waflib.Configure import conf
@ -110,16 +110,16 @@ def gather_ifort_versions(conf, versions):
version_pattern = re.compile('^...?.?\....?.?')
try:
all_versions = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, 'SOFTWARE\\Wow6432node\\Intel\\Compilers\\Fortran')
except WindowsError:
except OSError:
try:
all_versions = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, 'SOFTWARE\\Intel\\Compilers\\Fortran')
except WindowsError:
except OSError:
return
index = 0
while 1:
try:
version = Utils.winreg.EnumKey(all_versions, index)
except WindowsError:
except OSError:
break
index += 1
if not version_pattern.match(version):
@ -134,7 +134,7 @@ def gather_ifort_versions(conf, versions):
Utils.winreg.OpenKey(all_versions,version+'\\'+targetDir)
icl_version=Utils.winreg.OpenKey(all_versions,version)
path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir')
except WindowsError:
except OSError:
pass
else:
batch_file=os.path.join(path,'bin','iclvars.bat')
@ -145,7 +145,7 @@ def gather_ifort_versions(conf, versions):
try:
icl_version = Utils.winreg.OpenKey(all_versions, version+'\\'+target)
path,type = Utils.winreg.QueryValueEx(icl_version,'ProductDir')
except WindowsError:
except OSError:
continue
else:
batch_file=os.path.join(path,'bin','iclvars.bat')
@ -232,7 +232,7 @@ echo LIB=%%LIB%%;%%LIBPATH%%
try:
conf.cmd_and_log(fc + ['/help'], env=env)
except UnicodeError:
st = Utils.ex_stack()
st = traceback.format_exc()
if conf.logger:
conf.logger.error(st)
conf.fatal('ifort: Unicode error - check the code page?')

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
"""
Support for translation tools such as msgfmt and intltool
@ -27,6 +27,8 @@ Usage::
Usage of the :py:mod:`waflib.Tools.gnu_dirs` is recommended, but not obligatory.
"""
from __future__ import with_statement
import os, re
from waflib import Context, Task, Utils, Logs
import waflib.Tools.ccroot
@ -157,13 +159,12 @@ def apply_intltool_po(self):
linguas = self.path.find_node(os.path.join(podir, 'LINGUAS'))
if linguas:
# scan LINGUAS file for locales to process
file = open(linguas.abspath())
langs = []
for line in file.readlines():
# ignore lines containing comments
if not line.startswith('#'):
langs += line.split()
file.close()
with open(linguas.abspath()) as f:
langs = []
for line in f.readlines():
# ignore lines containing comments
if not line.startswith('#'):
langs += line.split()
re_linguas = re.compile('[-a-zA-Z_@.]+')
for lang in langs:
# Make sure that we only process lines which contain locales
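A minimal build() sketch for the po handling above; appname and podir are illustrative, and the LINGUAS file parsed here selects the locales to process:

def build(bld):
	bld(features='intltool_po', appname='myapp', podir='po', install_path='${LOCALEDIR}')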

View File

@ -6,6 +6,7 @@
Compiler definition for irix/MIPSpro cc compiler
"""
from waflib import Errors
from waflib.Tools import ccroot, ar
from waflib.Configure import conf
@ -24,7 +25,7 @@ def find_irixcc(conf):
try:
conf.cmd_and_log(cc + ['-version'])
except Exception:
except Errors.WafError:
conf.fatal('%r -version could not be executed' % cc)
v.CC = cc

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
"""
Java support
@ -166,7 +166,7 @@ def jar_files(self):
if manifest:
jarcreate = getattr(self, 'jarcreate', 'cfm')
if not isinstance(manifest,Node.Node):
node = self.path.find_or_declare(manifest)
node = self.path.find_resource(manifest)
else:
node = manifest
tsk.dep_nodes.append(node)
@ -240,7 +240,6 @@ class jar_create(JTask):
if not t.hasrun:
return Task.ASK_LATER
if not self.inputs:
global JAR_RE
try:
self.inputs = [x for x in self.basedir.ant_glob(JAR_RE, remove=False) if id(x) != id(self.outputs[0])]
except Exception:
@ -273,7 +272,6 @@ class javac(JTask):
return Task.ASK_LATER
if not self.inputs:
global SOURCE_RE
self.inputs = []
for x in self.srcdir:
self.inputs.extend(x.ant_glob(SOURCE_RE, remove=False))
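For reference, a hedged wscript sketch combining the javac and jar_create tasks touched above; the directory names and manifest path are placeholders:

def configure(conf):
	conf.load('java')

def build(bld):
	bld(features='javac jar',
		srcdir='src',                   # javac globs SOURCE_RE below this directory
		outdir='classes',
		basedir='classes',              # jar_create globs JAR_RE below this directory
		destfile='app.jar',
		manifest='src/MANIFEST.MF')     # resolved with find_resource() as changed above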

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python
# encoding: utf-8
# Sebastian Schlingmann, 2008
# Thomas Nagy, 2008-2016 (ita)
# Thomas Nagy, 2008-2017 (ita)
"""
Lua support.

View File

@ -25,7 +25,6 @@ def h_file(self):
if filename in cache and cache[filename][0] == st.st_mtime:
return cache[filename][1]
global STRONGEST
if STRONGEST:
ret = Utils.h_file(filename)
else:

View File

@ -52,7 +52,7 @@ cmd.exe /C "chcp 1252 & set PYTHONUNBUFFERED=true && set && waf configure"
Setting PYTHONUNBUFFERED gives the unbuffered output.
"""
import os, sys, re
import os, sys, re, traceback
from waflib import Utils, Logs, Options, Errors
from waflib.TaskGen import after_method, feature
@ -213,7 +213,7 @@ echo LIB=%%LIB%%;%%LIBPATH%%
try:
conf.cmd_and_log(cxx + ['/help'], env=env)
except UnicodeError:
st = Utils.ex_stack()
st = traceback.format_exc()
if conf.logger:
conf.logger.error(st)
conf.fatal('msvc: Unicode error - check the code page?')
@ -238,16 +238,16 @@ def gather_wsdk_versions(conf, versions):
version_pattern = re.compile('^v..?.?\...?.?')
try:
all_versions = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, 'SOFTWARE\\Wow6432node\\Microsoft\\Microsoft SDKs\\Windows')
except WindowsError:
except OSError:
try:
all_versions = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, 'SOFTWARE\\Microsoft\\Microsoft SDKs\\Windows')
except WindowsError:
except OSError:
return
index = 0
while 1:
try:
version = Utils.winreg.EnumKey(all_versions, index)
except WindowsError:
except OSError:
break
index += 1
if not version_pattern.match(version):
@ -255,7 +255,7 @@ def gather_wsdk_versions(conf, versions):
try:
msvc_version = Utils.winreg.OpenKey(all_versions, version)
path,type = Utils.winreg.QueryValueEx(msvc_version,'InstallationFolder')
except WindowsError:
except OSError:
continue
if path and os.path.isfile(os.path.join(path, 'bin', 'SetEnv.cmd')):
targets = {}
@ -273,10 +273,10 @@ def gather_wince_supported_platforms():
supported_wince_platforms = []
try:
ce_sdk = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, 'SOFTWARE\\Wow6432node\\Microsoft\\Windows CE Tools\\SDKs')
except WindowsError:
except OSError:
try:
ce_sdk = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, 'SOFTWARE\\Microsoft\\Windows CE Tools\\SDKs')
except WindowsError:
except OSError:
ce_sdk = ''
if not ce_sdk:
return supported_wince_platforms
@ -286,15 +286,15 @@ def gather_wince_supported_platforms():
try:
sdk_device = Utils.winreg.EnumKey(ce_sdk, index)
sdk = Utils.winreg.OpenKey(ce_sdk, sdk_device)
except WindowsError:
except OSError:
break
index += 1
try:
path,type = Utils.winreg.QueryValueEx(sdk, 'SDKRootDir')
except WindowsError:
except OSError:
try:
path,type = Utils.winreg.QueryValueEx(sdk,'SDKInformation')
except WindowsError:
except OSError:
continue
path,xml = os.path.split(path)
path = str(path)
@ -317,18 +317,18 @@ def gather_msvc_detected_versions():
prefix = 'SOFTWARE\\Wow6432node\\Microsoft\\' + vcver
try:
all_versions = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, prefix)
except WindowsError:
except OSError:
prefix = 'SOFTWARE\\Microsoft\\' + vcver
try:
all_versions = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, prefix)
except WindowsError:
except OSError:
continue
index = 0
while 1:
try:
version = Utils.winreg.EnumKey(all_versions, index)
except WindowsError:
except OSError:
break
index += 1
match = version_pattern.match(version)
@ -477,14 +477,14 @@ def gather_msvc_versions(conf, versions):
try:
try:
msvc_version = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, reg + "\\Setup\\VC")
except WindowsError:
except OSError:
msvc_version = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, reg + "\\Setup\\Microsoft Visual C++")
path,type = Utils.winreg.QueryValueEx(msvc_version, 'ProductDir')
except WindowsError:
except OSError:
try:
msvc_version = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, "SOFTWARE\\Wow6432node\\Microsoft\\VisualStudio\\SxS\\VS7")
path,type = Utils.winreg.QueryValueEx(msvc_version, version)
except WindowsError:
except OSError:
continue
else:
vc_paths.append((version, os.path.abspath(str(path))))
@ -524,16 +524,16 @@ def gather_icl_versions(conf, versions):
version_pattern = re.compile('^...?.?\....?.?')
try:
all_versions = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, 'SOFTWARE\\Wow6432node\\Intel\\Compilers\\C++')
except WindowsError:
except OSError:
try:
all_versions = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, 'SOFTWARE\\Intel\\Compilers\\C++')
except WindowsError:
except OSError:
return
index = 0
while 1:
try:
version = Utils.winreg.EnumKey(all_versions, index)
except WindowsError:
except OSError:
break
index += 1
if not version_pattern.match(version):
@ -548,7 +548,7 @@ def gather_icl_versions(conf, versions):
Utils.winreg.OpenKey(all_versions,version+'\\'+targetDir)
icl_version=Utils.winreg.OpenKey(all_versions,version)
path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir')
except WindowsError:
except OSError:
pass
else:
batch_file=os.path.join(path,'bin','iclvars.bat')
@ -558,7 +558,7 @@ def gather_icl_versions(conf, versions):
try:
icl_version = Utils.winreg.OpenKey(all_versions, version+'\\'+target)
path,type = Utils.winreg.QueryValueEx(icl_version,'ProductDir')
except WindowsError:
except OSError:
continue
else:
batch_file=os.path.join(path,'bin','iclvars.bat')
@ -578,16 +578,16 @@ def gather_intel_composer_versions(conf, versions):
version_pattern = re.compile('^...?.?\...?.?.?')
try:
all_versions = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, 'SOFTWARE\\Wow6432node\\Intel\\Suites')
except WindowsError:
except OSError:
try:
all_versions = Utils.winreg.OpenKey(Utils.winreg.HKEY_LOCAL_MACHINE, 'SOFTWARE\\Intel\\Suites')
except WindowsError:
except OSError:
return
index = 0
while 1:
try:
version = Utils.winreg.EnumKey(all_versions, index)
except WindowsError:
except OSError:
break
index += 1
if not version_pattern.match(version):
@ -601,7 +601,7 @@ def gather_intel_composer_versions(conf, versions):
try:
try:
defaults = Utils.winreg.OpenKey(all_versions,version+'\\Defaults\\C++\\'+targetDir)
except WindowsError:
except OSError:
if targetDir == 'EM64T_NATIVE':
defaults = Utils.winreg.OpenKey(all_versions,version+'\\Defaults\\C++\\EM64T')
else:
@ -610,7 +610,7 @@ def gather_intel_composer_versions(conf, versions):
Utils.winreg.OpenKey(all_versions,version+'\\'+uid+'\\C++\\'+targetDir)
icl_version=Utils.winreg.OpenKey(all_versions,version+'\\'+uid+'\\C++')
path,type=Utils.winreg.QueryValueEx(icl_version,'ProductDir')
except WindowsError:
except OSError:
pass
else:
batch_file=os.path.join(path,'bin','iclvars.bat')
@ -709,7 +709,7 @@ def libname_msvc(self, libname, is_static=False):
(lt_path, lt_libname, lt_static) = self.find_lt_names_msvc(lib, is_static)
if lt_path != None and lt_libname != None:
if lt_static == True:
if lt_static:
# file existence check has been made by find_lt_names
return os.path.join(lt_path,lt_libname)
@ -852,8 +852,7 @@ def find_msvc(conf):
# linker
if not v.LINK_CXX:
# TODO: var=LINK_CXX to let so that LINK_CXX can be overridden?
v.LINK_CXX = conf.find_program(linker_name, path_list=path, errmsg='%s was not found (linker)' % linker_name)
conf.find_program(linker_name, path_list=path, errmsg='%s was not found (linker)' % linker_name, var='LINK_CXX')
if not v.LINK_CC:
v.LINK_CC = v.LINK_CXX
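The linker lookup now relies on find_program() storing its own result; a small sketch of that idiom with an illustrative program name:

def configure(conf):
	# the result is stored as a list in conf.env.LINK_CXX; errmsg is shown on failure
	conf.find_program('link', var='LINK_CXX', errmsg='the linker was not found')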

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2008-2016 (ita)
# Thomas Nagy, 2008-2017 (ita)
"""
Nasm tool (asm processing)

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python
# encoding: utf-8
# andersg at 0x63.nu 2007
# Thomas Nagy 2016 (ita)
# Thomas Nagy 2016-2017 (ita)
"""
Support for Perl extensions. A C/C++ compiler is required::
@ -24,7 +24,7 @@ Support for Perl extensions. A C/C++ compiler is required::
"""
import os
from waflib import Task, Options, Utils
from waflib import Task, Options, Utils, Errors
from waflib.Configure import conf
from waflib.TaskGen import extension, feature, before_method
@ -99,7 +99,7 @@ def check_perl_module(self, module):
self.start_msg('perl module %s' % module)
try:
r = self.cmd_and_log(cmd)
except Exception:
except Errors.WafError:
self.end_msg(False)
return None
self.end_msg(r or True)

View File

@ -19,7 +19,7 @@ Support for Python, detect the headers and libraries and provide
"""
import os, sys
from waflib import Utils, Options, Errors, Logs, Task, Node
from waflib import Errors, Logs, Node, Options, Task, Utils
from waflib.TaskGen import extension, before_method, after_method, feature
from waflib.Configure import conf
@ -553,7 +553,7 @@ def check_python_module(conf, module_name, condition=''):
conf.start_msg(msg)
try:
ret = conf.cmd_and_log(conf.env.PYTHON + ['-c', PYTHON_MODULE_TEMPLATE % module_name])
except Exception:
except Errors.WafError:
conf.end_msg(False)
conf.fatal('Could not find the python module %r' % module_name)
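A configure() sketch for the module check above; the module names and version condition are illustrative:

def configure(conf):
	conf.load('python')
	conf.check_python_version((2, 7))
	conf.check_python_module('json')
	# the condition string is evaluated against the module version
	conf.check_python_module('setuptools', condition='ver >= num(28, 0)')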

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
"""
This tool helps with finding Qt5 tools and libraries,
@ -61,6 +61,8 @@ The detection uses pkg-config on Linux by default. To force static library detec
QT5_XCOMPILE=1 QT5_FORCE_STATIC=1 waf configure
"""
from __future__ import with_statement
try:
from xml.sax import make_parser
from xml.sax.handler import ContentHandler
@ -149,7 +151,7 @@ class qxx(Task.classes['cxx']):
# direct injection in the build phase (safe because called from the main thread)
gen = self.generator.bld.producer
gen.outstanding.appendleft(tsk)
gen.outstanding.append(tsk)
gen.total += 1
return tsk
@ -314,7 +316,7 @@ def apply_qt5(self):
if getattr(self, 'update', None) and Options.options.trans_qt5:
cxxnodes = [a.inputs[0] for a in self.compiled_tasks] + [
a.inputs[0] for a in self.tasks if getattr(a, 'inputs', None) and a.inputs[0].name.endswith('.ui')]
a.inputs[0] for a in self.tasks if a.inputs and a.inputs[0].name.endswith('.ui')]
for x in qmtasks:
self.create_task('trans_update', cxxnodes, x.inputs)
@ -366,11 +368,8 @@ class rcc(Task.Task):
parser = make_parser()
curHandler = XMLHandler()
parser.setContentHandler(curHandler)
fi = open(self.inputs[0].abspath(), 'r')
try:
parser.parse(fi)
finally:
fi.close()
with open(self.inputs[0].abspath(), 'r') as f:
parser.parse(f)
nodes = []
names = []
@ -657,16 +656,14 @@ def find_qt5_libraries(self):
self.msg('Checking for %s' % i, False, 'YELLOW')
env.append_unique('INCLUDES_' + uselib, os.path.join(env.QTLIBS, frameworkName, 'Headers'))
else:
for j in ('', 'd'):
k = '_DEBUG' if j == 'd' else ''
ret = self.find_single_qt5_lib(i + j, uselib + k, env.QTLIBS, qtincludes, force_static)
if not force_static and not ret:
ret = self.find_single_qt5_lib(i + j, uselib + k, env.QTLIBS, qtincludes, True)
self.msg('Checking for %s' % (i + j), ret, 'GREEN' if ret else 'YELLOW')
ret = self.find_single_qt5_lib(i, uselib, env.QTLIBS, qtincludes, force_static)
if not force_static and not ret:
ret = self.find_single_qt5_lib(i, uselib, env.QTLIBS, qtincludes, True)
self.msg('Checking for %s' % i, ret, 'GREEN' if ret else 'YELLOW')
else:
path = '%s:%s:%s/pkgconfig:/usr/lib/qt5/lib/pkgconfig:/opt/qt5/lib/pkgconfig:/usr/lib/qt5/lib:/opt/qt5/lib' % (
self.environ.get('PKG_CONFIG_PATH', ''), env.QTLIBS, env.QTLIBS)
for i in self.qt5_vars_debug + self.qt5_vars:
for i in self.qt5_vars:
self.check_cfg(package=i, args='--cflags --libs', mandatory=False, force_static=force_static, pkg_config_path=path)
@conf
@ -692,7 +689,6 @@ def simplify_qt5_libs(self):
accu.append(lib)
env['LIBPATH_'+var] = accu
process_lib(self.qt5_vars, 'LIBPATH_QTCORE')
process_lib(self.qt5_vars_debug, 'LIBPATH_QTCORE_DEBUG')
@conf
def add_qt5_rpath(self):
@ -715,7 +711,6 @@ def add_qt5_rpath(self):
accu.append('-Wl,--rpath='+lib)
env['RPATH_' + var] = accu
process_rpath(self.qt5_vars, 'LIBPATH_QTCORE')
process_rpath(self.qt5_vars_debug, 'LIBPATH_QTCORE_DEBUG')
@conf
def set_qt5_libs_to_check(self):
@ -742,10 +737,6 @@ def set_qt5_libs_to_check(self):
if qtextralibs:
self.qt5_vars.extend(qtextralibs.split(','))
if not hasattr(self, 'qt5_vars_debug'):
self.qt5_vars_debug = [a + '_DEBUG' for a in self.qt5_vars]
self.qt5_vars_debug = Utils.to_list(self.qt5_vars_debug)
@conf
def set_qt5_defines(self):
if sys.platform != 'win32':
@ -753,7 +744,6 @@ def set_qt5_defines(self):
for x in self.qt5_vars:
y=x.replace('Qt5', 'Qt')[2:].upper()
self.env.append_unique('DEFINES_%s' % x.upper(), 'QT_%s_LIB' % y)
self.env.append_unique('DEFINES_%s_DEBUG' % x.upper(), 'QT_%s_LIB' % y)
def options(opt):
"""

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python
# encoding: utf-8
# daniel.svensson at purplescout.se 2008
# Thomas Nagy 2016 (ita)
# Thomas Nagy 2016-2017 (ita)
"""
Support for Ruby extensions. A C/C++ compiler is required::
@ -23,7 +23,7 @@ Support for Ruby extensions. A C/C++ compiler is required::
"""
import os
from waflib import Options, Utils, Task
from waflib import Errors, Options, Task, Utils
from waflib.TaskGen import before_method, feature, extension
from waflib.Configure import conf
@ -60,13 +60,13 @@ def check_ruby_version(self, minver=()):
try:
version = self.cmd_and_log(ruby + ['-e', 'puts defined?(VERSION) ? VERSION : RUBY_VERSION']).strip()
except Exception:
except Errors.WafError:
self.fatal('could not determine ruby version')
self.env.RUBY_VERSION = version
try:
ver = tuple(map(int, version.split(".")))
except Exception:
ver = tuple(map(int, version.split('.')))
except Errors.WafError:
self.fatal('unsupported ruby version %r' % version)
cver = ''
@ -151,7 +151,7 @@ def check_ruby_module(self, module_name):
self.start_msg('Ruby module %s' % module_name)
try:
self.cmd_and_log(self.env.RUBY + ['-e', 'require \'%s\';puts 1' % module_name])
except Exception:
except Errors.WafError:
self.end_msg(False)
self.fatal('Could not find the ruby module %r' % module_name)
self.end_msg(True)

View File

@ -1,8 +1,9 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
# Ralf Habacker, 2006 (rh)
from waflib import Errors
from waflib.Tools import ccroot, ar
from waflib.Configure import conf
@ -15,7 +16,7 @@ def find_scc(conf):
cc = conf.find_program('cc', var='CC')
try:
conf.cmd_and_log(cc + ['-flags'])
except Exception:
except Errors.WafError:
conf.fatal('%r is not a Sun compiler' % cc)
v.CC_NAME = 'sun'
conf.get_suncc_version(cc)

View File

@ -1,8 +1,9 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
# Ralf Habacker, 2006 (rh)
from waflib import Errors
from waflib.Tools import ccroot, ar
from waflib.Configure import conf
@ -15,7 +16,7 @@ def find_sxx(conf):
cc = conf.find_program(['CC', 'c++'], var='CXX')
try:
conf.cmd_and_log(cc + ['-flags'])
except Exception:
except Errors.WafError:
conf.fatal('%r is not a Sun compiler' % cc)
v.CXX_NAME = 'sun'
conf.get_suncc_version(cc)

View File

@ -1,6 +1,6 @@
#!/usr/bin/env python
# encoding: utf-8
# Thomas Nagy, 2006-2016 (ita)
# Thomas Nagy, 2006-2017 (ita)
"""
TeX/LaTeX/PDFLaTeX/XeLaTeX support
@ -167,7 +167,6 @@ class tex(Task.Task):
return
seen.append(node)
code = node.read()
global re_tex
for match in re_tex.finditer(code):
multibib = match.group('type')

View File

@ -9,7 +9,7 @@ this tool to be too stable either (apis, etc)
"""
import re
from waflib import Build, Context, Task, Utils, Logs, Options, Errors, Node
from waflib import Build, Context, Errors, Logs, Node, Options, Task, Utils
from waflib.TaskGen import extension, taskgen_method
from waflib.Configure import conf
@ -150,10 +150,8 @@ def init_vala_task(self):
# but this makes it explicit
package_obj.post()
package_name = package_obj.target
for task in package_obj.tasks:
if isinstance(task, Build.inst):
# TODO are we not expecting just valatask here?
continue
task = getattr(package_obj, 'valatask', None)
if task:
for output in task.outputs:
if output.name == package_name + ".vapi":
valatask.set_run_after(task)
@ -274,7 +272,7 @@ def find_valac(self, valac_name, min_version):
valac = self.find_program(valac_name, var='VALAC')
try:
output = self.cmd_and_log(valac + ['--version'])
except Exception:
except Errors.WafError:
valac_version = None
else:
ver = re.search(r'\d+.\d+.\d+', output).group().split('.')

Some files were not shown because too many files have changed in this diff