Imported Upstream version 6.10.0.49

Former-commit-id: 1d6753294b2993e1fbf92de9366bb9544db4189b
Xamarin Public Jenkins (auto-signing)
2020-01-16 16:38:04 +00:00
parent d94e79959b
commit 468663ddbb
48518 changed files with 2789335 additions and 61176 deletions


@@ -0,0 +1,120 @@
scan-build
==========
A package designed to wrap a build so that all calls to gcc/clang are
intercepted and logged into a [compilation database][1] and/or piped to
the clang static analyzer. It includes the intercept-build tool, which logs
the build, as well as the scan-build tool, which logs the build and runs
the clang static analyzer on it.
Portability
-----------
It should work on UNIX operating systems.
- It has been tested on FreeBSD, GNU/Linux and OS X.
- It is prepared to work on Windows, but help is needed to finish that port.
Prerequisites
-------------
1. **python** interpreter (version 2.7, 3.2, 3.3, 3.4 or 3.5).
How to use
----------
To run the Clang static analyzer against a project:

    $ scan-build <your build command>

To generate a compilation database file:

    $ intercept-build <your build command>

To run the Clang static analyzer against a project with an existing
compilation database:

    $ analyze-build

Use `--help` to learn more about each command.
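
The compilation database written by `intercept-build` (and consumed by
`analyze-build`) is the JSON format described in [1]: a list of entries, each
with `directory`, `command` and `file` attributes. As a minimal sketch
(assuming a `compile_commands.json` in the current working directory), the
entries can be inspected with a few lines of Python:

    import json

    # Each entry records the working directory, the exact compiler command,
    # and the source file it compiles.
    with open('compile_commands.json') as handle:
        for entry in json.load(handle):
            print(entry['file'] + ': ' + entry['command'])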
Limitations
-----------
Generally speaking, the `intercept-build` and `analyze-build` tools together
do the same job as `scan-build` does. So you can expect the same output from
this command as a plain `scan-build` run would produce:

    $ intercept-build <your build command> && analyze-build
The major difference is how and when the analyzer is run. The `scan-build`
tool has three distinct modes for running the analyzer:

1. Use compiler wrappers to do the work. The compiler wrappers run both the
   real compiler and the analyzer. This is the default behaviour and can be
   enforced with the `--override-compiler` flag.
2. Use a special library to intercept compiler calls during the build
   process. The analyzer is run against each module after the build has
   finished. Use the `--intercept-first` flag to get this mode.
3. Use compiler wrappers to intercept compiler calls during the build
   process. The analyzer is run against each module after the build has
   finished. Use the `--intercept-first` and `--override-compiler` flags
   together to get this mode.
Modes 1 and 3 use compiler wrappers, which work only if the build process
respects the `CC` and `CXX` environment variables. (Some build processes can
override these variables only as command line parameters. In that case you
need to pass the compiler wrappers manually, e.g. `intercept-build
--override-compiler make CC=intercept-cc CXX=intercept-c++ all` where the
original build command would have been just `make all`.)

Mode 1 runs the analyzer right after the real compilation, so even if the
build process removes intermediate modules (generated sources), the analyzer
output is still kept.

Modes 2 and 3 generate the compilation database first and filter out those
modules which no longer exist, so they are suitable for incremental analysis
during development.

Mode 2 is available only on FreeBSD and Linux, where library preload is
supported by the dynamic loader. It is not supported on OS X (unless the
System Integrity Protection feature is turned off).

The `intercept-build` command uses only modes 2 and 3 to generate the
compilation database. `analyze-build` only runs the analyzer against the
captured compiler calls.
Known problems
--------------
Because it sets the `LD_PRELOAD` or `DYLD_INSERT_LIBRARIES` environment
variable, it does not append to it but overrides it. So builds which
themselves use these variables might not work. (I don't know of any build
tool which does that, but please let me know if you do.)
Problem reports
---------------
If you find a bug in this documentation or elsewhere in the program, or would
like to propose an improvement, please use the project's [issue tracker][3].
Please describe the bug and where you found it. If you have a suggestion for
how to fix it, include that as well. Patches are also welcome.
License
-------
The project is licensed under the University of Illinois/NCSA Open Source License.
See LICENSE.TXT for details.
[1]: http://clang.llvm.org/docs/JSONCompilationDatabase.html
[2]: https://pypi.python.org/pypi/scan-build
[3]: https://llvm.org/bugs/enter_bug.cgi?product=clang


@@ -0,0 +1,17 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
import multiprocessing
multiprocessing.freeze_support()
import sys
import os.path
this_dir = os.path.dirname(os.path.realpath(__file__))
sys.path.append(os.path.dirname(this_dir))
from libscanbuild.analyze import analyze_build
sys.exit(analyze_build())


@@ -0,0 +1,14 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
import sys
import os.path
this_dir = os.path.dirname(os.path.realpath(__file__))
sys.path.append(os.path.dirname(this_dir))
from libscanbuild.analyze import analyze_compiler_wrapper
sys.exit(analyze_compiler_wrapper())


@@ -0,0 +1,14 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
import sys
import os.path
this_dir = os.path.dirname(os.path.realpath(__file__))
sys.path.append(os.path.dirname(this_dir))
from libscanbuild.analyze import analyze_compiler_wrapper
sys.exit(analyze_compiler_wrapper())


@@ -0,0 +1,17 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
import multiprocessing
multiprocessing.freeze_support()
import sys
import os.path
this_dir = os.path.dirname(os.path.realpath(__file__))
sys.path.append(os.path.dirname(this_dir))
from libscanbuild.intercept import intercept_build
sys.exit(intercept_build())


@@ -0,0 +1,14 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
import sys
import os.path
this_dir = os.path.dirname(os.path.realpath(__file__))
sys.path.append(os.path.dirname(this_dir))
from libscanbuild.intercept import intercept_compiler_wrapper
sys.exit(intercept_compiler_wrapper())


@@ -0,0 +1,14 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
import sys
import os.path
this_dir = os.path.dirname(os.path.realpath(__file__))
sys.path.append(os.path.dirname(this_dir))
from libscanbuild.intercept import intercept_compiler_wrapper
sys.exit(intercept_compiler_wrapper())


@@ -0,0 +1,17 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
import multiprocessing
multiprocessing.freeze_support()
import sys
import os.path
this_dir = os.path.dirname(os.path.realpath(__file__))
sys.path.append(os.path.dirname(this_dir))
from libscanbuild.analyze import scan_build
sys.exit(scan_build())


@@ -0,0 +1,260 @@
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
""" This module compiles the intercept library. """
import sys
import os
import os.path
import re
import tempfile
import shutil
import contextlib
import logging
__all__ = ['build_libear']
def build_libear(compiler, dst_dir):
""" Returns the full path to the 'libear' library. """
try:
src_dir = os.path.dirname(os.path.realpath(__file__))
toolset = make_toolset(src_dir)
toolset.set_compiler(compiler)
toolset.set_language_standard('c99')
toolset.add_definitions(['-D_GNU_SOURCE'])
configure = do_configure(toolset)
configure.check_function_exists('execve', 'HAVE_EXECVE')
configure.check_function_exists('execv', 'HAVE_EXECV')
configure.check_function_exists('execvpe', 'HAVE_EXECVPE')
configure.check_function_exists('execvp', 'HAVE_EXECVP')
configure.check_function_exists('execvP', 'HAVE_EXECVP2')
configure.check_function_exists('exect', 'HAVE_EXECT')
configure.check_function_exists('execl', 'HAVE_EXECL')
configure.check_function_exists('execlp', 'HAVE_EXECLP')
configure.check_function_exists('execle', 'HAVE_EXECLE')
configure.check_function_exists('posix_spawn', 'HAVE_POSIX_SPAWN')
configure.check_function_exists('posix_spawnp', 'HAVE_POSIX_SPAWNP')
configure.check_symbol_exists('_NSGetEnviron', 'crt_externs.h',
'HAVE_NSGETENVIRON')
configure.write_by_template(
os.path.join(src_dir, 'config.h.in'),
os.path.join(dst_dir, 'config.h'))
target = create_shared_library('ear', toolset)
target.add_include(dst_dir)
target.add_sources('ear.c')
target.link_against(toolset.dl_libraries())
target.link_against(['pthread'])
target.build_release(dst_dir)
return os.path.join(dst_dir, target.name)
except Exception:
logging.info("Could not build interception library.", exc_info=True)
return None
def execute(cmd, *args, **kwargs):
""" Make subprocess execution silent. """
import subprocess
kwargs.update({'stdout': subprocess.PIPE, 'stderr': subprocess.STDOUT})
return subprocess.check_call(cmd, *args, **kwargs)
@contextlib.contextmanager
def TemporaryDirectory(**kwargs):
name = tempfile.mkdtemp(**kwargs)
try:
yield name
finally:
shutil.rmtree(name)
class Toolset(object):
""" Abstract class to represent different toolset. """
def __init__(self, src_dir):
self.src_dir = src_dir
self.compiler = None
self.c_flags = []
def set_compiler(self, compiler):
""" part of public interface """
self.compiler = compiler
def set_language_standard(self, standard):
""" part of public interface """
self.c_flags.append('-std=' + standard)
def add_definitions(self, defines):
""" part of public interface """
self.c_flags.extend(defines)
def dl_libraries(self):
raise NotImplementedError()
def shared_library_name(self, name):
raise NotImplementedError()
def shared_library_c_flags(self, release):
extra = ['-DNDEBUG', '-O3'] if release else []
return extra + ['-fPIC'] + self.c_flags
def shared_library_ld_flags(self, release, name):
raise NotImplementedError()
class DarwinToolset(Toolset):
def __init__(self, src_dir):
Toolset.__init__(self, src_dir)
def dl_libraries(self):
return []
def shared_library_name(self, name):
return 'lib' + name + '.dylib'
def shared_library_ld_flags(self, release, name):
extra = ['-dead_strip'] if release else []
return extra + ['-dynamiclib', '-install_name', '@rpath/' + name]
class UnixToolset(Toolset):
def __init__(self, src_dir):
Toolset.__init__(self, src_dir)
def dl_libraries(self):
return []
def shared_library_name(self, name):
return 'lib' + name + '.so'
def shared_library_ld_flags(self, release, name):
extra = [] if release else []
return extra + ['-shared', '-Wl,-soname,' + name]
class LinuxToolset(UnixToolset):
def __init__(self, src_dir):
UnixToolset.__init__(self, src_dir)
def dl_libraries(self):
return ['dl']
def make_toolset(src_dir):
platform = sys.platform
if platform in {'win32', 'cygwin'}:
raise RuntimeError('not implemented on this platform')
elif platform == 'darwin':
return DarwinToolset(src_dir)
elif platform in {'linux', 'linux2'}:
return LinuxToolset(src_dir)
else:
return UnixToolset(src_dir)
class Configure(object):
def __init__(self, toolset):
self.ctx = toolset
self.results = {'APPLE': sys.platform == 'darwin'}
def _try_to_compile_and_link(self, source):
try:
with TemporaryDirectory() as work_dir:
src_file = 'check.c'
with open(os.path.join(work_dir, src_file), 'w') as handle:
handle.write(source)
execute([self.ctx.compiler, src_file] + self.ctx.c_flags,
cwd=work_dir)
return True
except Exception:
return False
def check_function_exists(self, function, name):
template = "int FUNCTION(); int main() { return FUNCTION(); }"
source = template.replace("FUNCTION", function)
logging.debug('Checking function %s', function)
found = self._try_to_compile_and_link(source)
logging.debug('Checking function %s -- %s', function,
'found' if found else 'not found')
self.results.update({name: found})
def check_symbol_exists(self, symbol, include, name):
template = """#include <INCLUDE>
int main() { return ((int*)(&SYMBOL))[0]; }"""
source = template.replace('INCLUDE', include).replace("SYMBOL", symbol)
logging.debug('Checking symbol %s', symbol)
found = self._try_to_compile_and_link(source)
logging.debug('Checking symbol %s -- %s', symbol,
'found' if found else 'not found')
self.results.update({name: found})
def write_by_template(self, template, output):
def transform(line, definitions):
pattern = re.compile(r'^#cmakedefine\s+(\S+)')
m = pattern.match(line)
if m:
key = m.group(1)
if key not in definitions or not definitions[key]:
return '/* #undef {0} */{1}'.format(key, os.linesep)
else:
return '#define {0}{1}'.format(key, os.linesep)
return line
with open(template, 'r') as src_handle:
logging.debug('Writing config to %s', output)
with open(output, 'w') as dst_handle:
for line in src_handle:
dst_handle.write(transform(line, self.results))
def do_configure(toolset):
return Configure(toolset)
class SharedLibrary(object):
def __init__(self, name, toolset):
self.name = toolset.shared_library_name(name)
self.ctx = toolset
self.inc = []
self.src = []
self.lib = []
def add_include(self, directory):
self.inc.extend(['-I', directory])
def add_sources(self, source):
self.src.append(source)
def link_against(self, libraries):
self.lib.extend(['-l' + lib for lib in libraries])
def build_release(self, directory):
for src in self.src:
logging.debug('Compiling %s', src)
execute(
[self.ctx.compiler, '-c', os.path.join(self.ctx.src_dir, src),
'-o', src + '.o'] + self.inc +
self.ctx.shared_library_c_flags(True),
cwd=directory)
logging.debug('Linking %s', self.name)
execute(
[self.ctx.compiler] + [src + '.o' for src in self.src] +
['-o', self.name] + self.lib +
self.ctx.shared_library_ld_flags(True, self.name),
cwd=directory)
def create_shared_library(name, toolset):
return SharedLibrary(name, toolset)
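# Illustrative usage sketch (mirrors libscanbuild.intercept.setup_environment):
#
#   libear_path = build_libear('cc', '/tmp/intercept-abc123')
#   # -> '/tmp/intercept-abc123/libear.so' on Linux (suitable for LD_PRELOAD),
#   #    or None when the interception library could not be built.
# The '/tmp/intercept-abc123' directory name is hypothetical; the caller
# normally passes a freshly created temporary directory.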


@@ -0,0 +1,23 @@
/* -*- coding: utf-8 -*-
// The LLVM Compiler Infrastructure
//
// This file is distributed under the University of Illinois Open Source
// License. See LICENSE.TXT for details.
*/
#pragma once
#cmakedefine HAVE_EXECVE
#cmakedefine HAVE_EXECV
#cmakedefine HAVE_EXECVPE
#cmakedefine HAVE_EXECVP
#cmakedefine HAVE_EXECVP2
#cmakedefine HAVE_EXECT
#cmakedefine HAVE_EXECL
#cmakedefine HAVE_EXECLP
#cmakedefine HAVE_EXECLE
#cmakedefine HAVE_POSIX_SPAWN
#cmakedefine HAVE_POSIX_SPAWNP
#cmakedefine HAVE_NSGETENVIRON
#cmakedefine APPLE

File diff suppressed because it is too large


@@ -0,0 +1,205 @@
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
""" This module is a collection of methods commonly used in this project. """
import collections
import functools
import json
import logging
import os
import os.path
import re
import shlex
import subprocess
import sys
ENVIRONMENT_KEY = 'INTERCEPT_BUILD'
Execution = collections.namedtuple('Execution', ['pid', 'cwd', 'cmd'])
def duplicate_check(method):
""" Predicate to detect duplicated entries.
Unique hash method can be use to detect duplicates. Entries are
represented as dictionaries, which has no default hash method.
This implementation uses a set datatype to store the unique hash values.
This method returns a method which can detect the duplicate values. """
def predicate(entry):
entry_hash = predicate.unique(entry)
if entry_hash not in predicate.state:
predicate.state.add(entry_hash)
return False
return True
predicate.unique = method
predicate.state = set()
return predicate
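# Illustrative usage sketch (mirrors how libscanbuild.intercept.capture uses
# it with entry_hash), given an iterable of entry dictionaries:
#
#   is_duplicate = duplicate_check(entry_hash)
#   unique_entries = (entry for entry in entries if not is_duplicate(entry))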
def run_build(command, *args, **kwargs):
""" Run and report build command execution
:param command: array of tokens
:return: exit code of the process
"""
environment = kwargs.get('env', os.environ)
logging.debug('run build %s, in environment: %s', command, environment)
exit_code = subprocess.call(command, *args, **kwargs)
logging.debug('build finished with exit code: %d', exit_code)
return exit_code
def run_command(command, cwd=None):
""" Run a given command and report the execution.
:param command: array of tokens
:param cwd: the working directory where the command will be executed
:return: output of the command
"""
def decode_when_needed(result):
""" check_output returns bytes or string depend on python version """
return result.decode('utf-8') if isinstance(result, bytes) else result
try:
directory = os.path.abspath(cwd) if cwd else os.getcwd()
logging.debug('exec command %s in %s', command, directory)
output = subprocess.check_output(command,
cwd=directory,
stderr=subprocess.STDOUT)
return decode_when_needed(output).splitlines()
except subprocess.CalledProcessError as ex:
ex.output = decode_when_needed(ex.output).splitlines()
raise ex
def reconfigure_logging(verbose_level):
""" Reconfigure logging level and format based on the verbose flag.
:param verbose_level: number of `-v` flags received by the command
:return: no return value
"""
# Exit when nothing to do.
if verbose_level == 0:
return
root = logging.getLogger()
# Tune logging level.
level = logging.WARNING - min(logging.WARNING, (10 * verbose_level))
root.setLevel(level)
# Be verbose with messages.
if verbose_level <= 3:
fmt_string = '%(name)s: %(levelname)s: %(message)s'
else:
fmt_string = '%(name)s: %(levelname)s: %(funcName)s: %(message)s'
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(logging.Formatter(fmt=fmt_string))
root.handlers = [handler]
def command_entry_point(function):
""" Decorator for command entry methods.
The decorator initializes/shuts down logging and guards against programming
errors (catches exceptions).
The decorated method can have arbitrary parameters; the return value will
be the exit code of the process. """
@functools.wraps(function)
def wrapper(*args, **kwargs):
""" Do housekeeping tasks and execute the wrapped method. """
try:
logging.basicConfig(format='%(name)s: %(message)s',
level=logging.WARNING,
stream=sys.stdout)
# This hack gets the executable name as %(name).
logging.getLogger().name = os.path.basename(sys.argv[0])
return function(*args, **kwargs)
except KeyboardInterrupt:
logging.warning('Keyboard interrupt')
return 130 # Signal received exit code for bash.
except Exception:
logging.exception('Internal error.')
if logging.getLogger().isEnabledFor(logging.DEBUG):
logging.error("Please report this bug and attach the output "
"to the bug report")
else:
logging.error("Please run this command again and turn on "
"verbose mode (add '-vvvv' as argument).")
return 64  # An otherwise unused exit code, for internal errors.
finally:
logging.shutdown()
return wrapper
def compiler_wrapper(function):
""" Implements compiler wrapper base functionality.
A compiler wrapper executes the real compiler, then implements some
extra functionality, then returns with the real compiler's exit code.
:param function: the extra functionality that the wrapper wants to
perform on top of the compiler call. If it throws an exception, it will be
caught and logged.
:return: the exit code of the real compiler.
The :param function: will receive the following arguments:
:param result: the exit code of the compilation.
:param execution: the command executed by the wrapper. """
def is_cxx_compiler():
""" Find out was it a C++ compiler call. Compiler wrapper names
contain the compiler type. C++ compiler wrappers ends with `c++`,
but might have `.exe` extension on windows. """
wrapper_command = os.path.basename(sys.argv[0])
return re.match(r'(.+)c\+\+(.*)', wrapper_command)
def run_compiler(executable):
""" Execute compilation with the real compiler. """
command = executable + sys.argv[1:]
logging.debug('compilation: %s', command)
result = subprocess.call(command)
logging.debug('compilation exit code: %d', result)
return result
# Get relevant parameters from environment.
parameters = json.loads(os.environ[ENVIRONMENT_KEY])
reconfigure_logging(parameters['verbose'])
# Execute the requested compilation. Let it crash if anything goes wrong.
cxx = is_cxx_compiler()
compiler = parameters['cxx'] if cxx else parameters['cc']
result = run_compiler(compiler)
# Call the wrapped method and ignore its return value.
try:
call = Execution(
pid=os.getpid(),
cwd=os.getcwd(),
cmd=['c++' if cxx else 'cc'] + sys.argv[1:])
function(result, call)
except:
logging.exception('Compiler wrapper failed to complete.')
finally:
# Always return the real compiler exit code.
return result
def wrapper_environment(args):
""" Set up environment for interpose compiler wrapper."""
return {
ENVIRONMENT_KEY: json.dumps({
'verbose': args.verbose,
'cc': shlex.split(args.cc),
'cxx': shlex.split(args.cxx)
})
}

File diff suppressed because it is too large


@@ -0,0 +1,431 @@
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
""" This module parses and validates arguments for command-line interfaces.
It uses the argparse module to create the command line parser. (This library
has been in the standard python library since 3.2 and was backported to 2.7,
but no earlier.)
It also implements basic validation methods related to the commands.
Validations mostly call specific help methods or mangle values.
"""
import os
import sys
import argparse
import logging
import tempfile
from libscanbuild import reconfigure_logging
from libscanbuild.clang import get_checkers
__all__ = ['parse_args_for_intercept_build', 'parse_args_for_analyze_build',
'parse_args_for_scan_build']
def parse_args_for_intercept_build():
""" Parse and validate command-line arguments for intercept-build. """
parser = create_intercept_parser()
args = parser.parse_args()
reconfigure_logging(args.verbose)
logging.debug('Raw arguments %s', sys.argv)
# short validation logic
if not args.build:
parser.error(message='missing build command')
logging.debug('Parsed arguments: %s', args)
return args
def parse_args_for_analyze_build():
""" Parse and validate command-line arguments for analyze-build. """
from_build_command = False
parser = create_analyze_parser(from_build_command)
args = parser.parse_args()
reconfigure_logging(args.verbose)
logging.debug('Raw arguments %s', sys.argv)
normalize_args_for_analyze(args, from_build_command)
validate_args_for_analyze(parser, args, from_build_command)
logging.debug('Parsed arguments: %s', args)
return args
def parse_args_for_scan_build():
""" Parse and validate command-line arguments for scan-build. """
from_build_command = True
parser = create_analyze_parser(from_build_command)
args = parser.parse_args()
reconfigure_logging(args.verbose)
logging.debug('Raw arguments %s', sys.argv)
normalize_args_for_analyze(args, from_build_command)
validate_args_for_analyze(parser, args, from_build_command)
logging.debug('Parsed arguments: %s', args)
return args
def normalize_args_for_analyze(args, from_build_command):
""" Normalize parsed arguments for analyze-build and scan-build.
:param args: Parsed argument object. (Will be mutated.)
:param from_build_command: Boolean value which tells whether the command is
supposed to run the analyzer against a build command or a compilation db. """
# make plugins always a list. (it might be None when not specified.)
if args.plugins is None:
args.plugins = []
# make exclude directory list unique and absolute.
uniq_excludes = set(os.path.abspath(entry) for entry in args.excludes)
args.excludes = list(uniq_excludes)
# because the code is shared between all tools, some commonly used methods
# expect certain arguments to be present. so, instead of querying the args
# object about the presence of the flag, we fake it here, to make those
# methods more readable. (it's an arguable choice; it was done only for
# arguments which have a good default value.)
if from_build_command:
# add the cdb parameter invisibly to make the report module work.
args.cdb = 'compile_commands.json'
def validate_args_for_analyze(parser, args, from_build_command):
""" Command line parsing is done by the argparse module, but semantic
validation still needs to be done. This method is doing it for
analyze-build and scan-build commands.
:param parser: The command line parser object.
:param args: Parsed argument object.
:param from_build_command: Boolean value which tells whether the command is
supposed to run the analyzer against a build command or a compilation db.
:return: No return value, but this call might throw when validation
fails. """
if args.help_checkers_verbose:
print_checkers(get_checkers(args.clang, args.plugins))
parser.exit(status=0)
elif args.help_checkers:
print_active_checkers(get_checkers(args.clang, args.plugins))
parser.exit(status=0)
elif from_build_command and not args.build:
parser.error(message='missing build command')
elif not from_build_command and not os.path.exists(args.cdb):
parser.error(message='compilation database is missing')
def create_intercept_parser():
""" Creates a parser for command-line arguments to 'intercept'. """
parser = create_default_parser()
parser_add_cdb(parser)
parser_add_prefer_wrapper(parser)
parser_add_compilers(parser)
advanced = parser.add_argument_group('advanced options')
group = advanced.add_mutually_exclusive_group()
group.add_argument(
'--append',
action='store_true',
help="""Extend existing compilation database with new entries.
Duplicate entries are detected and not present in the final output.
The output is not continuously updated, it's done when the build
command finished. """)
parser.add_argument(
dest='build', nargs=argparse.REMAINDER, help="""Command to run.""")
return parser
def create_analyze_parser(from_build_command):
""" Creates a parser for command-line arguments to 'analyze'. """
parser = create_default_parser()
if from_build_command:
parser_add_prefer_wrapper(parser)
parser_add_compilers(parser)
parser.add_argument(
'--intercept-first',
action='store_true',
help="""Run the build commands first, intercept compiler
calls and then run the static analyzer afterwards.
Generally speaking it has better coverage on build commands.
With '--override-compiler' it use compiler wrapper, but does
not run the analyzer till the build is finished.""")
else:
parser_add_cdb(parser)
parser.add_argument(
'--status-bugs',
action='store_true',
help="""The exit status of '%(prog)s' is the same as the executed
build command. This option ignores the build exit status and sets to
be non zero if it found potential bugs or zero otherwise.""")
parser.add_argument(
'--exclude',
metavar='<directory>',
dest='excludes',
action='append',
default=[],
help="""Do not run static analyzer against files found in this
directory. (You can specify this option multiple times.)
Could be useful when project contains 3rd party libraries.""")
output = parser.add_argument_group('output control options')
output.add_argument(
'--output',
'-o',
metavar='<path>',
default=tempfile.gettempdir(),
help="""Specifies the output directory for analyzer reports.
Subdirectory will be created if default directory is targeted.""")
output.add_argument(
'--keep-empty',
action='store_true',
help="""Don't remove the build results directory even if no issues
were reported.""")
output.add_argument(
'--html-title',
metavar='<title>',
help="""Specify the title used on generated HTML pages.
If not specified, a default title will be used.""")
format_group = output.add_mutually_exclusive_group()
format_group.add_argument(
'--plist',
'-plist',
dest='output_format',
const='plist',
default='html',
action='store_const',
help="""Cause the results as a set of .plist files.""")
format_group.add_argument(
'--plist-html',
'-plist-html',
dest='output_format',
const='plist-html',
default='html',
action='store_const',
help="""Cause the results as a set of .html and .plist files.""")
# TODO: implement '-view '
advanced = parser.add_argument_group('advanced options')
advanced.add_argument(
'--use-analyzer',
metavar='<path>',
dest='clang',
default='clang',
help="""'%(prog)s' uses the 'clang' executable relative to itself for
static analysis. One can override this behavior with this option by
using the 'clang' packaged with Xcode (on OS X) or from the PATH.""")
advanced.add_argument(
'--no-failure-reports',
'-no-failure-reports',
dest='output_failures',
action='store_false',
help="""Do not create a 'failures' subdirectory that includes analyzer
crash reports and preprocessed source files.""")
parser.add_argument(
'--analyze-headers',
action='store_true',
help="""Also analyze functions in #included files. By default, such
functions are skipped unless they are called by functions within the
main source file.""")
advanced.add_argument(
'--stats',
'-stats',
action='store_true',
help="""Generates visitation statistics for the project.""")
advanced.add_argument(
'--internal-stats',
action='store_true',
help="""Generate internal analyzer statistics.""")
advanced.add_argument(
'--maxloop',
'-maxloop',
metavar='<loop count>',
type=int,
help="""Specifiy the number of times a block can be visited before
giving up. Increase for more comprehensive coverage at a cost of
speed.""")
advanced.add_argument(
'--store',
'-store',
metavar='<model>',
dest='store_model',
choices=['region', 'basic'],
help="""Specify the store model used by the analyzer. 'region'
specifies a field- sensitive store model. 'basic' which is far less
precise but can more quickly analyze code. 'basic' was the default
store model for checker-0.221 and earlier.""")
advanced.add_argument(
'--constraints',
'-constraints',
metavar='<model>',
dest='constraints_model',
choices=['range', 'basic'],
help="""Specify the constraint engine used by the analyzer. Specifying
'basic' uses a simpler, less powerful constraint model used by
checker-0.160 and earlier.""")
advanced.add_argument(
'--analyzer-config',
'-analyzer-config',
metavar='<options>',
help="""Provide options to pass through to the analyzer's
-analyzer-config flag. Several options are separated with comma:
'key1=val1,key2=val2'
Available options:
stable-report-filename=true or false (default)
Switch the page naming to:
report-<filename>-<function/method name>-<id>.html
instead of report-XXXXXX.html""")
advanced.add_argument(
'--force-analyze-debug-code',
dest='force_debug',
action='store_true',
help="""Tells analyzer to enable assertions in code even if they were
disabled during compilation, enabling more precise results.""")
plugins = parser.add_argument_group('checker options')
plugins.add_argument(
'--load-plugin',
'-load-plugin',
metavar='<plugin library>',
dest='plugins',
action='append',
help="""Loading external checkers using the clang plugin interface.""")
plugins.add_argument(
'--enable-checker',
'-enable-checker',
metavar='<checker name>',
action=AppendCommaSeparated,
help="""Enable specific checker.""")
plugins.add_argument(
'--disable-checker',
'-disable-checker',
metavar='<checker name>',
action=AppendCommaSeparated,
help="""Disable specific checker.""")
plugins.add_argument(
'--help-checkers',
action='store_true',
help="""A default group of checkers is run unless explicitly disabled.
Exactly which checkers constitute the default group is a function of
the operating system in use. These can be printed with this flag.""")
plugins.add_argument(
'--help-checkers-verbose',
action='store_true',
help="""Print all available checkers and mark the enabled ones.""")
if from_build_command:
parser.add_argument(
dest='build', nargs=argparse.REMAINDER, help="""Command to run.""")
return parser
def create_default_parser():
""" Creates command line parser for all build wrapper commands. """
parser = argparse.ArgumentParser(
formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument(
'--verbose',
'-v',
action='count',
default=0,
help="""Enable verbose output from '%(prog)s'. A second, third and
fourth flags increases verbosity.""")
return parser
def parser_add_cdb(parser):
parser.add_argument(
'--cdb',
metavar='<file>',
default="compile_commands.json",
help="""The JSON compilation database.""")
def parser_add_prefer_wrapper(parser):
parser.add_argument(
'--override-compiler',
action='store_true',
help="""Always resort to the compiler wrapper even when better
intercept methods are available.""")
def parser_add_compilers(parser):
parser.add_argument(
'--use-cc',
metavar='<path>',
dest='cc',
default=os.getenv('CC', 'cc'),
help="""When '%(prog)s' analyzes a project by interposing a compiler
wrapper, which executes a real compiler for compilation and do other
tasks (record the compiler invocation). Because of this interposing,
'%(prog)s' does not know what compiler your project normally uses.
Instead, it simply overrides the CC environment variable, and guesses
your default compiler.
If you need '%(prog)s' to use a specific compiler for *compilation*
then you can use this option to specify a path to that compiler.""")
parser.add_argument(
'--use-c++',
metavar='<path>',
dest='cxx',
default=os.getenv('CXX', 'c++'),
help="""This is the same as "--use-cc" but for C++ code.""")
class AppendCommaSeparated(argparse.Action):
""" argparse Action class to support multiple comma separated lists. """
def __call__(self, __parser, namespace, values, __option_string):
# getattr(obj, attr, default) does not really return the default, but None
if getattr(namespace, self.dest, None) is None:
setattr(namespace, self.dest, [])
# once it's fixed up we can use it as expected
actual = getattr(namespace, self.dest)
actual.extend(values.split(','))
setattr(namespace, self.dest, actual)
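# Illustrative sketch: with this action, both of the invocations below end up
# with args.enable_checker == ['core.DivideZero', 'unix.Malloc']:
#
#   scan-build --enable-checker core.DivideZero,unix.Malloc <build command>
#   scan-build --enable-checker core.DivideZero --enable-checker unix.Malloc <build command>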
def print_active_checkers(checkers):
""" Print active checkers to stdout. """
for name in sorted(name for name, (_, active) in checkers.items()
if active):
print(name)
def print_checkers(checkers):
""" Print verbose checker help to stdout. """
print('')
print('available checkers:')
print('')
for name in sorted(checkers.keys()):
description, active = checkers[name]
prefix = '+' if active else ' '
if len(name) > 30:
print(' {0} {1}'.format(prefix, name))
print(' ' * 35 + description)
else:
print(' {0} {1: <30} {2}'.format(prefix, name, description))
print('')
print('NOTE: "+" indicates that an analysis is enabled by default.')
print('')


@@ -0,0 +1,154 @@
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
""" This module is responsible for the Clang executable.
Since the Clang command line interface is very rich, but this project uses
only a subset of it, it makes sense to create function-specific wrappers. """
import re
from libscanbuild import run_command
from libscanbuild.shell import decode
__all__ = ['get_version', 'get_arguments', 'get_checkers']
# regex for activated checker
ACTIVE_CHECKER_PATTERN = re.compile(r'^-analyzer-checker=(.*)$')
def get_version(clang):
""" Returns the compiler version as string.
:param clang: the compiler we are using
:return: the version string printed to stderr """
output = run_command([clang, '-v'])
# the relevant version info is in the first line
return output[0]
def get_arguments(command, cwd):
""" Capture Clang invocation.
:param command: the compilation command
:param cwd: the current working directory
:return: the detailed front-end invocation command """
cmd = command[:]
cmd.insert(1, '-###')
output = run_command(cmd, cwd=cwd)
# The relevant information is in the last line of the output.
# Don't check if finding last line fails, would throw exception anyway.
last_line = output[-1]
if re.search(r'clang(.*): error:', last_line):
raise Exception(last_line)
return decode(last_line)
def get_active_checkers(clang, plugins):
""" Get the active checker list.
:param clang: the compiler we are using
:param plugins: list of plugins which were requested by the user
:return: list of checker names which are active
To get the default checkers we execute Clang to print how this
compilation would be called, and take the enabled checkers from the
arguments. For the input file we specify stdin and pass only language
information. """
def get_active_checkers_for(language):
""" Returns a list of active checkers for the given language. """
load_args = [arg
for plugin in plugins
for arg in ['-Xclang', '-load', '-Xclang', plugin]]
cmd = [clang, '--analyze'] + load_args + ['-x', language, '-']
return [ACTIVE_CHECKER_PATTERN.match(arg).group(1)
for arg in get_arguments(cmd, '.')
if ACTIVE_CHECKER_PATTERN.match(arg)]
result = set()
for language in ['c', 'c++', 'objective-c', 'objective-c++']:
result.update(get_active_checkers_for(language))
return frozenset(result)
def is_active(checkers):
""" Returns a method, which classifies the checker active or not,
based on the received checker name list. """
def predicate(checker):
""" Returns True if the given checker is active. """
return any(pattern.match(checker) for pattern in predicate.patterns)
predicate.patterns = [re.compile(r'^' + a + r'(\.|$)') for a in checkers]
return predicate
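# Illustrative sketch: is_active(['core', 'unix'])('core.DivideZero') is
# truthy, while the same predicate rejects 'cplusplus.NewDelete', because
# only the 'core' and 'unix' checker prefixes were listed as active.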
def parse_checkers(stream):
""" Parse clang -analyzer-checker-help output.
Below the line 'CHECKERS:' are the name-description pairs.
Many of them fit on one line, but some long-named checkers have the
name and the description on separate lines.
The checker name is always prefixed with two space characters. The
name contains no whitespace. Then, followed by a newline (if it's
too long) or other space characters, comes the description of the
checker. The description ends with a newline character.
:param stream: list of lines to parse
:return: generator of tuples
(<checker name>, <checker description>) """
lines = iter(stream)
# find checkers header
for line in lines:
if re.match(r'^CHECKERS:', line):
break
# find entries
state = None
for line in lines:
if state and not re.match(r'^\s\s\S', line):
yield (state, line.strip())
state = None
elif re.match(r'^\s\s\S+$', line.rstrip()):
state = line.strip()
else:
pattern = re.compile(r'^\s\s(?P<key>\S*)\s*(?P<value>.*)')
match = pattern.match(line.rstrip())
if match:
current = match.groupdict()
yield (current['key'], current['value'])
def get_checkers(clang, plugins):
""" Get all the available checkers from default and from the plugins.
:param clang: the compiler we are using
:param plugins: list of plugins which were requested by the user
:return: a dictionary of all available checkers and their status
{<checker name>: (<checker description>, <is active by default>)} """
load = [elem for plugin in plugins for elem in ['-load', plugin]]
cmd = [clang, '-cc1'] + load + ['-analyzer-checker-help']
lines = run_command(cmd)
is_active_checker = is_active(get_active_checkers(clang, plugins))
checkers = {
name: (description, is_active_checker(name))
for name, description in parse_checkers(lines)
}
if not checkers:
raise Exception('Could not query Clang for available checkers.')
return checkers


@@ -0,0 +1,141 @@
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
""" This module is responsible for to parse a compiler invocation. """
import re
import os
import collections
__all__ = ['split_command', 'classify_source', 'compiler_language']
# Ignored compiler options map for compilation database creation.
# The map is used by the `split_command` method (which ignores and classifies
# parameters). Please note that these are not the only parameters which
# might be ignored.
#
# Keys are the option names, values are the number of option arguments to skip.
IGNORED_FLAGS = {
# compiling only flag, ignored because the creator of compilation
# database will explicitly set it.
'-c': 0,
# preprocessor dependency generation flags, ignored because they would cause
# duplicate entries in the output (the only difference would be these flags).
# this is an actual finding from users, who suffered longer execution times
# caused by the duplicates.
'-MD': 0,
'-MMD': 0,
'-MG': 0,
'-MP': 0,
'-MF': 1,
'-MT': 1,
'-MQ': 1,
# linker options, ignored because the compilation database will contain
# compilation commands only. so, the compiler would ignore these flags
# anyway. the benefit of getting rid of them is to make the output more
# readable.
'-static': 0,
'-shared': 0,
'-s': 0,
'-rdynamic': 0,
'-l': 1,
'-L': 1,
'-u': 1,
'-z': 1,
'-T': 1,
'-Xlinker': 1
}
# Known C/C++ compiler executable name patterns
COMPILER_PATTERNS = frozenset([
re.compile(r'^(intercept-|analyze-|)c(c|\+\+)$'),
re.compile(r'^([^-]*-)*[mg](cc|\+\+)(-\d+(\.\d+){0,2})?$'),
re.compile(r'^([^-]*-)*clang(\+\+)?(-\d+(\.\d+){0,2})?$'),
re.compile(r'^llvm-g(cc|\+\+)$'),
])
def split_command(command):
""" Returns a value when the command is a compilation, None otherwise.
The value on success is a named tuple with the following attributes:
files: list of source files
flags: list of compile options
compiler: string value of 'c' or 'c++' """
# the result of this method
result = collections.namedtuple('Compilation',
['compiler', 'flags', 'files'])
result.compiler = compiler_language(command)
result.flags = []
result.files = []
# quit right now, if the program was not a C/C++ compiler
if not result.compiler:
return None
# iterate on the compile options
args = iter(command[1:])
for arg in args:
# quit when compilation pass is not involved
if arg in {'-E', '-S', '-cc1', '-M', '-MM', '-###'}:
return None
# ignore some flags
elif arg in IGNORED_FLAGS:
count = IGNORED_FLAGS[arg]
for _ in range(count):
next(args)
elif re.match(r'^-(l|L|Wl,).+', arg):
pass
# some parameters could look like a filename, take them as compile options
elif arg in {'-D', '-I'}:
result.flags.extend([arg, next(args)])
# a parameter which looks like a source file is taken...
elif re.match(r'^[^-].+', arg) and classify_source(arg):
result.files.append(arg)
# and consider everything else as compile option.
else:
result.flags.append(arg)
# do extra check on number of source files
return result if result.files else None
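# Illustrative sketch of the classification above:
#
#   split_command(['cc', '-c', '-o', 'out.o', 'foo.c'])
#   # -> compiler='c', flags=['-o', 'out.o'], files=['foo.c']
#   split_command(['cc', '-E', 'foo.c'])
#   # -> None (preprocessing only, no compilation pass)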
def classify_source(filename, c_compiler=True):
""" Return the language from file name extension. """
mapping = {
'.c': 'c' if c_compiler else 'c++',
'.i': 'c-cpp-output' if c_compiler else 'c++-cpp-output',
'.ii': 'c++-cpp-output',
'.m': 'objective-c',
'.mi': 'objective-c-cpp-output',
'.mm': 'objective-c++',
'.mii': 'objective-c++-cpp-output',
'.C': 'c++',
'.cc': 'c++',
'.CC': 'c++',
'.cp': 'c++',
'.cpp': 'c++',
'.cxx': 'c++',
'.c++': 'c++',
'.C++': 'c++',
'.txx': 'c++'
}
__, extension = os.path.splitext(os.path.basename(filename))
return mapping.get(extension)
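# Illustrative sketch: classify_source('main.cpp') returns 'c++', while
# classify_source('foo.i') returns 'c-cpp-output' and, with c_compiler=False,
# 'c++-cpp-output'; unknown extensions such as '.o' map to None.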
def compiler_language(command):
""" A predicate to decide the command is a compiler call or not.
Returns 'c' or 'c++' when it match. None otherwise. """
cplusplus = re.compile(r'^(.+)(\+\+)(-.+|)$')
if command:
executable = os.path.basename(command[0])
if any(pattern.match(executable) for pattern in COMPILER_PATTERNS):
return 'c++' if cplusplus.match(executable) else 'c'
return None


@@ -0,0 +1,263 @@
# -*- coding: utf-8 -*-
# The LLVM Compiler Infrastructure
#
# This file is distributed under the University of Illinois Open Source
# License. See LICENSE.TXT for details.
""" This module is responsible to capture the compiler invocation of any
build process. The result of that should be a compilation database.
This implementation is using the LD_PRELOAD or DYLD_INSERT_LIBRARIES
mechanisms provided by the dynamic linker. The related library is implemented
in C language and can be found under 'libear' directory.
The 'libear' library is capturing all child process creation and logging the
relevant information about it into separate files in a specified directory.
The parameter of this process is the output directory name, where the report
files shall be placed. This parameter is passed as an environment variable.
The module also implements compiler wrappers to intercept the compiler calls.
The module implements the build command execution and the post-processing of
the output files, which will condensates into a compilation database. """
import sys
import os
import os.path
import re
import itertools
import json
import glob
import logging
from libear import build_libear, TemporaryDirectory
from libscanbuild import command_entry_point, compiler_wrapper, \
wrapper_environment, run_command, run_build
from libscanbuild import duplicate_check
from libscanbuild.compilation import split_command
from libscanbuild.arguments import parse_args_for_intercept_build
from libscanbuild.shell import encode, decode
__all__ = ['capture', 'intercept_build', 'intercept_compiler_wrapper']
GS = chr(0x1d)
RS = chr(0x1e)
US = chr(0x1f)
COMPILER_WRAPPER_CC = 'intercept-cc'
COMPILER_WRAPPER_CXX = 'intercept-c++'
TRACE_FILE_EXTENSION = '.cmd' # same as in ear.c
WRAPPER_ONLY_PLATFORMS = frozenset({'win32', 'cygwin'})
@command_entry_point
def intercept_build():
""" Entry point for 'intercept-build' command. """
args = parse_args_for_intercept_build()
return capture(args)
def capture(args):
""" The entry point of build command interception. """
def post_processing(commands):
""" To make a compilation database, it needs to filter out commands
which are not compiler calls. Needs to find the source file name
from the arguments. And do shell escaping on the command.
To support incremental builds, it is desired to read elements from
an existing compilation database from a previous run. These elements
shall be merged with the new elements. """
# create entries from the current run
current = itertools.chain.from_iterable(
# creates a sequence of entry generators from an exec,
format_entry(command) for command in commands)
# read entries from previous run
if 'append' in args and args.append and os.path.isfile(args.cdb):
with open(args.cdb) as handle:
previous = iter(json.load(handle))
else:
previous = iter([])
# filter out duplicate entries from both
duplicate = duplicate_check(entry_hash)
return (entry
for entry in itertools.chain(previous, current)
if os.path.exists(entry['file']) and not duplicate(entry))
with TemporaryDirectory(prefix='intercept-') as tmp_dir:
# run the build command
environment = setup_environment(args, tmp_dir)
exit_code = run_build(args.build, env=environment)
# read the intercepted exec calls
exec_traces = itertools.chain.from_iterable(
parse_exec_trace(os.path.join(tmp_dir, filename))
for filename in sorted(glob.iglob(os.path.join(tmp_dir, '*.cmd'))))
# do post processing
entries = post_processing(exec_traces)
# dump the compilation database
with open(args.cdb, 'w+') as handle:
json.dump(list(entries), handle, sort_keys=True, indent=4)
return exit_code
def setup_environment(args, destination):
""" Sets up the environment for the build command.
It prepares the required environment variables for the given command.
The exec calls will be logged by the 'libear' preloaded library or by the
'wrapper' programs. """
c_compiler = args.cc if 'cc' in args else 'cc'
cxx_compiler = args.cxx if 'cxx' in args else 'c++'
libear_path = None if args.override_compiler or is_preload_disabled(
sys.platform) else build_libear(c_compiler, destination)
environment = dict(os.environ)
environment.update({'INTERCEPT_BUILD_TARGET_DIR': destination})
if not libear_path:
logging.debug('intercept gonna use compiler wrappers')
environment.update(wrapper_environment(args))
environment.update({
'CC': COMPILER_WRAPPER_CC,
'CXX': COMPILER_WRAPPER_CXX
})
elif sys.platform == 'darwin':
logging.debug('intercept gonna preload libear on OSX')
environment.update({
'DYLD_INSERT_LIBRARIES': libear_path,
'DYLD_FORCE_FLAT_NAMESPACE': '1'
})
else:
logging.debug('intercept gonna preload libear on UNIX')
environment.update({'LD_PRELOAD': libear_path})
return environment
@command_entry_point
def intercept_compiler_wrapper():
""" Entry point for `intercept-cc` and `intercept-c++`. """
return compiler_wrapper(intercept_compiler_wrapper_impl)
def intercept_compiler_wrapper_impl(_, execution):
""" Implement intercept compiler wrapper functionality.
It generates an execution report into the target directory.
The target directory name is read from an environment variable. """
message_prefix = 'execution report might be incomplete: %s'
target_dir = os.getenv('INTERCEPT_BUILD_TARGET_DIR')
if not target_dir:
logging.warning(message_prefix, 'missing target directory')
return
# write current execution info to the pid file
try:
target_file_name = str(os.getpid()) + TRACE_FILE_EXTENSION
target_file = os.path.join(target_dir, target_file_name)
logging.debug('writing execution report to: %s', target_file)
write_exec_trace(target_file, execution)
except IOError:
logging.warning(message_prefix, 'io problem')
def write_exec_trace(filename, entry):
""" Write execution report file.
This method shall be sync with the execution report writer in interception
library. The entry in the file is a JSON objects.
:param filename: path to the output execution trace file,
:param entry: the Execution object to append to that file. """
with open(filename, 'ab') as handler:
pid = str(entry.pid)
command = US.join(entry.cmd) + US
content = RS.join([pid, pid, 'wrapper', entry.cwd, command]) + GS
handler.write(content.encode('utf-8'))
def parse_exec_trace(filename):
""" Parse the file generated by the 'libear' preloaded library.
The given filename points to a file which contains the basic report
generated by the interception library or the wrapper command. A single
report file _might_ contain multiple process creation records. """
logging.debug('parse exec trace file: %s', filename)
with open(filename, 'r') as handler:
content = handler.read()
for group in filter(bool, content.split(GS)):
records = group.split(RS)
yield {
'pid': records[0],
'ppid': records[1],
'function': records[2],
'directory': records[3],
'command': records[4].split(US)[:-1]
}
def format_entry(exec_trace):
""" Generate the desired fields for compilation database entries. """
def abspath(cwd, name):
""" Create normalized absolute path from input filename. """
fullname = name if os.path.isabs(name) else os.path.join(cwd, name)
return os.path.normpath(fullname)
logging.debug('format this command: %s', exec_trace['command'])
compilation = split_command(exec_trace['command'])
if compilation:
for source in compilation.files:
compiler = 'c++' if compilation.compiler == 'c++' else 'cc'
command = [compiler, '-c'] + compilation.flags + [source]
logging.debug('formatted as: %s', command)
yield {
'directory': exec_trace['directory'],
'command': encode(command),
'file': abspath(exec_trace['directory'], source)
}
def is_preload_disabled(platform):
""" Library-based interposition will fail silently if SIP is enabled,
so this should be detected. You can detect whether SIP is enabled on
Darwin by checking whether (1) there is a binary called 'csrutil' in
the path and, if so, (2) whether the output of executing 'csrutil status'
contains 'System Integrity Protection status: enabled'.
:param platform: name of the platform (returned by sys.platform),
:return: True if library preload will fail by the dynamic linker. """
if platform in WRAPPER_ONLY_PLATFORMS:
return True
elif platform == 'darwin':
command = ['csrutil', 'status']
pattern = re.compile(r'System Integrity Protection status:\s+enabled')
try:
return any(pattern.match(line) for line in run_command(command))
except:
return False
else:
return False
def entry_hash(entry):
""" Implement unique hash method for compilation database entries. """
# For faster lookup in the set, the filename is reversed
filename = entry['file'][::-1]
# For faster lookup in the set, the directory is reversed
directory = entry['directory'][::-1]
# On OS X the 'cc' and 'c++' compilers are wrappers for
# 'clang', therefore both calls would be logged. To avoid
# this, the hash does not contain the first word of the
# command.
command = ' '.join(decode(entry['command'])[1:])
return '<>'.join([filename, directory, command])

File diff suppressed because it is too large


@@ -0,0 +1,62 @@
body { color:#000000; background-color:#ffffff }
body { font-family: Helvetica, sans-serif; font-size:9pt }
h1 { font-size: 14pt; }
h2 { font-size: 12pt; }
table { font-size:9pt }
table { border-spacing: 0px; border: 1px solid black }
th, table thead {
background-color:#eee; color:#666666;
font-weight: bold; cursor: default;
text-align:center;
font-weight: bold; font-family: Verdana;
white-space:nowrap;
}
.W { font-size:0px }
th, td { padding:5px; padding-left:8px; text-align:left }
td.SUMM_DESC { padding-left:12px }
td.DESC { white-space:pre }
td.Q { text-align:right }
td { text-align:left }
tbody.scrollContent { overflow:auto }
table.form_group {
background-color: #ccc;
border: 1px solid #333;
padding: 2px;
}
table.form_inner_group {
background-color: #ccc;
border: 1px solid #333;
padding: 0px;
}
table.form {
background-color: #999;
border: 1px solid #333;
padding: 2px;
}
td.form_label {
text-align: right;
vertical-align: top;
}
/* For one line entries */
td.form_clabel {
text-align: right;
vertical-align: middle;
}
td.form_value {
text-align: left;
vertical-align: top;
}
td.form_submit {
text-align: right;
vertical-align: top;
}
h1.SubmitFail {
color: #f00;
}
h1.SubmitOk {
}


@@ -0,0 +1,47 @@
function SetDisplay(RowClass, DisplayVal)
{
var Rows = document.getElementsByTagName("tr");
for ( var i = 0 ; i < Rows.length; ++i ) {
if (Rows[i].className == RowClass) {
Rows[i].style.display = DisplayVal;
}
}
}
function CopyCheckedStateToCheckButtons(SummaryCheckButton) {
var Inputs = document.getElementsByTagName("input");
for ( var i = 0 ; i < Inputs.length; ++i ) {
if (Inputs[i].type == "checkbox") {
if(Inputs[i] != SummaryCheckButton) {
Inputs[i].checked = SummaryCheckButton.checked;
Inputs[i].onclick();
}
}
}
}
function returnObjById( id ) {
if (document.getElementById)
var returnVar = document.getElementById(id);
else if (document.all)
var returnVar = document.all[id];
else if (document.layers)
var returnVar = document.layers[id];
return returnVar;
}
var NumUnchecked = 0;
function ToggleDisplay(CheckButton, ClassName) {
if (CheckButton.checked) {
SetDisplay(ClassName, "");
if (--NumUnchecked == 0) {
returnObjById("AllBugsCheck").checked = true;
}
}
else {
SetDisplay(ClassName, "none");
NumUnchecked++;
returnObjById("AllBugsCheck").checked = false;
}
}

Some files were not shown because too many files have changed in this diff