Build capture utility for Linux

Overview

CX-BUILD

Compilation Database alternative

Build

Prerequisite

CXBUILD relies on a customized version of strace, the Linux system-call trace utility. To use cxbuild, you must first build the custom strace, which prints full paths so that build commands can be recognized: https://github.com/damho1104/strace.git

mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX=/usr/local ..
make -j8

After building 'cstrace', copy the cstrace binary into the '/usr/bin' or '/bin' directory (or add its location to your $PATH environment variable) so that cxbuild can invoke it freely.
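
As a quick sanity check, the following Python snippet confirms that the binary can be found on $PATH (a minimal sketch; it assumes the binary is named cstrace, as in the examples below):

import shutil

# cxbuild invokes the custom strace by name, so it must be resolvable from $PATH.
path = shutil.which("cstrace")
print(path if path else "cstrace not found on $PATH")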

Export a single portable binary

Managing Python and its module requirements can sometimes be annoying. PyInstaller can bundle everything into a single binary, which is helpful in many cases.

This requires Python 3.x or later (we tested 3.6.9). When you are ready to build the cxbuild binary, run the following from the source root directory:

python3 -mvenv python
source python/bin/activate
pip install wheel
pip install -r requirements.txt

pip install pyinstaller
pyinstaller build.spec
chmod +x dist/cxbuild
cp dist/cxbuild /usr/local/bin

Or just run cxbuild directly with the Python interpreter:

python3 -m venv env
source env/bin/activate
pip install -r requirements.txt
python cxbuild.py capture make -j 8

Usage

Capture command

When you are ready to build your repository, use the 'capture' command to generate a compilation database:

cxbuild capture [build command here]

If it succeeds, "[pwd]/.xdb/compile_commands.json" will be generated (along with some intermediate files).
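
To verify the result, you can load the database with a few lines of Python (a minimal sketch; it only assumes the output path described above):

import json

# Load the generated compilation database and show a quick summary.
with open(".xdb/compile_commands.json") as f:
    entries = json.load(f)

print(f"{len(entries)} compile commands captured")
if entries:
    print(entries[0]["file"], "->", entries[0]["command"])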

Captured command

You can run the post-processing step on its own after a build capture has completed; this is only needed when debugging something.

The 'captured' command assumes that the trace file (full_trace.log) exists in the .xdb directory. It runs only the post-process phase and generates the compile_commands.json file from full_trace.log.

cxbuild captured
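
For example, a small wrapper can check that the trace file is present before re-running the post-process phase (a sketch; the file name follows the description above):

import pathlib
import subprocess

# Only re-run the post-process phase if a previous capture left a trace file behind.
trace = pathlib.Path(".xdb/full_trace.log")
if trace.exists():
    subprocess.run(["cxbuild", "captured"], check=True)
else:
    print("no trace file found; run 'cxbuild capture <build command>' first")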

Try it

This assumes that 'cstrace' and 'cxbuild' are available on $PATH:

/bin/cstrace
/bin/cxbuild

gzip

Prepare the source to build (on Ubuntu, the apt source command fetches the source files):

$ apt source gzip
$ ls -al
total 1084
drwxr-xr-x  3 minhyuk minhyuk    4096 Nov  1 10:51 .
drwxr-xr-x 12 minhyuk minhyuk    4096 Nov  1 10:51 ..
drwxr-xr-x 11 minhyuk minhyuk    4096 Nov  1 10:52 gzip-1.6
-rw-r--r--  1 minhyuk minhyuk   15604 Jul  8 01:53 gzip_1.6-5ubuntu1.1.debian.tar.xz
-rw-r--r--  1 minhyuk minhyuk    2060 Jul  8 01:53 gzip_1.6-5ubuntu1.1.dsc
-rw-r--r--  1 minhyuk minhyuk 1074924 Aug 21  2013 gzip_1.6.orig.tar.gz

Then configure the build:

$ cd gzip-1.6
$ ./configure

To run the actual build with cxbuild, just type:

$ cxbuild capture make -j 8
> eXtension of Compilation Database
-- [Build]
make -j 8
cstrace -s 65535 -e trace=open,openat,execve -v -y -f -o "/home/minhyuk/cx/tests/gzip-1.6/.xdb/full_cstrace.log" -- make -j 8
  GEN      version.c
  GEN      version.h
make  all-recursive
make[1]: Entering directory '/home/minhyuk/cx/tests/gzip-1.6'
Making all in lib
make[2]: Entering directory '/home/minhyuk/cx/tests/gzip-1.6/lib'
  GEN      alloca.h
  GEN      configmake.h
  GEN      c++defs.h
  GEN      arg-nonnull.h
  GEN      warn-on-use.h
  GEN      unused-parameter.h
<...skipped...>
Making all in tests
make[2]: Entering directory '/home/minhyuk/cx/tests/gzip-1.6/tests'
make[2]: Nothing to be done for 'all'.
make[2]: Leaving directory '/home/minhyuk/cx/tests/gzip-1.6/tests'
make[1]: Leaving directory '/home/minhyuk/cx/tests/gzip-1.6'
-- [Analyzing build Activities]
/home/minhyuk/cx/tests/gzip-1.6/.xdb/xtrace_tree.json
> Finished(00:00:09)

Check compile_commands.json:

$ cat .xdb/compile_commands.json
[
   {
       "directory": "/home/minhyuk/cx/tests/gzip-1.6/lib",
       "command": "gcc -D HAVE_CONFIG_H -I /home/minhyuk/cx/tests/gzip-1.6/lib -g -O2 -MT /home/minhyuk/cx/tests/gzip-1.6/lib/c-strcasecmp.o -MD -MP -MF .deps/c-strcasecmp.Tpo -c -o /home/minhyuk/cx/tests/gzip-1.6/lib/c-strcasecmp.o /home/minhyuk/cx/tests/gzip-1.6/lib/c-strcasecmp.c",
       "file": "c-strcasecmp.c"
   },
   {
       "directory": "/home/minhyuk/cx/tests/gzip-1.6/lib",
       "command": "gcc -D HAVE_CONFIG_H -I /home/minhyuk/cx/tests/gzip-1.6/lib -g -O2 -MT /home/minhyuk/cx/tests/gzip-1.6/lib/c-ctype.o -MD -MP -MF .deps/c-ctype.Tpo -c -o /home/minhyuk/cx/tests/gzip-1.6/lib/c-ctype.o /home/minhyuk/cx/tests/gzip-1.6/lib/c-ctype.c",
       "file": "c-ctype.c"
   },
   {
       "directory": "/home/minhyuk/cx/tests/gzip-1.6/lib",
       "command": "gcc -D HAVE_CONFIG_H -I /home/minhyuk/cx/tests/gzip-1.6/lib -g -O2 -MT /home/minhyuk/cx/tests/gzip-1.6/lib/cloexec.o -MD -MP -MF .deps/cloexec.Tpo -c -o /home/minhyuk/cx/tests/gzip-1.6/lib/cloexec.o /home/minhyuk/cx/tests/gzip-1.6/lib/cloexec.c",
       "file": "cloexec.c"
   },
   {
       "directory": "/home/minhyuk/cx/tests/gzip-1.6/lib",
       "command": "gcc -D HAVE_CONFIG_H -I /home/minhyuk/cx/tests/gzip-1.6/lib -g -O2 -MT /home/minhyuk/cx/tests/gzip-1.6/lib/c-strncasecmp.o -MD -MP -MF .deps/c-strncasecmp.Tpo -c -o /home/minhyuk/cx/tests/gzip-1.6/lib/c-strncasecmp.o /home/minhyuk/cx/tests/gzip-1.6/lib/c-strncasecmp.c",
       "file": "c-strncasecmp.c"
   },
   <...skipped...>
]
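
Note that the "file" fields above are relative, while "directory" is absolute; the JSON Compilation Database format allows this. A short script can resolve each entry to an absolute source path (a sketch based only on the fields shown above):

import json
import os

with open(".xdb/compile_commands.json") as f:
    entries = json.load(f)

# "file" may be relative to "directory", so join and normalize them.
for entry in entries:
    source = os.path.normpath(os.path.join(entry["directory"], entry["file"]))
    print(source)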

Comments
  • __test_dummy_source__.o is generated at cwd

    When you capture a program, __test_dummy_source__.o is generated in the current working directory.

    $ cxbuild capture gcc -o test test.c
    
    $ ls
    __test_dummy_source__.o test test.c
    
    

    This is not intended behavior. This should be fixed.

    bug 
    opened by HansolChoe 0
  • bug: compile_commands.json may have duplicate data

    compile_commands.json may contain duplicate data (not entirely duplicated, since the commands differ slightly, but the compilation target file is the same), like below:

    [
        {
            "directory": "/home/workspace/src",
            "command": "gcc -D HAVE_CONFIG_H -I /home/workspace/src -g -O2 -std=gnu99 -Wall -Wextra -Wdeclaration-after-statement -Wpointer-arith -funsigned-char -D _FORTIFY_SOURCE=2 -Wcast-align -Wcast-qual -Wshadow -Wbad-function-cast -Wwrite-strings -Wundef -Wuninitialized -Winit-self -Wnested-externs -Wstrict-prototypes -Wmissing-prototypes -Wmissing-declarations -Waggregate-return -pipe -MT /home/workspace/src/g72x.lo -MD -MP -MF .deps/g72x.Tpo -c -fPIC -D PIC -o /home/workspace/src/.libs/g72x.o -I /usr/lib/gcc/x86_64-linux-gnu/7/include -I /usr/local/include -I /usr/lib/gcc/x86_64-linux-gnu/7/include-fixed -I /usr/include/x86_64-linux-gnu -I /usr/include /home/workspace/src/g72x.c",
            "file": "/home/workspace/src/g72x.c"
        },
       ...
        {
            "directory": "/home/workspace/src",
            "command": "gcc -D HAVE_CONFIG_H -I /home/workspace/src -g -O2 -std=gnu99 -Wall -Wextra -Wdeclaration-after-statement -Wpointer-arith -funsigned-char -D _FORTIFY_SOURCE=2 -Wcast-align -Wcast-qual -Wshadow -Wbad-function-cast -Wwrite-strings -Wundef -Wuninitialized -Winit-self -Wnested-externs -Wstrict-prototypes -Wmissing-prototypes -Wmissing-declarations -Waggregate-return -pipe -MT /home/workspace/src/g72x.lo -MD -MP -MF .deps/g72x.Tpo -c -o /home/workspace/src/g72x.o -I /usr/lib/gcc/x86_64-linux-gnu/7/include -I /usr/local/include -I /usr/lib/gcc/x86_64-linux-gnu/7/include-fixed -I /usr/include/x86_64-linux-gnu -I /usr/include /home/workspace/src/g72x.c",
            "file": "/home/workspace/src/g72x.c"
        },
      ...
    ]
    

    The second command does not include -fPIC -D PIC, but the other data are the same. This is confusing.

    This symptom can be reproduced by running the cxbuild captured command at the root of the attached archive (fixed.1.zip). It is normal that .xdb/artifacts.zip is empty.
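
    A quick way to spot such near-duplicates is to group entries by directory and file and flag any group with more than one command (a sketch that uses only the fields shown above):

    import json
    from collections import defaultdict

    # Group compilation database entries by their target source file.
    with open("compile_commands.json") as f:
        entries = json.load(f)

    groups = defaultdict(list)
    for entry in entries:
        groups[(entry["directory"], entry["file"])].append(entry["command"])

    for (directory, source), commands in groups.items():
        if len(commands) > 1:
            print(f"{source} in {directory}: {len(commands)} different commands")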

    bug 
    opened by HansolChoe 0
  • Some paths are not captured properly (not absolute paths)

    In some cases, paths in projects.json are not recorded as absolute paths.

    If you capture the build information of gzip-1.11, you may see logs like below:

    $ ./configure && cxbuild capture make -j10
    ...
    > eXtension of Compilation Database
    -- [Analyzing build Activities]
    -- [Build trace preprocessed]
    -- [project.json written [/home/vagrant/workspace/gzip-1.11/.xdb/project.json]]
    -- [compile_commands.json written [/home/vagrant/workspace/gzip-1.11/.xdb/compile_commands.json]]
      > All dependency should start with '//'. Ignore 'threadlib.c'.
      > All dependency should start with '//'. Ignore 'lock.c'.
    -- [artifacts.zip written [/home/vagrant/workspace/gzip-1.11/.xdb/artifacts.zip]]
    

    threadlib.c and lock.c are not absolute paths. In this case, cxbuild is not killed, but it prints warning messages and ignores the two files.

    These files may be okay to ignore, but we need to know when these things happen.

    bug 
    opened by HansolChoe 1
Releases(dpp)
  • dpp(Nov 3, 2022)

    Commits

    • clang-11 no such compiler error (vulcan-action)
    • artifact.zip except for library(.so etc) file (vulcan-action)
    • 6c466f1: Fix get_system_include_list and get_predefined_macro (HansolChoe)
    • 48bf810: Fix _ArtifactBuilder to use the function in cslib (HansolChoe)
    • f7ef04f: Fix _gnu_compiler_tool's compiler setting and pytest (HansolChoe)
    • b975946: Remove is_debug_mode call csutil (HansolChoe)
    • 1e0b592: Fixed _ArtifactBuilder to collect source or header file (Yeongcheol Kim)
    • 51a75a5: Revised to not remove artifacts directory (Yeongcheol Kim)
    • 1eb568b: Revised to check CXBUILD_OUTPUT_DIR environment variable and set as working_dir (Yeongcheol Kim)
    • afd9e0b: Add a pattern to recognize cc as a compiler (HansolChoe)
    • 72f8b8e: Add cc so that it is recognized as a command (HansolChoe)
    Source code(tar.gz)
    Source code(zip)
    Release-cxbuild-Ubuntu-18.04.tar.gz(10.29 MB)
Owner
GLaDOS (G? L? Automatic Debug Operation System)