* MatchCompiler: Neaten error messages
The added space in particular makes the messages a little more readable.
* MatchCompiler: Add spaces between operators
* MatchCompiler: Don't bail out on non-const patterns
If matchcompiler found a call to Token::Match() or Token::simpleMatch()
with an unknown string argument, subsequent calls to Token::Match() or
Token::simpleMatch() on the same line would not be processed by
matchcompiler.
To fix this, keep track of the index of the last match we found, and
update it accordingly when the line is modified. To avoid having to
track whether "Match" or "simpleMatch" is the first match we find,
simply loop over both patterns.
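A minimal sketch of that scanning loop (the regex, the crude const-pattern
check and the example line are illustrative, not taken from matchcompiler.py):
```
import re

# Illustrative sketch: find every Token::Match()/Token::simpleMatch() call
# on one line, remember where the last call ended, and keep scanning even
# when a pattern argument is not a string literal.
CALL_RE = re.compile(r'Token::(?:simpleMatch|Match)\s*\(')

def find_compilable_calls(line):
    """Return start offsets of calls whose pattern looks like a string constant."""
    offsets = []
    pos = 0
    while True:
        m = CALL_RE.search(line, pos)
        if not m:
            return offsets
        # crude check: the second argument starts with a quote
        args = line[m.end():].split(',', 1)
        if len(args) == 2 and args[1].lstrip().startswith('"'):
            offsets.append(m.start())
        pos = m.end()  # continue after this call either way

print(find_compilable_calls(
    'if (Token::Match(tok, unknownPattern) && Token::simpleMatch(tok, "a b")) {}'))
```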
In tools/, the CMake build did not know that some files are generated
by matchcompiler.py.
```
CMake Error at tools/CMakeLists.txt:7 (add_executable):
Cannot find source file:
</some/path>/cppcheck/build/lib/build/mc_pathmatch.cpp
Tried extensions .c .C .c++ .cc .cpp .cxx .cu .m .M .mm .h .hh .h++ .hm
.hpp .hxx .in .txx
CMake Error at tools/CMakeLists.txt:7 (add_executable):
No SOURCES given to target: dmake
```
Co-authored-by: Ken-Patrick Lehrmann <kp.lehrmann@gmail.com>
* cleaned up compiler options related code in CMake
* moved cmake_minimum_required() and raised to latest 2.8.x version
* use proper compiler version check / print compiler version
* fixed linking of sanitized builds
* added proper version checks to newer Clang warnings and enabled them / moved tinyxml_objs flags to proper compiler
* disabled -Wdeprecated-declarations for Clang
* compileroptions.cmake: removed unnecessary check for clang++ existence - CMAKE_CXX_COMPILER_ID is determined by CMake
* printInfo.cmake: removed unnecessary message for ANALYZE_ADDRESS - LSAN is part of ASAN and enabled by default
* cleaned up if() comparisons in CMake
* added/adjusted TODOs
Sometimes a SIGSEGV is raised when Cppcheck is killed because of a
timeout. The execution is then wrongly handled as a crash and debugged
with gdb instead of being marked as timed out.
This fixes that issue by checking for a timeout before checking sig_num.
* daca2: Improve package sorting using natsort
This switches the external dependency from semver to natsort, and
improves the comparison when one or more of the packages do not use
semantic versioning (major.minor.patch).
This also makes daca2-download and daca2-getpackages work with python 3.
In theory, they should work with python 2 as well, but I have not tested
it.
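A small illustration of why natural sorting helps here (the file names are
made up):
```
from natsort import natsorted

# Made-up package file names: plain string sorting puts 1.10 before 1.2,
# natural sorting orders the numeric parts as numbers.
names = ['pkg_1.10.orig.tar.gz', 'pkg_1.2.orig.tar.gz', 'pkg_1.9.1.orig.tar.gz']
print(sorted(names))     # string sort: 1.10 before 1.2
print(natsorted(names))  # natural sort: 1.2, 1.9.1, 1.10
```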
* Make daca2 scripts executable
* Update hashbangs to python3
* Update usage description
To avoid specifying python version in the usage description, just
show how to execute the scripts and leave the rest to the shebangs.
* No need to specify python version in start_donate_cpu_server_test_local.sh
Leave it to the hashbang instead.
This adds a timeout of 60 minutes for the Cppcheck analysis.
Timed-out results do not count as crashes, but they are uploaded and
marked with "TO!" in the list of the latest results. No "diff" is
generated for timed-out results, so they do not add wrong entries to
the "Diff report".
In test-my-pr.py the timed-out results are listed separately, just like
the crashes.
donate-cpu-server.py: Add timeout report
Using .tar.xz packages adds about 4500 additional packages that can be
tested. It also changes many existing packages where a more recent
version, available only as a .tar.xz file, can now be used.
Related ticket: https://trac.cppcheck.net/ticket/9508
donate-cpu.py: Require at least Python 3.4
xz support was added in Python 3.3.
There were two issues:
1. The version was not correctly extracted from the filename. When
extracting a sub-string in Python one has to specify a start index and
an end index instead of a start index and a length.
2. The function `semver.cmp()` does nothing useful. Instead the function
`semver.compare()` must be called when two versions should be compared.
See https://github.com/python-semver/python-semver/issues/117#issuecomment-479188221
Because `semver.compare()` now really compares the versions, an
exception can be thrown if a version is not in the semver version
format. In such cases the sorting is aborted and the last filename in
the array is returned. From what I have seen, this is often, but not
always, already the latest version.
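Both pitfalls in a nutshell (the file name is made up):
```
import semver

# Issue 1: a Python slice takes a start and an end index, not a length.
filename = 'package-1.2.10.tar.gz'
start = len('package-')    # 8
length = len('1.2.10')     # 6
print(filename[start:length])           # wrong: slice [8:6] is empty
print(filename[start:start + length])   # right: '1.2.10'

# Issue 2: semver.compare() actually compares two versions (-1, 0 or 1);
# it raises ValueError if a version is not in semver format.
print(semver.compare('1.2.10', '1.2.9'))  # 1 -> the first version is newer
```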
* triage: Allow master as version
Log files from test-my-pr.py show "master" as the version. Extend the
regexp to match "master", and improve the regexp handling slightly to
avoid making assumptions about the length of the version.
* triage: Show log-files when opening files
test-my-pr.py defaults to saving output as "my_check_diff.log". Show
log files by default to make it more convenient to check these files as
well.
Previously, calling test-my-pr with a relative work-path resulted in a
crash when trying to create the result file (due to the change of
current working directory).
* donate_cpu_lib: Fix python 3 crash if fail to get package
Decoding a string is not allowed in python 3 (in python 2 it works).
If fetching the package fails, assign an empty byte string instead to
avoid crashing.
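Roughly the idea, with illustrative names rather than the exact
donate_cpu_lib code:
```
import subprocess

url = 'http://cppcheck.osuosl.org:8000/somepackage.tar.xz'  # placeholder URL
try:
    package = subprocess.check_output(['wget', '-q', '-O', '-', url])
except (subprocess.CalledProcessError, OSError):
    # empty byte string: an empty str would have no decode() under Python 3
    package = b''

text = package.decode('utf-8', errors='ignore')
```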
* Initialize package instead
Yesterday, I observed that some client with a wrong jobs setting
(only "-j") requested one package after another and always uploaded
results where it only said that the argument "-j" is invalid for
Cppcheck.
This check should avoid such cases where results are overwritten with
useless data and the server is kept busy for nothing.
* regex for version
* field names improved
* codestyle
* m prefix for fsmodel
* string duplication removed
* find in files: show all entries
* spaces
* added hint to checkboxes; element naming fixed
* layout naming improvement
* spacing 6->1
* openssl.cfg: Add OpenSSL library configuration with tests
Reference: https://www.openssl.org/docs/man1.1.1/man3/
* openssl.cfg: Add some configurations for EVP functions
Add alloc/dealloc configuration for EVP_CIPHER_CTX_new and
EVP_CIPHER_CTX_free.
Add configuration for encryption functions that are used in example code
which is added to the tests.
* donate-cpu-server.py: Use tools to prepare code to work with Python 3
The following commands were used for these changes:
futurize -1 -w donate-cpu-server.py
2to3 -w donate-cpu-server.py
* Make the server work under Python 3
Manually fixed the Unicode issues. Received data is decoded, sent data
is encoded.
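The gist of the Unicode handling, as an illustrative handler rather than the
actual server code:
```
import socket

def handle(connection):
    # socket.recv() returns bytes under Python 3: decode the received data ...
    data = connection.recv(1024).decode('utf-8', errors='ignore')
    reply = 'received {} characters\n'.format(len(data))
    # ... and encode the text to send back to bytes.
    connection.sendall(reply.encode('utf-8'))
```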
* Add backward compatible type hints (in comments)
This enables better static analysis and suggestions in an IDE.
* Fix Pylint warning "Comparison to literal"
* .travis.yml: Fix/enhance pylint verification and Python compilation
donate-cpu-server.py is only Python 3 compatible, so it must be ignored
for pylint verification under Python 2.
All Python scripts that were verified with pylint under Python 2 are
now also verified with pylint under Python 3.
* donate-cpu-server.py: Add shebang and mark script as executable
* start_donate_cpu_server_test_local.sh: Directly execute server
Since the server script is executable now and has a shebang it can
be directly executed.
* Use Python 3.0 function annotations instead of comment type hints
Reference: https://www.python.org/dev/peps/pep-3107/
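The two hint styles side by side, on an illustrative function rather than one
from the server script:
```
# Backward-compatible comment form (works when the file is parsed by
# Python 2, understood by type checkers):
def html_page_old(title, body):
    # type: (str, str) -> str
    return '<h1>' + title + '</h1>' + body

# PEP 3107 function annotations (Python 3 only):
def html_page_new(title: str, body: str) -> str:
    return '<h1>' + title + '</h1>' + body
```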
* libsigc++.cfg: Add configuration for library libsigc++
Reference: https://libsigcplusplus.github.io/libsigcplusplus/
* Make code compatible with libsigc++-2.0 instead of 3.0
Since version 3.0, C++14 is required, which is not (fully) supported by
some older GCC versions.
Fall back to "~/daca@home" if "/var/daca@home" does not exist.
Print the used work path when the script starts.
This way we do not have to change the server script before uploading
it to the server while being fully backwards compatible.
Check if "python" is available, if not check for "python3" and use
the available Python interpreter. If no Python interpreter is found,
"make" fails with an according error message.
This solves the issue that not all modern Linux distributions any longer
install Python 2 by default, so "python" is not available and
"make MATCHCOMPILER=yes" would fail. Instead of forcing the users to
install Python 2, Python 3 is used in such a case now if it is
available.
* reduce.py: Allow reducing error messages, print output in case of error
Allow reducing code that triggers (false positive) error messages.
Print Cppcheck output in case Cppcheck returns unsuccessfully and no
segfault is expected. This helps fix messed-up command lines (for
example issues with the path).
* Use "else" as suggested
* donate-cpu.py: Add internal timing information of Cppcheck to output
The option "--showtime=top5" is added to the Cppcheck command line.
The timing output is collected and, only for HEAD, shown in the new
category "head-timing-info" in the results output.
The timing output is indented with one space, so even in the
unlikely case that a function is named "head result:" or "diff:" it does
not break the parser in the server.
* donate-cpu.py: Also print the "old" timing information for comparison
Some projects only use this (older?) style of Qt header inclusion.
There are (older) books and examples which use this style, too.
It seems to be perfectly valid, so we should support it.
Previously, external files were not searched at all, and dependencies
on header files in cli were not taken into account for test files.
To add dependencies on headers in externals, we also need to search for
includes with angle brackets.
There is no point in checking which libraries to use for each cppcheck
version since there is no change. Refactor the checking to a separate
function and run that once instead. This halves the time it takes to
check for libraries.
I looked into many packages where the detection failed and they all use
`#include "ruby.h"`. Some of these packages seem to be Ruby modules,
others seem to be "normal" software.
This adds one line in the package report to show the git hash and commit
date. This makes it possible to see exactly which revision the result
was obtained with.
The cppcheck head info line is now shown as
head-info: 1a25d3f9e (2019-08-30 18:34:14 +0200)
If there are *.diff files with old version numbers, the server script
crashes because it always expects a key with the current OLD_VERSION.
This fix ignores entries in *.diff files that were not made against the
current OLD_VERSION.
Check if fetching and updating the cppcheck sources are successful. If
not successful after five retries, try removing the existing clone and
checkout again.
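A rough sketch of that retry logic (repository URL, branch and helper
structure are illustrative, not the actual script code):
```
import shutil
import subprocess

REPO = 'https://github.com/danmar/cppcheck.git'  # assumed upstream URL

def update_cppcheck(workdir):
    for _ in range(5):
        fetch_ok = subprocess.call(['git', 'fetch', '--all'], cwd=workdir) == 0
        checkout_ok = subprocess.call(['git', 'checkout', 'origin/master',
                                       '--force'], cwd=workdir) == 0
        if fetch_ok and checkout_ok:
            return
    # still failing: remove the clone and check out again from scratch
    shutil.rmtree(workdir, ignore_errors=True)
    subprocess.check_call(['git', 'clone', REPO, workdir])
```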
* dmake: Refactor object files to separate function
No functional change.
* dmake: Use range for loops
No functional change.
* Add all external cpp files instead of open coding
No functional change.
* Remove duplicate check.h in lib.pri HEADERS
* Add missing newline
No functional change, but the readability of the generated Makefile is
slightly improved.
Since the number of test files is larger than the number of lib files,
this only caused a harmless extra '\' to be printed after the last
header file in lib.pri. If the number of test files had been smaller
than the number of lib files, the generated lib.pri would have been
broken.
Sometimes there are no relevant source files (.c, .cpp, ...) extracted,
but other files are (.h, ...).
There could be only header files for example. Then Cppcheck returns with
exit code 1 and prints an error message. This is not a crash and is no
longer reported as such.
Use renamed pylintrc file that is only meant for Travis checks.
Check all Python scripts in 'addons', 'htmlreport' and 'tools'
Errors for `_socketobject` class are disabled, see
https://stackoverflow.com/questions/10300082/how-to-prevent-python-pylint-complaining-about-socket-class-sendall-method
Install imported modules `unittest2` and `pexpect` via pip.
Add "./addons" to search-path for modules because
"tools/compare-ast-clang-and-cppcheck.py" imports cppcheckdata.py from
addons. Pylint does not seem to evaluate
`sys.path.insert(0, '../addons')` in the script. So an `init-hook` is
necessary in pylintrc_travis.
* donate-cpu.py: treat signal 6 (SIGABRT) as crash as well so we get a stack trace in the result
* donate-cpu.py: simplified returncode/signal check / also generate stack traces for SIGILL, SIGFPE, SIGBUS
* donate-cpu.py: avoid usage of "not" in if
* donate-cpu.py: do not overwrite returncode in crash handling
Trac ticket: https://trac.cppcheck.net/ticket/9192
This commit also fixes that negative values of the elapsed time are
used for calculating total times. These crashes and errors are now
ignored in the time report since there is no useful timing information
in that case.
Tested with a local daca@home server with old and new results.
Sources were built with Clang but with increased verbosity of error detection.
A number of syntax and semantic warnings were encountered. Commit adds
changes to correct these warnings.
Some changes involve removing extra, unnecessary semicolons at EOL
(e.g. at the end of a switch clause).
Project astyle settings are not currently set up to detect whether a file
should have an extra carriage return after the last line of data. Two
files were altered to ensure an extra carriage return.
An advisory to enhance code was encountered in the triage code. A Clang
advisory on a for-loop iteration value suggested:
`use reference type 'const QString &' to prevent copying`
Building on #1874, commit adds user controls to choose
or edit style in cppcheck-gui ONLY. Commit does not
address CodeEditor style usage in triage app at this time.
Code Editor style can be altered from the added "Code Editor"
tab in the user preferences. The user has the option to select
default light, default dark, or to customize.
If the user leaves the style set to the light or dark defaults, this
will be reflected in the choices shown in the preferences
dialog.
User choice for Code Editor Style is saved in the cppcheck-gui
preferences under the heading "EditorStyle".
* build: remove -Wabi and add -Wundef
gcc >= 8 throws a warning about -Wabi (without a specific ABI version)
being ignored, while -Wundef seems more useful (as shown by the change
in config.h, which was probably an unfortunate typo)
.travis.yml should probably be updated soon, but was left out of this
change as the current images don't yet need it
* lib: unused function in valueflow
refactored out since 8c03be3212
lib/valueflow.cpp:3124:21: warning: unused function 'endTemplateArgument' [-Wunused-function]
* readme: include picojson
* make: also clean exe
Packages now can contain something like:
```
head results:
Checking temp/openvdb/Platform.cc: __GNUC__=1...
[New Thread 7892.0x91c]
```
"New Thread 7892.0x91c" was wrongly identified as messageId in the HEAD
report.
This commit adds code to skip lines that start with `[` or where the
messageId contains at least one space.
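A sketch of the filtering (illustrative helper, not the exact server code):
```
def extract_message_id(line):
    # gdb noise such as "[New Thread 7892.0x91c]" starts with '['
    if line.startswith('['):
        return ''
    if '[' not in line or not line.endswith(']'):
        return ''
    message_id = line[line.rfind('[') + 1:-1]
    # a real messageId never contains a space
    if ' ' in message_id:
        return ''
    return message_id

print(extract_message_id('[New Thread 7892.0x91c]'))                     # '' (skipped)
print(extract_message_id('a.c:1:2: error: Null pointer [nullPointer]'))  # 'nullPointer'
```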
* donate-cpu.py: made exitcodes > 0 negative so they will be detected as a crash / changed the ThreadExecutor error to -222
* donate-cpu.py: unconditionally upload results and info now that errors are properly handled - will also properly clear the result/info in case there are no more messages
* donate-cpu.py: bumped version
* donate-cpu.py: added stdout to output in case of exitcode != 0
* donate-cpu.py: do not scan packages with no relevant files
* donate-cpu.py: bumped version
If an upload fails, the reason (exception text) is now printed.
Fix: If the last retry failed, do not wait before continuing.
Remove some obsolete "fast" code in the uploadResults() function.
Tested with Python 2.7.16 and Python 3.6.8.
Since the directory for the results no longer exists on the server,
the server currently crashes every time older clients try to upload
experimental fast results via the "write-fast" command.
Now this command is just ignored so the server is instantly ready
again after a "write-fast" command.
* donate-cpu.server.py: increased "Package" column width for latest report and small cleanup
* donate-cpu.server.py: added date and time to crash report
* donate-cpu.server.py: simplified strDateTime()
* donate-cpu.server.py: add stale report to show results which are older than 30 days
* donate-cpu-server.py: added version and some logging
* threadexecutor.cpp: streamlined error messages
* donate-cpu.py: detect additional signals and exitcode != 0 as crash as well and (ab)use elapsedTime to make the errorcode visible in the output / also detect ThreadExecutor issues
* donate-cpu.py: bumped version
* donate-cpu.py: fixed detection of ThreadExecutor errors
* Get stack traces for daca@home crashes
If a command in daca@home crashes, execute it again within gdb to get a stack trace.
* donate-cpu.py: added "gdb" to checkRequirements()
* donate-cpu.py: handle wget failures
* donate-cpu.py: added --no-upload option to disable all uploads
* donate-cpu.py: set max_packages to 1 if --package is provided to avoid endless processing of the same package
* donate-cpu.py: no longer treat missing sources as a crash
* donate-cpu.py: fixed wget "http://: Invalid host name." error caused by empty argument in subprocess.call()
* donate-cpu.py: added --no-upload to --help
* donate-cpu.py: detect crashes when using -j1
* donate-cpu.py: added -g to compiler flags
* donate-cpu.py: fixed gdb call and stacktrace printing / always pass "-j1" to gdb call so the exception will actually occur in the application
* donate-cpu.py: removed left-over --verbose from wget call
* donate-cpu.py: removed unnecessary break
* donate-cpu.py: only use gdb for crash in head run / actually provide the stack trace for the output
* donate-cpu.py: include the last checked file with the stack trace
* donate-cpu.py: removed unnecessary wget() call and a sleep in it / also inverted some logic
* donate-cpu.py: small hasInclude() optimization
* donate-cpu.py: bumped version number
* donate-cpu.py: detect start of gdb output when Cygwin is used
The Cygwin output looks like this:
Thread 1 "cppcheck" received signal SIGSEGV, Segmentation fault.
Co-Authored-By: firewave <firewave@users.noreply.github.com>
The official documentation recommends including the Python C API via
`#include "Python.h"`:
https://docs.python.org/3/c-api/intro.html
Many projects do it exactly this way, which is why the client script
often does not detect the usage of the Python C API.
The client script will exit after the specified number of packages
have been processed. A value of 0 means no limit.
Useful for example to regularly quit the script, check for updates to
the client and start it again. Or as an alternative to the `--stop-time`
argument.
The function `iteritems()` of `dict`s is deprecated. The recommended
alternative is to use `items()`, this function also works with Python 2.
The next issue is that lambdas can no longer unpack tuple parameters
in Python 3. It would be possible to use some workaround and still use
a lambda, but using `operator.itemgetter(1)` instead is faster and the
recommended method in such a case.
The syntax is now compatible with Python 2 and 3 but the server script
still does not work with Python 3. For example `socket.recv()` returns
`bytes` in Python 3 and `str` in Python 2. Currently `str` is expected
so it does not work with Python 3.
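The items()/itemgetter() change described above, shown with made-up data:
```
import operator

counts = {'nullPointer': 12, 'uninitvar': 30, 'memleak': 7}

# items() instead of iteritems(), and operator.itemgetter(1) instead of a
# lambda that unpacks the (key, value) tuple, which Python 3 forbids.
for message_id, count in sorted(counts.items(),
                                key=operator.itemgetter(1), reverse=True):
    print(message_id, count)
```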
In my tests there were about 1500 additional packages
available as bz2 on the server.
For some packages a newer version is now used if it is
only available as .tar.bz2 archive.
The donate-cpu.py client is tested to work with .tar.bz2
files under Python 2.7.15 and 3.6.8.
Python 3 directly decodes the text when it is read(). If there is any
invalid UTF-8 character in the text an exception is thrown (IIRC it is
UnicodeDecodeError). Opening the file with `errors='ignore'` avoids
throwing an exception and just ignores the invalid character. Since
this parameter is only available since Python 3, there must be extra
code for older versions.
The test script has been enhanced. It now also uses a package which
contains a file with at least one invalid UTF-8 character.
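A minimal sketch of the version-dependent reading described above (the file
name is illustrative):
```
import sys

filename = 'results.txt'  # placeholder
if sys.version_info[0] >= 3:
    # Python 3: decode while reading and silently drop invalid UTF-8 bytes
    with open(filename, 'rt', errors='ignore') as f:
        text = f.read()
else:
    # Python 2: no errors parameter needed, read() returns raw data
    with open(filename, 'rt') as f:
        text = f.read()
```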
Now also found:
- Includes directly at the beginning of a file
- Indented includes
- Includes where there is no white-space between
"include" and header name
The function libcerror_error_set() is currently the function for which
daca@home most often reports a missing configuration (more than 80000
times).
Official repository of libcerror: https://github.com/libyal/libcerror
The library configuration has been tested with the library libvhdi:
ftp://ftp.se.debian.org/debian/pool/main/libv/libvhdi/libvhdi_20181227.orig.tar.gz
This detects more includes / headers. For example includes like
"# include <gtk/gtk.h>" with a space before "include" as it is used in
the package http://cppcheck.osuosl.org:8000/gbatnav are now also
detected.
The regex search also searches all includes for one library in one go
instead of one include per loop.
Tested with several packages to make sure libraries that were detected
before are still detected.
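An illustration of the more tolerant matching (the regex is a sketch, not
copied from donate-cpu.py):
```
import re

# Optional indentation and spaces around '#' and 'include' are accepted,
# and all headers of one library are found in a single search over the
# whole file content instead of one include per loop iteration.
GTK_INCLUDE_RE = re.compile(r'^\s*#\s*include\s*[<"](gtk/gtk\.h|glib\.h)[>"]',
                            re.MULTILINE)

source = '# include <gtk/gtk.h>\n   #include "glib.h"\n'
print(GTK_INCLUDE_RE.findall(source))  # ['gtk/gtk.h', 'glib.h']
```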
If "head" and "OLD_VERSION" both crash there are no messages and the
variable `results_exist" is set to False. But still the results must be uploaded
for the crashes to be saved also.
Tested with the package http://cppcheck.osuosl.org:8000/double-conversion
Ignore normal results (not fast or info) where the diff was made against the wrong OLD_VERSION. This avoids unwanted results when, for example, some client is still analyzing an old package while the OLD_VERSION in the server script has already changed.
Results missing the Cppcheck version info completely are also ignored.
Tested locally with correct and wrong version numbers.
Only enable the library option for those configurations if the
corresponding .cfg file exists, so that Cppcheck does not crash if a
version older than 1.87 is used as the "old" version.
Two logging handlers are added. One just prints all output with at least INFO severity to the console. The other only prints ERROR severity and above to a rotating file. The file size is limited to 100 kB. Since one backup file is used, this results in a maximum of 200 kB of disk usage.
The log file is saved in the directory where the server script is.
Hopefully this way some issues can be found more easily.
Tested locally.
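Roughly the described setup (the log file name is illustrative):
```
import logging
import logging.handlers

logger = logging.getLogger()
logger.setLevel(logging.INFO)

# everything with at least INFO severity goes to the console
console = logging.StreamHandler()
console.setLevel(logging.INFO)
logger.addHandler(console)

# ERROR and above goes to a rotating file: 100 kB per file, one backup,
# so at most about 200 kB of disk usage
logfile = logging.handlers.RotatingFileHandler('donate-cpu-server.log',
                                               maxBytes=100 * 1024,
                                               backupCount=1)
logfile.setLevel(logging.ERROR)
logger.addHandler(logfile)
```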