eddyxu / cpp-coveralls
Upload gcov results to coveralls.io
License: Apache License 2.0
For my fix for issue #77 I forgot to update the README.
I would like to exclude Qt moc_* files from the coverage report and have tried many ways to do it, without success:
This is particularly useful in environments such as travis-ci as you can trivially set secure env variables
coveralls --gcov-options -q is interpreted as passing -q to coveralls, while coveralls --gcov-options q passes the argument q to gcov.
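This is standard argparse behaviour: an option value that begins with a dash is read as a new flag rather than as the value. A minimal reproduction (this is not cpp-coveralls' actual parser, just an illustration of the quirk):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--gcov-options', default='')

# A bare "-q" after --gcov-options looks like a new flag to argparse,
# so parsing fails with "expected one argument".
try:
    parser.parse_args(['--gcov-options', '-q'])
except SystemExit:
    print('bare -q was rejected')

# Protecting the dash (e.g. the backslash-quoting convention used
# elsewhere on this page, as in --gcov-options '\-l') lets it through.
args = parser.parse_args(['--gcov-options', r'\-q'])
print(args.gcov_options)
```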
Nice script! 👍
When setting up a new project, it would be really useful if I could see which files are being included/excluded without having to upload to coveralls.io every time. Currently I'm doing something like:
coveralls [various options] --verbose | tr ',' '\n' | grep "'name':" | sort
This outputs a list of all filenames being included, but some sort of include/exclude indication in the non-verbose output seems more appropriate.
The approach above also still results in a (failed) attempt to upload to coveralls.io each time, which takes a few seconds to time out. It would be great if I could specify some sort of --dryrun option which allows me to do the full run right up to, but not including, the actual upload.
Thanks! :)
I've tried to add coveralls to the nijel/enca repo, but cpp-coveralls fails to find some files:
https://travis-ci.org/nijel/enca/jobs/10533670#L939
Not sure if the build system is that special there, but it looks for lib/data/chinese/zh_weight_big5.h instead of data/chinese/zh_weight_big5.h.
When you have a project written in several languages, the only way to send a full coverage report is to chain several tools.
For example, you can see my test project. In this project I generate a JSON file with coveralls --dump, then use my luacov-coveralls module to append to this file, and then send the report using curl.
Maybe you could add the ability to append to existing JSON files, since C/C++ is more popular than Lua.
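For what it's worth, merging two coveralls payloads is mostly a matter of concatenating their source_files arrays. A rough sketch (field names taken from the coveralls API; error handling and conflict resolution omitted):

```python
import json

def merge_reports(base, extra):
    # Both arguments are dicts shaped like a coveralls JSON payload;
    # the merged report keeps base's metadata (token, service info).
    merged = dict(base)
    merged['source_files'] = (base.get('source_files', []) +
                              extra.get('source_files', []))
    return merged

cpp_report = {'repo_token': 'xyz',
              'source_files': [{'name': 'a.c', 'coverage': [1, None, 0]}]}
lua_report = {'source_files': [{'name': 'b.lua', 'coverage': [2]}]}

print(json.dumps(merge_reports(cpp_report, lua_report), indent=2))
```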
I noticed that the cpp-coveralls package on PyPI doesn't contain the software.
$ pip install cpp-coveralls
Downloading/unpacking cpp-coveralls
Could not find any downloads that satisfy the requirement cpp-coveralls
No distributions at all found for cpp-coveralls
Storing complete log in /Users/myint/.pip/pip.log
You can fix this by doing something like:
python setup.py register sdist upload
I'm guessing that you missed the upload part.
It seems that the exclusion markers (LCOV_EXCL_LINE, ...) are broken. In a project of mine, each occurrence is ignored, see:
https://coveralls.io/builds/5563158/source?filename=src%2Flib%2Fpostgres%2Ftransaction.cpp
The utility assumes that the .c files and the git repository are in the same tree as the .gcno files (-r option).
With cmake, it is quite easy (and common) to build utilities in a separate "build" directory.
I think a --source-root option would make sense.
Getting the same issue as #24; I can't seem to find out why. Running coveralls on my local machine works fine, but it fails on travis-ci:
Traceback (most recent call last):
  File "/usr/local/bin/coveralls", line 9, in <module>
    load_entry_point('cpp-coveralls==0.1.4', 'console_scripts', 'coveralls')()
  File "/usr/local/lib/python2.7/dist-packages/coveralls/__init__.py", line 68, in run
    cov_report = coverage.collect(args)
  File "/usr/local/lib/python2.7/dist-packages/coveralls/coverage.py", line 257, in collect
    for line in fobj:
  File "/usr/lib/python2.7/codecs.py", line 296, in decode
    (result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf8' codec can't decode byte 0xe8 in position 1276: invalid continuation byte
Hi all,
I am just experimenting with cpp-coveralls on a Fortran project.
On my local Linux box it works great, but when it is used on travis-ci, no data are uploaded to the coveralls.io server.
Project details:
home: https://github.com/szaghi/BeFoR64
travis.yml: https://github.com/szaghi/BeFoR64/blob/master/.travis.yml
travis log: https://travis-ci.org/szaghi/BeFoR64/builds/52415236
coveralls log: https://coveralls.io/r/szaghi/BeFoR64
As you can see, some previous uploads to coveralls were done correctly (e.g. the one with 89% coverage), but those were done directly from my box, with the token passed explicitly with -t.
I suppose that the problem is related to the use of the -b switch. Can anyone help me fix this problem?
Thank you very much for your great cpp-coveralls.
Since version 0.3.3, installing cpp-coveralls on Travis-CI doesn’t work anymore. It used to work fine with previous versions.
It looks like a packaging problem to me but I’m not familiar enough with python package management to fully understand what’s happening.
Traceback (most recent call last):
File "/usr/local/bin/coveralls", line 9, in <module>
load_entry_point('cpp-coveralls==0.1.4', 'console_scripts', 'coveralls')()
File "/usr/local/lib/python2.7/dist-packages/coveralls/__init__.py", line 68, in run
cov_report = coverage.collect(args)
File "/usr/local/lib/python2.7/dist-packages/coveralls/coverage.py", line 257, in collect
for line in fobj:
File "/usr/lib/python2.7/codecs.py", line 296, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf8' codec can't decode byte 0xe9 in position 89: invalid continuation byte
Any idea why that could be?
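The failure happens while iterating over a .gcov file that is not valid UTF-8 (for example latin-1 bytes in source comments). One possible class of workaround at the reading site, sketched here rather than taken from the project's actual fix, is to decode leniently:

```python
import io
import tempfile

# 0xe9 ('é' in latin-1) is an invalid continuation byte in UTF-8 --
# the same kind of byte mentioned in the traceback above.
with tempfile.NamedTemporaryFile(suffix='.gcov', delete=False) as f:
    f.write(b'        1:   12:  int caf\xe9 = 0;\n')
    path = f.name

# errors='replace' substitutes U+FFFD instead of raising, so the
# collect loop can keep going past bad bytes.
with io.open(path, encoding='utf-8', errors='replace') as fobj:
    for line in fobj:
        print(line.rstrip())
```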
When using the build root option I have the following error message:
gcov: for the -object-directory option: may only occur zero or one times!
I investigated the issue and, after adding several log statements to coveralls.py, I realized that the --object-directory option was added several times:
https://github.com/eddyxu/cpp-coveralls/blob/master/cpp_coveralls/coverage.py#L150-L151
The coverage is OK for the first file parsed, but after that it drops to 0% since gcov fails...
Are you sure about your implementation of the build root option?
I'll send you a PR.
Hello,
For an autotools-based project with a non-recursive Makefile.am (i.e. no SUBDIRS=) and AUTOMAKE_OPTIONS = subdir-objects, cpp-coveralls does not work.
Basically the structure of the project is like:
src/lib/foo.c
src/lib/.libs/src_lib_foo.o
src/lib/.libs/src_lib_foo.gcno
After running a test application, there is also:
src/lib/.libs/src_lib_foo.gcda
In order to make it work with cpp-coveralls I had to modify the code to call gcov this way:
subprocess.call('%s -o %s %s.o' % (args.gcov, root, basename), shell=True)
That is, do not cd into the folder where the file resides, but use the gcov parameters for it, as gcov will then find the objects/sources correctly.
Then, when looking at the source files in the collect function, the .gcov file has the source referenced relative to the top source dir, given that it is a non-recursive make:
0:Source:src/lib/foo.c
But in the collect function the base dir is always prepended, so the source file ends up being src/lib/src/lib/foo.c, which of course does not exist.
I understand there are several ways to build a gcov-based project, so maybe instead of handling them all, let the project itself generate the gcov files, and allow cpp-coveralls to just merge the results and send them as JSON, via a command-line option for example?
Hi
I followed your instructions to run code coverage on GitHub, but it doesn't work.
https://travis-ci.org/crondaemon/dines/builds/33904246
I have made many attempts, but no luck. If I run cpp-coveralls from my local copy, it works fine.
Any idea?
I'm trying to use your great tool in a cmake-based C++ project and I have a couple of issues. After several attempts, I think I've identified one: does coveralls scan subdirectories recursively for gcno/gcda files?
I don't think so. Consider the following files:
rootfolder/
prog1.cpp
hop.h
and a subset of the .travis.yml
script:
- mkdir aa && cd aa && g++ -coverage -o prog2 ../prog1.cpp && ./prog2 && cd ..
- ls -l && ls -l aa/
after_success:
- coveralls --verbose
In the aa/ folder, prog1.gcda and prog1.gcno exist, but running coveralls in the root folder does not send anything.
(If I compile/run the prog in the same folder as its sources, it's all fine.)
If you want to have a look, my toy project can be found here: https://github.com/dcoeurjo/testcov
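Recursive discovery itself is cheap; a sketch of what scanning for coverage files could look like (an illustration only, not coveralls' current behaviour):

```python
import os
import tempfile

def find_coverage_files(root):
    # Walk the whole tree and pick up gcov notes (.gcno) and data
    # (.gcda) files wherever the build placed them.
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.endswith(('.gcno', '.gcda')):
                hits.append(os.path.join(dirpath, name))
    return sorted(hits)

# Tiny demo mirroring the layout above: coverage files live in aa/.
root = tempfile.mkdtemp()
os.mkdir(os.path.join(root, 'aa'))
for name in ('aa/prog1.gcno', 'aa/prog1.gcda', 'prog1.cpp'):
    open(os.path.join(root, name), 'w').close()

print([os.path.relpath(p, root) for p in find_coverage_files(root)])
```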
Hi, I started using this great project some weeks ago. It's a great project, very easy to use!
Since last Sunday (31/5), coveralls stopped showing me the latest code coverage updates. All uploads are marked as if there are no valid files (build 77, build 78, build 79, build 80, build 81, build 82, build 83). I haven't changed the cpp-coveralls command-line configuration, so I really don't know if it's coveralls' fault, mine, or something else.
I'm uploading data using this .travis.yml; basically the line executed is this:
coveralls -i src -i test -e src/coverage.cpp -E '.*\/mongoose\/mongoose\.[hc]' -E '.*CMakeFiles/feature_tests\..*' -E '.*CMakeCXXCompilerId\.cpp' -E '.*CMakeCCompilerId\.c' -E '.*\/test\/.*\.(h|hpp|cpp)$' -r ./
The project is using CMake; the main program is built under the folder build, and the test binaries are built in the folder test. All the binaries are compiled with gcov flags and executed.
The latest build is this one, and the last line of the output of cpp-coveralls is this:
{u'url': u'https://coveralls.io/jobs/6323819', u'message': u'Job #82.1'}
If I execute the command on my machine and dump the cpp-coveralls payload to a JSON file, I get this output. And the last line of cpp-coveralls is this one:
{u'url': u'https://coveralls.io/jobs/6323949', u'message': u'Job #83.1'}
As far as I understand, all the files are in that JSON. If I upload the result locally, I get the same problem with no valid files.
The coveralls page of the project is: https://coveralls.io/r/NickCis/7552-taller-prog-2-2015-C1
Thanks in advance!
You should add tags for your different versions (e.g. v0.1.2). This makes the life of Linux packagers easier :-)
I am running cpp-coveralls from a shell command in my Jenkins job, but every upload made by Jenkins has "HEAD" as its branch on coveralls.io, whereas the ones I do manually (with the same command line) correctly capture the repository's current branch.
This must have something to do with the way Jenkins checks out the repository. My question is whether you or anyone else has had this problem as well and has some workaround for it?
Thank you,
Currently cpp-coveralls will still send data to coveralls even when no token is defined, which causes coveralls to reject the send (as it cannot associate it with a project).
I think cpp-coveralls should throw an exception just before it would send the report if there is no coveralls token defined, maybe this functionality could be hidden behind a flag.
This is because it took me a while to realise I needed to set a token; I would much prefer for it to throw an exception and explain how to resolve it.
I am happy to do this work if upstream will accept it.
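The guard being proposed could look roughly like this (the -t option is real, per reports elsewhere on this page; treat the COVERALLS_REPO_TOKEN environment variable name as an assumption):

```python
import os
import sys

def require_token(cli_token=None):
    # Prefer an explicit -t token, fall back to the environment
    # (COVERALLS_REPO_TOKEN is an assumed variable name here).
    token = cli_token or os.environ.get('COVERALLS_REPO_TOKEN')
    if not token:
        sys.exit('error: no coveralls repo token configured; pass one '
                 'with -t or set COVERALLS_REPO_TOKEN before uploading')
    return token

os.environ['COVERALLS_REPO_TOKEN'] = 'abc123'
print(require_token())
```

Calling it right before the upload would turn the silent rejection into an early, explained failure.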
Hello.
How can I build and install coveralls to a user-specified directory? I haven't seen such an option in the install script.
Moreover, what should I do to use coveralls on travis-ci with container builds? With containers I can't run sudo easy_install3 cpp-coveralls anyway, because sudo is disabled.
Thanks.
I am using the merge functionality from https://github.com/coagulant/coveralls-python to get C code and Python code reported to coveralls.
The problem I am facing is that both pip install cpp-coveralls and pip install coveralls provide a command called coveralls. I believe I need both packages to get C and Python code covered together, but since both are invoked through the same name, the whole thing is a bit inconvenient. Does anybody see a solution for this?
How can I set the service_name argument? I am trying to use coveralls from Codeship, and service_name is set to travis-ci.
We use both the cpp-coveralls and coveralls packages for the Astropy project (http://www.astropy.org). However, both install a script named coveralls, which is confusing because they accept different options. This leads to issues if cpp-coveralls is installed after coveralls, because its coveralls command then hides the 'real' coveralls command.
Is there a reason why you need to install both coveralls and cpp-coveralls as entry points? Why not just cpp-coveralls?
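The clash comes from both packages declaring a console_scripts entry point named coveralls. A sketch of what a rename could look like in a setup.py (the cpp_coveralls:run target matches the tracebacks elsewhere on this page; treat the exact fragment as illustrative, not as the project's actual packaging):

```python
from setuptools import setup

setup(
    name='cpp-coveralls',
    # Registering only a unique command name would avoid shadowing the
    # Python coveralls package's own `coveralls` script.
    entry_points={
        'console_scripts': [
            'cpp-coveralls = cpp_coveralls:run',
        ],
    },
)
```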
I'm getting an SSL cert error. This error does not happen on the Scala (sbt-coveralls) plugin. It might be a coveralls issue, but it started happening a few days ago.
/home/travis/.local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:334: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
SNIMissingWarning
/home/travis/.local/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:132: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
InsecurePlatformWarning
Traceback (most recent call last):
File "/home/travis/.local/bin/coveralls", line 9, in <module>
load_entry_point('cpp-coveralls==0.3.12', 'console_scripts', 'coveralls')()
File "/home/travis/.local/lib/python2.7/site-packages/cpp_coveralls/__init__.py", line 105, in run
return report.post_report(cov_report)
File "/home/travis/.local/lib/python2.7/site-packages/cpp_coveralls/report.py", line 12, in post_report
response = requests.post(URL, files={'json_file': json.dumps(coverage)})
File "/home/travis/.local/lib/python2.7/site-packages/requests/api.py", line 110, in post
return request('post', url, data=data, json=json, **kwargs)
File "/home/travis/.local/lib/python2.7/site-packages/requests/api.py", line 56, in request
return session.request(method=method, url=url, **kwargs)
File "/home/travis/.local/lib/python2.7/site-packages/requests/sessions.py", line 488, in request
resp = self.send(prep, **send_kwargs)
File "/home/travis/.local/lib/python2.7/site-packages/requests/sessions.py", line 609, in send
r = adapter.send(request, **kwargs)
File "/home/travis/.local/lib/python2.7/site-packages/requests/adapters.py", line 497, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
Hi, complete noob here, just posting a failed run in case someone can have a look. The run is here: https://travis-ci.org/tbonfort/mapserver/builds/9480619
It fails with:
Traceback (most recent call last):
File "/usr/local/bin/coveralls", line 9, in <module>
load_entry_point('cpp-coveralls==0.0.6', 'console_scripts', 'coveralls')()
File "/usr/local/lib/python2.7/dist-packages/coveralls/__init__.py", line 84, in run
return report.post_report(cov_report)
File "/usr/local/lib/python2.7/dist-packages/coveralls/report.py", line 11, in post_report
response = requests.post(URL, files={'json_file': json.dumps(coverage)})
File "/usr/lib/python2.7/json/__init__.py", line 231, in dumps
return _default_encoder.encode(obj)
File "/usr/lib/python2.7/json/encoder.py", line 201, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/usr/lib/python2.7/json/encoder.py", line 264, in iterencode
return _iterencode(o, 0)
UnicodeDecodeError: 'utf8' codec can't decode byte 0x96 in position 2310: invalid start byte
I tried to dump the JSON payload without success using the following command:
coveralls --dryrun --dump payload.json
Is this feature working?
This package works great - very easy to use, does the job perfectly well.
One missing feature is the ability to aggregate coverage from Python code on the reports pushed to coveralls. Have you ever thought about merging the functionality of https://github.com/z4r/python-coveralls or https://github.com/coagulant/coveralls-python into this package?
Please see https://travis-ci.org/luke-jr/bfgminer/jobs/35702927
It seems to be putting the relative path twice.
See this. The file CMakeFiles/CompilerIdCXX/CMakeCXXCompilerId.cpp is generated by CMake, and we do not care about its coverage at all.
The sources of CMake-generated files are in the CMakeFiles directory. However, we cannot simply filter out the CMakeFiles directory, because all object files are stored in CMakeFiles; if it is filtered, no coverage report will be generated.
To filter CMake-generated files, we can only use the full file names. Thus we need to track CMake-generated file names, which is not desirable.
It seems all paths are normalized via os.path.abspath. If this is the case, it would be better to normalize them to absolute paths in argparse instead of tainting the main code with this. It could possibly also speed up execution if you happen to have an insanely huge setup, but the biggest win would be less cruft in the code. One could also imagine os.path.exists checks, so that if the user entered something bogus it would fail early via generic checks, with a meaningful error message.
Since doing this depends on the 'root' parameter, you would have to do it via a custom action, as in the Python documentation:
class FooAction(argparse.Action):
    def __init__(self, option_strings, dest, nargs=None, **kwargs):
        if nargs is not None:
            raise ValueError("nargs not allowed")
        super(FooAction, self).__init__(option_strings, dest, **kwargs)
    def __call__(self, parser, namespace, values, option_string=None):
        print('%r %r %r' % (namespace, values, option_string))
        setattr(namespace, self.dest, values)
A single action could be crafted to support both single and multi-occur parameters.
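A sketch of such an action, resolving each value against a previously-parsed --root (the class and option names here are hypothetical; cpp-coveralls' real options may differ):

```python
import argparse
import os

class AbsPathAction(argparse.Action):
    # Normalize path arguments to absolute paths at parse time, so the
    # main code never has to call os.path.abspath itself.
    def __call__(self, parser, namespace, values, option_string=None):
        root = getattr(namespace, 'root', None) or os.getcwd()
        if isinstance(values, list):          # multi-occur parameters
            resolved = [os.path.abspath(os.path.join(root, v)) for v in values]
        else:
            resolved = os.path.abspath(os.path.join(root, values))
        setattr(namespace, self.dest, resolved)

parser = argparse.ArgumentParser()
parser.add_argument('--root', default='.')
parser.add_argument('--include', action=AbsPathAction)

# --root must appear before --include so it is already in the
# namespace when the include path is resolved.
args = parser.parse_args(['--root', '/tmp', '--include', 'src'])
print(args.include)  # /tmp/src
```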
Is there a way to exclude specific lines/blocks from the coverage report? Python's coverage tool has # pragma: no cover and # pragma: no branch.
Not sure if this is a question for cpp-coveralls or coveralls.io.
With default options, the hit counts for inline functions (or class methods defined in headers) are usually inaccurate. This usually happens when a function is referenced by more than one module, all in the same directory. The reason is that gcov overwrites its *.gcov files. A remedy for this is to use the -l gcov option, i.e.:
coveralls --gcov-options '\-l'
IMHO, a good common practice would be to also use the -p gcov option (preserve paths).
I think this is worth mentioning in the documentation (README). It may save people using cpp-coveralls a lot of trouble.
When I launch coveralls from the command line, I always get this error log:
$ coveralls
Traceback (most recent call last):
File "/usr/local/bin/coveralls", line 5, in <module>
from pkg_resources import load_entry_point
File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 2603, in <module>
working_set.require(__requires__)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 666, in require
needed = self.resolve(parse_requirements(requirements))
File "/System/Library/Frameworks/Python.framework/Versions/2.7/Extras/lib/python/pkg_resources.py", line 565, in resolve
raise DistributionNotFound(req) # XXX put more info here
pkg_resources.DistributionNotFound: requests
I tried with two Python versions (2.7.2 and 3.3.3), but I always get the same result...
I get this error both locally and on the travis build for my test project:
Hello,
After trying to get cpp-coveralls to work together with Travis CI, I got to a point where at least I don't have any files I don't want; the problem is, I don't have any that I want either...
Coveralls: https://coveralls.io/builds/624888
Travis build: https://travis-ci.org/02JanDal/MultiMC5/builds/21541969
.travis.yml: https://github.com/02JanDal/MultiMC5/blob/develop/.travis.yml
As can be seen, I use --exclude CMakeFiles --exclude mmc_updater --exclude include/config.h --exclude tests/test_config.h --exclude-pattern ".*automoc\.cpp" --exclude-pattern ".*\.moc" --exclude-pattern "/opt/.*" --exclude-pattern "/usr/.*" --exclude-pattern ".*moc_.*" --exclude-pattern ".*ui_.*" --exclude-pattern ".*qrc_.*". CMakeFiles gets rid of CMake test stuff (testing compilers etc.), mmc_updater is a third-party dependency, the two config headers are generated files, the automoc, .moc, moc_, ui_ and qrc_ files are also generated, and /opt and /usr are more dependencies. Now for some reason this also gets rid of all of the REAL files that are in logic/ and gui/. Any idea why?
I'm adding this to my header-only library (a single header) and I have some unit tests that exercise the code. Take a look here where I set things up. At the moment it shows 100% coverage, so that's great ;) Unfortunately, the only files that are included are the .cpp files from the tests, and the header-only library is not there.
What I would like is to exclude the unit tests and just see the header-only library listed in coveralls.io.
Maybe this is already possible but I somehow missed it? Thanks!
If I pass an exclude path regex like (^.*foo\/.*$) to -E, a file will be excluded from the report if it was executed, but not if it was unexecuted. Because all the unexecuted files get re-added to the report on each run, this effectively makes the -E option somewhat useless when used for paths.
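The desired behaviour is simply to run every candidate file, executed or not, through the same pattern check. An illustrative sketch (not cpp-coveralls' actual code):

```python
import re

def filter_report(files, exclude_patterns):
    # Apply -E style regexes uniformly, whether a file was executed
    # (has hits) or was merely discovered with zero coverage.
    compiled = [re.compile(p) for p in exclude_patterns]
    return [f for f in files
            if not any(rx.match(f['name']) for rx in compiled)]

report = [
    {'name': 'src/foo/util.c', 'coverage': [1, 2]},   # executed
    {'name': 'src/foo/dead.c', 'coverage': [0, 0]},   # unexecuted
    {'name': 'src/main.c', 'coverage': [5]},
]
kept = filter_report(report, [r'^.*foo/.*$'])
print([f['name'] for f in kept])  # ['src/main.c']
```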
In relation to my project's build, cpp-coveralls doesn't seem able to resolve paths properly. I've begun to tinker with a local copy, but I'm not sure I understand how it determines a path.
Hi @eddyxu, we're rolling out branch coverage support for Coveralls.io and have landed it in two other integration libraries:
Node: nickmerwin/node-coveralls@d571dac
Python: TheKevJames/coveralls-python#145
There are more details about the new branches parameter here:
https://coveralls.zendesk.com/hc/en-us/articles/201350799-API-Reference
I wanted to get it on your radar now in case you or another maintainer has spare time to add this support to cpp-coveralls.
Thank you!
I have a project where all source files live in the src directory.
On travis I install several external libraries.
I think it is more convenient if we can write coveralls -i src rather than exclude the several libraries.
I have fiddled with support for code coverage for our project GlPortal/RadixEngine#30
However, it only started producing proper results after I ran the build process twice:
GlPortal/RadixEngine@59a6968
Any idea what the problem is?
Here is a log of the build:
https://travis-ci.org/GlPortal/RadixEngine/builds/212253813
When passing --include, all --exclude/--exclude-pattern parameters are ignored and file matching only considers if a file is in any of the include paths or not.
The following seems like a pretty valid use case that would be nice to support:
coveralls --include src --exclude src/not-this-file.c --exclude-pattern '.*useless.*'
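A sketch of the semantics being asked for, where a file must be under an include path and also survive every exclude (illustrative only, not the tool's current logic):

```python
import re

def should_cover(path, includes, excludes, exclude_patterns):
    # Included: under at least one -i directory.
    included = any(path == d or path.startswith(d.rstrip('/') + '/')
                   for d in includes)
    # Excluded: an exact -e match or any -E pattern match.
    excluded = (path in excludes or
                any(re.search(p, path) for p in exclude_patterns))
    return included and not excluded

opts = (['src'], ['src/not-this-file.c'], [r'.*useless.*'])
for f in ('src/main.c', 'src/not-this-file.c', 'src/useless_helper.c'):
    print(f, should_cover(f, *opts))
```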
Hi,
We've added cpp-coveralls to https://github.com/surevine/spiffing; however, when we look in coveralls, everything is being thrown into the same build number, suggesting that the travis build number (and other parameters) isn't being pushed to coveralls.io.
Here's our .travis.yml: https://github.com/surevine/spiffing/blob/master/.travis.yml
Any help sorting this appreciated!
cc: @dwd
In coveralls, files without any code are listed with a red 0% coverage even though there is no code to cover!
Some headers do not have any code and thus add a lot of noise to the coveralls report: the beginning of the report is flooded by 0%-covered headers.
It has 4 consequences:
In C++, headers usually have a small (sometimes big) amount of code, so they cannot be excluded from coveralls wholesale (e.g. with --exclude).
Would it be possible to report files without any code as 100% covered ?
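The change being requested amounts to a special case when computing the percentage. A sketch, using the coveralls payload convention that non-code lines have null coverage:

```python
def coverage_percent(line_hits):
    # line_hits: one entry per source line; None marks lines that are
    # not coverable code (the coveralls payload convention).
    coverable = [h for h in line_hits if h is not None]
    if not coverable:
        # No code at all: report the file as fully covered instead of
        # a misleading red 0%.
        return 100.0
    hit = sum(1 for h in coverable if h > 0)
    return 100.0 * hit / len(coverable)

print(coverage_percent([None, None, None]))  # 100.0 (header with no code)
print(coverage_percent([1, 0, None]))        # 50.0
```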
I have a file at /project/root/vobla/status.h and its .gcov file is /project/root/vobla/status.gcov. The first line of it is "Source:../vobla/status.h"; the base directory for this path should be the current directory instead of the root of the project.
In my repository I marked several lines with a comment // LCOV_EXCL_LINE, which works in most cases, but not here: https://coveralls.io/builds/3418340/source?filename=pegtl%2Finternal%2Fraise.hh#L26
Hi!
Using pip I get 0.3.12 installed, but I cannot find that tag on GitHub (https://github.com/eddyxu/cpp-coveralls/tags), which says 0.3.11 is the latest. Did you forget to push that tag?
Best, Sebastian