
go-junit-report's Introduction

go-junit-report

go-junit-report is a tool that converts go test output to a JUnit compatible XML report, suitable for use with applications such as Jenkins.


Install from package (recommended)

Pre-built packages for Windows, macOS and Linux are found on the Releases page.

Install from source

Download and install the latest stable version from source by running:

go install github.com/jstemmer/go-junit-report/v2@latest

Usage

By default, go-junit-report reads go test -v output generated by the standard library testing package from stdin and writes a JUnit XML report to stdout.

Go build and runtime errors are also supported, but this requires that stderr is redirected to go-junit-report as well.

Typical use looks like this:

go test -v 2>&1 ./... | go-junit-report -set-exit-code > report.xml

More examples

JSON produced by go test -json is supported by the gojson parser. Note that stderr still needs to be redirected to go-junit-report in order for build errors to be detected. For example:

go test -json 2>&1 | go-junit-report -parser gojson > report.xml

Go benchmark output is also supported. The following example runs benchmarks for the package in the current directory and uses the -out flag to write the output to a file called report.xml.

go test -v -bench . -count 5 2>&1 | go-junit-report -out report.xml

The -iocopy flag copies stdin directly to stdout, which is helpful if you want to see what was sent to go-junit-report. The following example reads test input from a file called tests.txt, copies the input to stdout and writes the output to a file called report.xml.

go-junit-report -in tests.txt -iocopy -out report.xml

Flags

Run go-junit-report -help for a list of all supported flags.

Flag                  Description
-in file              read go test log from file
-iocopy               copy input to stdout; can only be used in conjunction with -out
-no-xml-header        do not print xml header
-out file             write XML report to file
-package-name name    specify a default package name to use if output does not contain a package name
-parser parser        specify the parser to use, available parsers are: gotest (default), gojson
-p key=value          add property to generated report; properties should be specified as key=value
-set-exit-code        set exit code to 1 if tests failed
-subtest-mode         set subtest mode, modes are: ignore-parent-results, exclude-parents
-version              print version and exit
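
The flags above can be combined in a single invocation. A hypothetical example (the module path and property value are placeholders):

go test -v ./... 2>&1 | go-junit-report -package-name example.com/mymodule -p build=123 -set-exit-code -out report.xml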

Go packages

The test output parser and JUnit XML report generator are also available as Go packages. This can be helpful if you want to use the go test output parser or create your own custom JUnit reports, for example. See the documentation for the junit and parser/gotest packages on pkg.go.dev for more information.
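
A minimal sketch of using these packages directly is shown below. The import paths follow the v2 module layout; Testsuites.WriteXML is mentioned in the changelog below, but the other function names (gotest.NewParser, Parser.Parse, junit.CreateFromReport) are assumptions, so check pkg.go.dev for the exact API before relying on this.

// Sketch: convert go test -v output on stdin to a JUnit XML report on stdout.
// Function names other than Testsuites.WriteXML are assumed from the v2
// package layout; verify them against the package documentation.
package main

import (
	"log"
	"os"

	"github.com/jstemmer/go-junit-report/v2/junit"
	"github.com/jstemmer/go-junit-report/v2/parser/gotest"
)

func main() {
	// Parse go test -v output into an intermediate report.
	parser := gotest.NewParser()
	report, err := parser.Parse(os.Stdin)
	if err != nil {
		log.Fatalf("parsing test output: %v", err)
	}

	// Convert the report to JUnit testsuites and write the XML.
	testsuites := junit.CreateFromReport(report, "") // hostname left empty
	if err := testsuites.WriteXML(os.Stdout); err != nil {
		log.Fatalf("writing XML report: %v", err)
	}
}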

Changelog

v2.1.0

  • Fix #147: Make timestamps in generated report more accurate.
  • Fix #140: Escape illegal XML characters in junit output.
  • Fix #145: Handle build errors in test packages with the _test suffix.
  • Fix #145: Don't ignore build errors that did not belong to a package.
  • Fix #134: Json test output was not parsed correctly when using the -race flag in go test.
  • Add support for === NAME lines introduced in Go 1.20.
  • junit: Add File attribute to testsuite.
  • junit: Allow multiple properties with the same name.
  • junit: Add the Testsuites.WriteXML convenience method.

v2.0.0

  • Support for parsing go test -json output.
  • Distinguish between build/runtime errors and test failures.
  • JUnit report now includes output for all tests and benchmarks, and global output that doesn't belong to any test.
  • Use full Go package name in generated report instead of only last path segment.
  • Add support for reading skipped/failed benchmarks.
  • Add -subtest-mode flag to exclude or ignore results of subtest parent tests.
  • Add -in and -out flags for specifying input and output files respectively.
  • Add -iocopy flag to copy stdin directly to stdout.
  • Add -prop flags to set key/value properties in generated report.
  • Add -parser flag to switch between regular go test (default) and go test -json parsing.
  • Output in JUnit XML is written in <![CDATA[]]> tags for improved readability.
  • Add hostname, timestamp and id attributes to JUnit XML.
  • Improve accuracy of benchmark time calculation and update formatting in report.
  • No longer strip leading whitespace from test output.
  • The formatter and parser packages have been replaced with junit and parser/gotest packages respectively.
  • Add support for parsing lines longer than 64KiB.
  • The JUnit errors/failures attributes are now required fields.
  • Drop support for parsing pre-Go1.13 test output.
  • Deprecate -go-version flag.

Contributing

See CONTRIBUTING.md.

go-junit-report's People

Contributors

ascandella, benzaita, brittinator, cameron-dunn-sublime, greg-dennis, ikarishinjieva, ixdy, janisz, jmillikin-stripe, johnschnake, jstemmer, liggitt, mark-rushakoff, mattdelco, nickpalmer, nipeharefa, nmiyake, nullbio, pascalbourdier, posener, sdowell, themichaellai


go-junit-report's Issues

Some issues pulling down the code with go

Following the readme:

go get -u github.com/jstemmer/go-junit-report

returns a warning (or failure). On my local Mac, I get this:

package github.com/jstemmer/go-junit-report: github.com/jstemmer/go-junit-report is a custom import path for https://github.com/jstemmer/go-junit-report, but /Users/daluu/go/src/github.com/jstemmer/go-junit-report is checked out from git@github.com:jstemmer/go-junit-report

The code does get pulled down to my Mac, however it doesn't automatically get built, so I had to manually build/install it afterwards with go install.

I was also testing it under CircleCI, which failed the CI build with this message:

imports github.com/jstemmer/go-junit-report/parser: /home/ubuntu/.go_workspace/src/github.com/jstemmer/go-junit-report is from git@github.com:jstemmer/go-junit-report, should be from https://github.com/jstemmer/go-junit-report Action failed: go get -u github.com/jstemmer/go-junit-report

Also, I don't know if it makes any difference, but I'm using SSH keys with GitHub for private repo authentication rather than a username and password; that probably shouldn't matter with this public repo, I assume.

So in general, I have to do git clone, cd to the repo, then go install. For comparison, I did not encounter this issue with https://github.com/tebeka/go2xunit, although I prefer your solution over that one.

Has anyone else run into this before? Do you know what the underlying issue is? I've never hit this issue with other GitHub repos of Go code.

Publishing Test Results in Azure Pipelines gives invalid character error

Configured a pipeline to test and publish results. The test is scripted to test a Terraform plan output.
When running the tasks:
go get -u github.com/jstemmer/go-junit-report
go test -v -timeout 60m 2>&1 | go-junit-report > report.xml
and Publish test results
The pipeline passes, but the publish test results task gives the following error:
##[warning]Failed to read D:\a\1\go\src\s***\CentralUS\Global\VirtualNetwork\test\report.xml. Error : '.', hexadecimal value 0x00, is an invalid character. Line 2, position 1..
No build level attachments to publish.
Finishing: Publish Test Results D:\a\1\go\src\s**\report.xml

Failed test output can get lost with repeated and parallel test output

I've been having issues with test suites that are run in parallel and repeated; the relevant test output seems to get lost or mixed up.

This minimal (hand-crafted) example seems to demonstrate the issue:

=== RUN   TestRepeat
=== RUN   TestRepeat
--- FAIL: TestRepeat (0.02 seconds)
	file_test.go:11: This is some debug logging
	file_test.go:11: Error message
	file_test.go:11: Longer
		error
		message.
--- PASS: TestRepeat (0.00s)
FAIL
FAIL      package/repeat-and-parallel    0.001s

This produces the following output, with no failure message content:

<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
        <testsuite tests="2" failures="1" time="0.001" name="package/repeat-and-parallel">
                <properties>
                        <property name="go.version" value="go1.10.3"></property>
                </properties>
                <testcase classname="repeat-and-parallel" name="TestRepeat" time="0.000">
                        <failure message="Failed" type=""></failure>
                </testcase>
                <testcase classname="repeat-and-parallel" name="TestRepeat" time="0.000"></testcase>
        </testsuite>
</testsuites>

The problem goes away if I either give the tests unique names (as if they weren't repeated) or I change the RUN lines so as to not be interleaved (as if it weren't in parallel).

build failures are ignored

One of our tests is currently failing to build.

Output from go 1.4.2 looks like the following:

# k8s.io/kubernetes/pkg/registry/thirdpartyresourcedata/etcd
_output/local/go/src/k8s.io/kubernetes/pkg/registry/thirdpartyresourcedata/etcd/etcd_test.go:74: not enough arguments in call to test.TestCreate
_output/local/go/src/k8s.io/kubernetes/pkg/registry/thirdpartyresourcedata/etcd/etcd_test.go:100: not enough arguments in call to test.TestUpdate
FAIL    k8s.io/kubernetes/pkg/registry/thirdpartyresourcedata/etcd [build failed]

When this is passed through go-junit-report, however, the build failure is completely ignored:

<?xml version="1.0" encoding="UTF-8"?>
<testsuites></testsuites>

Using go-junit-report with compiled go test

I'm trying to use go-junit-report to convert the output of a go test binary file as such:
./example.test | go-junit-report > results.xml
When I cat results.xml it returns an empty xml template like this:
<testsuites><testsuites>

How would I go about using go-junit-report without having to do go test -v?
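
A compiled test binary accepts the standard testing flags with a test. prefix, so -test.v is the compiled equivalent of go test -v. Since the binary's output does not contain a package summary line, the -package-name flag can supply one. A hedged example (the package name is a placeholder):

./example.test -test.v 2>&1 | go-junit-report -package-name example.com/pkg > results.xml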

Tests that may have written to stdout (without a trailing newline) fail

I think this may be the cause of #30 as I had very similar results with some code of mine.

The regexp used to match the go test -v output mandates that the --- PASS (and other tokens) be at the start of the line. However, if you have a test that does something like fmt.Printf("SUCCESS"), the line that go-junit-report tries to parse looks like:
SUCCESS--- PASS ... and so it fails to see the resolution of that test at all and marks it as a failure.

Is it open to consideration that the "start of line" restriction be removed from the regexp so that any test like that would pass? go test does not in any way ensure that its markers about the test (which are what this tool tries to parse) are at the start of a line, so I don't know why go-junit-report should mandate it.
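
For illustration (not taken from the issue), a test like the following reproduces the situation: the write has no trailing newline, so in the go test -v stream the next marker is appended to the same line, e.g. SUCCESS--- PASS: TestPrintsWithoutNewline (0.00s).

package example

import (
	"fmt"
	"testing"
)

func TestPrintsWithoutNewline(t *testing.T) {
	// No trailing newline, so the "--- PASS:" marker for this test ends up
	// on the same line as this output in the go test -v stream.
	fmt.Print("SUCCESS")
}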

Summary of the test is not considered for Fail/Pass status of the test

Looking into the code:

} else if regexSummary.MatchString(line) {

It looks like if the summary is FAIL, the test is still going to be marked as Pass.

Example:
=== RUN TestApplyThrice
2019-05-02T02:00:14.273826Z info Graceful termination period is -10s, starting...
--- PASS: TestApplyThrice (0.00s)
2019-05-02T02:00:14.274025Z info Epoch 0 starting
panic: Fail in goroutine after TestApplyThrice has completed
FAIL istio.io/istio/pilot/pkg/proxy 0.008s

Can't see test logs in junit result xml

After tests are finished I can see the final result, but I cannot see any log output written using the t.Log method from the test.

I ran the following
go test -v ./... 2>&1 | go-junit-report > result.xml

I can see all my logs clearly when not piping into go-junit-report

But inside the XML there are none.

Is it supported?
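
For illustration (not from the original issue), a minimal test using t.Log; whether its output ends up inside the corresponding <testcase> element is exactly what this issue is asking about:

package example

import "testing"

func TestLogsSomething(t *testing.T) {
	// With go test -v this prints a line such as
	//   example_test.go:6: starting the thing
	// which is the output the reporter would need to attach to the test case.
	t.Log("starting the thing")
}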

Tests with common name are represented incorrectly

With testify we can create suites. Different suites may contain tests with the same name. In that case go-junit-report marks the second test in the go test output as failed, even if the second test passed.

Example:
go test output:

=== RUN   TestDiskWriteSuite
=== RUN   TestX
--- PASS: TestX (0.00s)
--- PASS: TestDiskWriteSuite (0.00s)
=== RUN   TestDiskWriteSuite2
=== RUN   TestX
--- PASS: TestX (0.00s)
--- PASS: TestDiskWriteSuite2 (0.01s)
PASS
ok  	command-line-arguments	0.018s

go-junit-report output:

<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
	<testsuite tests="4" failures="1" time="0.018" name="command-line-arguments">
		<properties>
			<property name="go.version" value="go1.7"></property>
		</properties>
		<testcase classname="command-line-arguments" name="TestDiskWriteSuite" time="0.000"></testcase>
		<testcase classname="command-line-arguments" name="TestX" time="0.000"></testcase>
		<testcase classname="command-line-arguments" name="TestDiskWriteSuite2" time="0.010"></testcase>
		<testcase classname="command-line-arguments" name="TestX" time="0.000">
			<failure message="Failed" type=""></failure>
		</testcase>
	</testsuite>
</testsuites>

Tests may have a common name even if we don't use testify suites.
Inside TestMain all package tests can be invoked multiple times via the Run method (for example, with different global settings).
That will also produce repeated test names in the go test log.

BTW go2xunit handles that case correctly and shows two tests separately.

Add properties to formatter.JUnitTestCase

Test cases can have properties. Would it be possible to add a Properties field to formatter.JUnitTestCase?

I'm doing some post processing of the output XML and would like to be able to unmarshal, add properties to test cases, and marshal again.

I'd be happy to send a PR if you're open to it.
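
A sketch of the post-processing described above: unmarshal a generated report with encoding/xml, attach a property to each test case, and marshal it again. The struct definitions are simplified stand-ins for illustration only; they are not the actual formatter types from this repository.

package main

import (
	"encoding/xml"
	"fmt"
	"log"
)

type property struct {
	Name  string `xml:"name,attr"`
	Value string `xml:"value,attr"`
}

type testCase struct {
	ClassName  string     `xml:"classname,attr"`
	Name       string     `xml:"name,attr"`
	Properties []property `xml:"properties>property"`
}

type testSuite struct {
	Name      string     `xml:"name,attr"`
	TestCases []testCase `xml:"testcase"`
}

type testSuites struct {
	XMLName xml.Name    `xml:"testsuites"`
	Suites  []testSuite `xml:"testsuite"`
}

func main() {
	input := []byte(`<testsuites><testsuite name="pkg"><testcase classname="pkg" name="TestA"></testcase></testsuite></testsuites>`)

	var report testSuites
	if err := xml.Unmarshal(input, &report); err != nil {
		log.Fatal(err)
	}

	// Add a property to every test case before writing the XML back out.
	for i := range report.Suites {
		for j := range report.Suites[i].TestCases {
			tc := &report.Suites[i].TestCases[j]
			tc.Properties = append(tc.Properties, property{Name: "owner", Value: "team-a"})
		}
	}

	out, err := xml.MarshalIndent(report, "", "  ")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(out))
}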

Incompatibility with go test's -coverpkg flag

When invoking go-junit-report -go-version 1.8.3 -set-exit-code:

Without the -coverpkg flag:

coverage: 60.6% of statements
<properties>
	<property name="go.version" value="1.8.3"></property>
	<property name="coverage.statements.pct" value="60.6"></property>
</properties>

With the -coverpkg flag:

coverage: 86.4% of statements in example.com/package/name
<properties>
	<property name="go.version" value="1.8.3"></property>
</properties>

associate filename with test

Hi,
I use your tools for my sonarqube plugin.

In sonarqube we must associate the tests with the test file.

But actually all the tests is associate with the package. So is it possible adding the information about the filename that contain the test ?

Thanks.
Regards.

"go.version" property uses version of Go used to compile the tool

The go.version property in the output XML test report is the version of Go used to compile the go-junit-report tool (as opposed to the version of Go used to run the tests), which can be confusing.

(1) Compile go-junit-report using Go 1.7.5
(2) Change the local version of go to Go 1.8, run unit tests, and then use go-junit-report to transform the output.

Expected

go.version should be go1.8

Actual

go.version is go1.7.5

Cause

This is due to the call at https://github.com/jstemmer/go-junit-report/blob/master/junit-formatter.go#L82. This property is set and compiled into the go-junit-report binary itself, so once the binary is compiled the version is set.

There are 2 possible approaches I can think of for addressing this:

  1. Make the program call go version and parse out the version string from there (see the sketch after this list)
  • This ensures that the version matches the version used by go at the time of invocation
  • However, this is not guaranteed to be correct since the original test output being processed may have been run using a different version of Go
  2. Drop the go.version property altogether
  • Since it's impossible to determine the version of go that was used to generate the output being parsed, it might be better to omit this version entirely rather than to guess based on the runtime information or go executable information
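
A minimal sketch of approach 1 (illustrative only, not code from this repository). It shells out to go version at report time instead of baking runtime.Version() into the binary, with the caveat noted above that this still may not match the Go version that produced the test output.

package main

import (
	"fmt"
	"log"
	"os/exec"
	"strings"
)

// goVersion runs "go version" and returns the version token, e.g. "go1.8".
func goVersion() (string, error) {
	out, err := exec.Command("go", "version").Output()
	if err != nil {
		return "", err
	}
	// Typical output: "go version go1.8 linux/amd64"; the third field is the version.
	fields := strings.Fields(string(out))
	if len(fields) < 3 {
		return "", fmt.Errorf("unexpected go version output: %q", out)
	}
	return fields[2], nil
}

func main() {
	v, err := goVersion()
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(v)
}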

Although this isn't critical, I think it would be nice to address. I'm running into this in my CI environment, where I build go-junit-report once, but then run my tests using different versions of go and process its output. All of the test output reports have a go.version property with a value of go1.7.5 (the version of Go used to compile go-junit-report), but this is confusing to have for tests that were actually run using Go 1.8.

Benchmark ability only works with 2-9 CPUs

When running benchmarks, normally the CPU number is equal to the GOMAXPROCS. On different hosts, this can be 1, 8, 16, etc. Furthermore, a person running benchmarking can set this themselves to any number she wants. See go/src/cmd/go/alldocs.go for more details.

What this means is that when go test -bench finishes, it reports the name of the benchmark with the number of CPUs it ran on appended.

ex:

$go test -v  -bench -cpu 1
Benchmark 	  100000	     16808 ns/op

$go test -v  -bench -cpu 4
Benchmark-4   	  100000	     13534 ns/op

$go test -v -run asldkf -bench Lookup -cpu 16
Benchmark-16    	  100000	     13571 ns/op

In order for the benchmarking ability to be useful on all hosts, the regex needs to be adjusted slightly to account for these small variances.
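
A hedged sketch of such an adjusted pattern (illustrative only, not the tool's actual regexp); the optional -N suffix accepts any number of digits rather than a single one:

package main

import (
	"fmt"
	"regexp"
)

// benchmarkLine matches "BenchmarkFoo", "BenchmarkFoo-4" or "Benchmark-16"
// followed by the iteration count and the ns/op value.
var benchmarkLine = regexp.MustCompile(`^(Benchmark[^\s-]*)(?:-(\d+))?\s+(\d+)\s+([\d.]+) ns/op`)

func main() {
	for _, line := range []string{
		"Benchmark   100000 16808 ns/op",
		"Benchmark-16   100000 13571 ns/op",
	} {
		fmt.Println(benchmarkLine.FindStringSubmatch(line))
	}
}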

Test report will be all failure with Go1.5

Because of the test case identifying logic:

if strings.HasPrefix(line, "=== RUN ") {
	// new test
	cur = line[8:]
	tests = append(tests, &Test{
		Name:   line[8:],
		Result: FAIL,
		Output: make([]string, 0),
	})
}

And since Go 1.5 tweaked the test output a little bit, there will be two more spaces before the actual name in the generated report, like:

<testcase classname="A" name="  TestA" time="0.000">
    <failure message="Failed" type=""></failure>
</testcase>

As a result, this program will not be able to update the actual status of the test cases, because it cannot find any matching test cases via the findTest() function.

I suggest that we use a regex to get the test case name, just like how we get the test status; a sketch follows.
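
A sketch of that suggestion (illustrative only, not the project's actual code): extract the name with a regexp so that any amount of whitespace after === RUN is tolerated.

package main

import (
	"fmt"
	"regexp"
)

// runLine extracts the test name from a "=== RUN" line regardless of how
// many spaces separate the marker from the name.
var runLine = regexp.MustCompile(`^=== RUN\s+(\S+)`)

func main() {
	if m := runLine.FindStringSubmatch("=== RUN   TestA"); m != nil {
		fmt.Println(m[1]) // TestA
	}
}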

Might submit a pull request later.

Benchmark failure not correctly tracked

I expect that if a benchmark that I'm running fails, the XML report will note that failure. This is not necessarily the case, however.

Here is an example:

command:
go test -v -bench=. -test.benchtime=10s ./bench.go

go test output:

goos: linux
goarch: amd64
Benchmark_Sample1-4       	   50000	    227422 ns/op
--- FAIL: Benchmark_Sample10
    bench.go:43: Unexpected error
--- FAIL: Benchmark_Sample100
    bench.go:43: Unexpected error
--- FAIL: Benchmark_Sample1000
    bench.go:43: Unexpected error
--- FAIL: Benchmark_Sample10000
    bench.go:43: Unexpected error
FAIL
exit status 1
FAIL	command-line-arguments	13.986s

go-junit-report output:

<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
        <testsuite tests="1" failures="0" time="13.986" name="command-line-arguments">
                <properties>
                        <property name="go.version" value="go1.11"></property>
                </properties>
                <testcase classname="command-line-arguments" name="Benchmark_Sample1" time="0.000328058"></testcase>
        </testsuite>
</testsuites>

Note that the four failed benchmarks are not included in the XML output.

Data race failures aren't reported in XML output.

I think this is caused by the individual run passing while the overall suite fails. I'm not sure what the best way to represent this in the junit.xml is; maybe make the closest testcase fail, especially if it's interleaved?

Example (from kubernetes/kubernetes#35872):

=== RUN   TestNamespaceController
==================
WARNING: DATA RACE
Read at 0x00c420237758 by goroutine 20:
  reflect.Value.IsNil()
      /usr/local/go/src/reflect/value.go:975 +0xd8
  reflect.deepValueEqual()
      /usr/local/go/src/reflect/deepequal.go:70 +0xe8b
  reflect.deepValueEqual()
      /usr/local/go/src/reflect/deepequal.go:97 +0xe09
  reflect.deepValueEqual()
      /usr/local/go/src/reflect/deepequal.go:97 +0xe09
  reflect.deepValueEqual()
      /usr/local/go/src/reflect/deepequal.go:94 +0xc9b
  reflect.DeepEqual()
      /usr/local/go/src/reflect/deepequal.go:185 +0x2bb
  k8s.io/kubernetes/federation/pkg/federation-controller/util.NewTriggerOnAllChanges.func3()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/util/handlers.go:42 +0xc7
  k8s.io/kubernetes/pkg/client/cache.ResourceEventHandlerFuncs.OnUpdate()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/client/cache/controller.go:175 +0x6f
  k8s.io/kubernetes/pkg/client/cache.(*ResourceEventHandlerFuncs).OnUpdate()
      <autogenerated>:51 +0xb0
  k8s.io/kubernetes/pkg/client/cache.NewInformer.func1()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/client/cache/controller.go:241 +0x3ab
  k8s.io/kubernetes/pkg/client/cache.(*DeltaFIFO).Pop()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/client/cache/delta_fifo.go:420 +0x36b
  k8s.io/kubernetes/pkg/client/cache.(*Controller).processLoop()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/client/cache/controller.go:126 +0x70
  k8s.io/kubernetes/pkg/client/cache.(*Controller).(k8s.io/kubernetes/pkg/client/cache.processLoop)-fm()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/client/cache/controller.go:102 +0x41
  k8s.io/kubernetes/pkg/util/wait.JitterUntil.func1()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/util/wait/wait.go:87 +0x6f
  k8s.io/kubernetes/pkg/util/wait.JitterUntil()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/util/wait/wait.go:88 +0xbd
  k8s.io/kubernetes/pkg/util/wait.Until()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/util/wait/wait.go:49 +0x5a
  k8s.io/kubernetes/pkg/client/cache.(*Controller).Run()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/pkg/client/cache/controller.go:102 +0x24e

Previous write at 0x00c420237758 by goroutine 23:
  k8s.io/kubernetes/federation/pkg/federation-controller/namespace.(*NamespaceController).addFinalizerFunc()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/namespace/namespace_controller.go:228 +0x159
  k8s.io/kubernetes/federation/pkg/federation-controller/namespace.(*NamespaceController).(k8s.io/kubernetes/federation/pkg/federation-controller/namespace.addFinalizerFunc)-fm()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/namespace/namespace_controller.go:174 +0x76
  k8s.io/kubernetes/federation/pkg/federation-controller/util/deletionhelper.(*DeletionHelper).EnsureFinalizers()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/util/deletionhelper/deletion_helper.go:101 +0x39f
  k8s.io/kubernetes/federation/pkg/federation-controller/namespace.(*NamespaceController).reconcileNamespace()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/namespace/namespace_controller.go:359 +0x29a
  k8s.io/kubernetes/federation/pkg/federation-controller/namespace.(*NamespaceController).Run.func2()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/namespace/namespace_controller.go:272 +0xb8
  k8s.io/kubernetes/federation/pkg/federation-controller/util.(*DelayingDeliverer).StartWithHandler.func1()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/util/delaying_deliverer.go:176 +0x105

Goroutine 20 (running) created at:
  k8s.io/kubernetes/federation/pkg/federation-controller/namespace.(*NamespaceController).Run()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/namespace/namespace_controller.go:264 +0x73
  k8s.io/kubernetes/federation/pkg/federation-controller/namespace.TestNamespaceController()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/namespace/namespace_controller_test.go:113 +0x151a
  testing.tRunner()
      /usr/local/go/src/testing/testing.go:610 +0xc9

Goroutine 23 (running) created at:
  k8s.io/kubernetes/federation/pkg/federation-controller/util.(*DelayingDeliverer).StartWithHandler()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/util/delaying_deliverer.go:181 +0x56
  k8s.io/kubernetes/federation/pkg/federation-controller/namespace.(*NamespaceController).Run()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/namespace/namespace_controller.go:273 +0x155
  k8s.io/kubernetes/federation/pkg/federation-controller/namespace.TestNamespaceController()
      /go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/federation/pkg/federation-controller/namespace/namespace_controller_test.go:113 +0x151a
  testing.tRunner()
      /usr/local/go/src/testing/testing.go:610 +0xc9
==================
--- PASS: TestNamespaceController (1.16s)
PASS
Found 1 data race(s)
FAIL	k8s.io/kubernetes/federation/pkg/federation-controller/namespace	3.373s
=== RUN   TestParseFederationReplicaSetReference
--- PASS: TestParseFederationReplicaSetReference (0.00s)

output:

<testsuite tests="1" failures="0" time="3.373" name="k8s.io/kubernetes/federation/pkg/federation-controller/namespace">
<properties>
<property name="go.version" value="go1.7.1"/>
</properties>
<testcase classname="namespace" name="TestNamespaceController" time="1.160"/>
</testsuite>

The error message should be more meaningful

I just ran into an issue where, if the go test -v output is like the following:

=== RUN TestManifestOK
receivedManifestFile = true
manifestBytes : {
"name" : "mysampleapp",
"description" : "This is my hello world sample app",
"version" : "1.0"
}--- PASS: TestManifestOK (0.00s)

Then the generated junit report will be like this:

    <testcase classname="builder" name="TestManifestOK" time="0.000">
        <failure message="Failed" type=""></failure>
    </testcase>

I think the 'Failed' result is caused by the '}' immediately preceding '--- PASS', which prevents the parser from reading the line correctly. That is fine, but I hope the failure message could be more meaningful; otherwise it is hard to know what is happening for someone who is not familiar with go-junit-report.

Can't import the package

Hello,

When I try to go get -u -v github.com/jstemmer/go-junit-report, I get this error:
`runtime

runtime

/usr/local/go/src/runtime/stubs_x86.go:10:6: stackcheck redeclared in this block
previous declaration at /usr/local/go/src/runtime/stubs_amd64x.go:10:6
/usr/local/go/src/runtime/unaligned1.go:11:6: readUnaligned32 redeclared in this block
previous declaration at /usr/local/go/src/runtime/alg.go:321:40
/usr/local/go/src/runtime/unaligned1.go:15:6: readUnaligned64 redeclared in this block
previous declaration at /usr/local/go/src/runtime/alg.go:329:40`

What does it mean? Can you help me?

Bench support

Hello, my colleague @brittinator and I are using go-junit-report to convert test results to junit.xml, which Jenkins then uses to show us stats.

junit.xml has a "time" field and go-junit-report fills it in. But in Go we care about "time" mostly for benchmark tests, so we started to look for a solution. As a first step, we would like the results of "go test -bench=. -run= ..." to appear in the junit.xml as well, using the "time" field.

I think we can even do it without extra modes/flags but naturally like:

go test >test.log
go test -bench=. -run=BenchmarkIpsHistoryLookup -benchmem -count 20 project/* >> test.log
cat test.log | go-junit-report > report.xml

Note that we want to make a smart converter that takes at least -count into account, and I think we can avoid regressions (or we can add flags if there is any risk).

The questions:

  1. Have you considered adding Benchmarks to the tool? If so, please share details or give us a link to previous discussion.
  2. We are willing to contribute and provide this functionality. Would you accept it to main branch?

set exit code when go test "setup failed"

When go has a "setup failed", using -set-exit-code still returns a 0 exit code.
Related to #46

example for test output:

# github.com/name/project/package
packge/file.go:15:2: cannot find package "other/package" in any of:
	/path/venodr (vendor tree)
	/path/go/root (from $GOROOT)
	/path/go/path (from $GOPATH)
FAIL	github.com/name/project/package [setup failed]

False Failed reports for go test with -cpu flags

When we use the go test -v -cpu 1,2,4 command, which makes tests run three times with different GOMAXPROCS settings, go-junit-report generates lots of false Failed tests in the XML report.

Does this integrate with Goconvey BDD test code?

I have stumbled upon, and recently made use of, the Goconvey BDD framework. It is, as BDD frameworks usually are, great. Unfortunately, the test reports are not exportable.

Will this tool run, and report any test fails on, Goconvey code?

Subtests that fail report their parent tests as separate test cases

This might be unavoidable, but it's a little annoying -

We maintain a Google App Engine app with a test suite that relies on an app engine shared dev server. We initialize the dev server in one test and defer its shutdown while running all other tests as subtests, e.g.:

func TestAuthenticationHandlers(t *testing.T) {
	var err error
	testInstance, err = aetest.NewInstance(nil)
	if err != nil {
		t.Fatalf("Failed to create instance: %v", err)
	}
	defer testInstance.Close()

	t.Run("AuthenticationHandlerTests", func(t *testing.T) {
		t.Run("LoginHandler", testLoginHandler)
		t.Run("RegisterHandler", testRegisterHandler)
		t.Run("VerifyEmailHandler", testVerifyEmailHandler)
		t.Run("ResetPasswordHandler", testResetPasswordHandler)
		t.Run("SetPasswordHandler", testSetPasswordHandler)
		t.Run("TestChangePasswordHandler", testChangePasswordHandler)
		t.Run("TestTOTPChangeSeedHandler", testTOTPChangeSeedHandler)
		t.Run("TestTOTPSetupHandler", testTOTPSetupHandler)
		t.Run("TestTOTPVerificationHandler", testTOTPVerificationHandler)
		t.Run("TestTOTPConfirmChangeHandler", testTOTPConfirmChangeHandler)
		t.Run("TestTOTPConfirmSetupHandler", testTOTPConfirmSetupHandler)
	})
}

If one of those subtests fails (let's say LoginHandler), go-junit-report describes 3 separate tests as having failed:
TestAuthenticationHandlers, TestAuthenticationHandlers/AuthenticationHandlerTests and TestAuthenticationHandlers/AuthenticationHandlerTests/LoginHandler

I'm not sure if there's a way to only report the lowest-level test failing, but it'd certainly be useful.

The only obvious solution I can think of to our problem is to move away from this model of running tests and just use TestMain, but I thought I'd bring it up anyways.
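
For reference, the -subtest-mode flag listed in the Flags section above targets this kind of reporting. Assuming the exclude-parents mode drops the parent entries of subtests, an invocation would look like:

go test -v ./... 2>&1 | go-junit-report -subtest-mode exclude-parents > report.xml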

Test output from cmp.Diff missing `+` and `-` signs after go-junit-report parsing

What happened: test output from cmp.Diff contains + and - signs in front of each line with diffs; these were ignored in the XML output. See #94 for an example, the test result is available here: https://travis-ci.org/jstemmer/go-junit-report/jobs/583821403

Example input:

=== RUN TestOne
--- FAIL: TestOne (0.02 seconds)
	file_test.go:11: Error message
	file_test.go:11:
		t{
              	{
            - 		D1,
            + 		D2,
              	},
		}
=== RUN TestTwo
--- PASS: TestTwo (0.13 seconds)
FAIL
exit status 1
FAIL	package/name 0.151s

Expect:

file_test.go:11: Error message
        file_test.go:11: t
        	{
        -	D1,
        +	D2,
        },
        	}

Got:

file_test.go:11: Error message
        file_test.go:11:
        	t{
        {
        	D1,
        	D2,
        },
        	}

Subtests (go 1.7) interpreted as failures

The following output from go test -v | go-junit-report results in false positives for errors.

Test results

=== RUN   TestFUpload
=== RUN   TestFUpload/POSTs_with_filename_body_result_in_200
--- PASS: TestFUpload (0.00s)
    --- PASS: TestFUpload/POSTs_with_filename_body_result_in_200 (0.00s)
=== RUN   TestFDownload
=== RUN   TestFDownload/GETs_return_signed_url
--- PASS: TestFDownload (0.00s)
    --- PASS: TestFDownload/GETs_return_signed_url (0.00s)
=== RUN   TestFDownloadHandlesEdges
--- PASS: TestFDownloadHandlesEdges (0.00s)
=== RUN   TestHealthEndpoint
--- PASS: TestHealthEndpoint (0.00s)
PASS
ok      github.com/ployst/ployst/pfs/fileservice        0.027s
<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
        <testsuite tests="6" failures="2" time="0.023" name="github.com/ployst/ployst/pfs/fileservice">
            <properties>
                <property name="go.version" value="go1.5.1"></property>
            </properties>
            <testcase classname="fileservice" name="TestFUpload" time="0.000"></testcase>
            <testcase classname="fileservice" name="TestFUpload/POSTs_with_filename_body_result_in_200" time="0.000">
                <failure message="Failed" type=""></failure>
            </testcase>
            <testcase classname="fileservice" name="TestFDownload" time="0.000"></testcase>
            <testcase classname="fileservice" name="TestFDownload/GETs_return_signed_url" time="0.000">
                <failure message="Failed" type=""></failure>
            </testcase>
            <testcase classname="fileservice" name="TestFDownloadHandlesEdges" time="0.000"></testcase>
            <testcase classname="fileservice" name="TestHealthEndpoint" time="0.000"></testcase>
        </testsuite>
</testsuites>

Also not sure why golang is being detected by this library as 1.5.1
$ go version
go version go1.7.1 darwin/amd64

Can't use the types to unmarshal junit

Effectively, I would assume that I could use the types defined here to marshal/unmarshal the data repeatedly, but that just isn't the case.

See vmware-tanzu/sonobuoy#849 for details, but effectively I think the testsuite/testcase types just need an additional annotation to facilitate this.

Skipped tests are interpreted as failures

when running

package main

import "testing"

func TestNothing(t *testing.T) {
  t.Skip("skip")
}

with

go test -v % | go-junit-report

The output is

<?xml version="1.0" encoding="UTF-8"?><testsuite tests="1" failures="1" time="0.006" name="command-line-arguments">
    <properties>
        <property name="go.version" value="go1.2"></property>
    </properties>
    <testcase classname="command-line-arguments" name="TestNothing" time="0.000">
        <failure message="Failed" type="">tmp_test.go:6: skip</failure>
    </testcase>
</testsuite>

Add flag to prefix testsuite name

I would like the ability to prefix the testsuite name with a custom string.

This will allow me to run tests, and generate unique XML reports for various combinations of $GOOS, $GOARCH, and other custom build tags.

E.g: -package-prefix="linux/arm/"

<testsuite ... name="linux/arm/github.com/example/example">

This is mainly so that I can have Jenkins read multiple junit reports, and aggregate them in a single pipeline job:

(screenshot: Jenkins job aggregating multiple JUnit reports in a single pipeline)

Would you be open to a PR that adds a flag that will prepend something to the test suite package name like above?

Test output is assumed to be tab-prefixed

It may be true for test output coming from the testing package's Log methods, but I have some tests with meaningful log output and failures generated by gocheck that aren't included in the XML because they don't start with a tab.
Where did the assumption of tab-prefixed console output come from?
Could we just remove that check in the parser?

Distinguish between test errors and failures.

Currently, a test "error" is classified the same as a test "failure". A failure occurs when the Go test framework concludes that the test fails and prints "FAIL" to STDOUT. A test error can happen when an incorrectly written test is able to escape the test framework. One example of this is when a test exits early. Tests that exit early will interrupt the current test, and any remaining tests in the same package from running. go-junit-report will interpret these tests as failures, when in fact they are neither a success nor failure.

Relates to #101

go-junit-report: command not found

I have a test named plan_test.go, which does a terraform plan and logs resource changes, in the GOPATH. When I run:
go install github.com/jstemmer/go-junit-report
go test -v 2>&1 | go-junit-report > report.xml
it throws an error in bash and PowerShell as follows:

bash: go-junit-report: command not found

go-junit-report : The term 'go-junit-report' is not recognized as the name of a cmdlet, function, script file, or
operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try
again.

Multiple test suites are not wrapped in <testsuites> tag

The current output of go test ./... -v | go-junit-report is of the form

<?xml version="1.0" encoding="UTF-8"?><testsuite tests="3" failures="1" time="0.007" name="_/Users/peter/tmp/tests">
    <properties>
        <property name="go.version" value="go1.3"></property>
    </properties>
    <testcase classname="tests" name="TestFail" time="0.000">
        <failure message="Failed" type="">tmp_test.go:12: fail</failure>
    </testcase>
</testsuite>
<testsuite tests="3" failures="1" time="0.007" name="_/Users/peter/tmp/tests/subdir">
    <properties>
        <property name="go.version" value="go1.3"></property>
    </properties>
    <testcase classname="subdir" name="TestFail" time="0.000">
        <failure message="Failed" type="">tmp_test.go:12: fail</failure>
    </testcase>
</testsuite>

It should be of the form

<?xml version="1.0" encoding="UTF-8"?>
<testsuites tests="2" failures="2" time="0.014" name="_/Users/peter/tmp/tests">
  <testsuite tests="1" failures="1" time="0.007" name="_/Users/peter/tmp/tests">
    <properties>
      <property name="go.version" value="go1.3"></property>
    </properties>
    <testcase classname="tests" name="TestFail" time="0.000">
      <failure message="Failed" type="">tmp_test.go:12: fail</failure>
    </testcase>
  </testsuite>
  <testsuite tests="1" failures="1" time="0.007" name="_/Users/peter/tmp/tests/subdir">
    <properties>
      <property name="go.version" value="go1.3"></property>
    </properties>
    <testcase classname="subdir" name="TestFail" time="0.000">
      <failure message="Failed" type="">tmp_test.go:12: fail</failure>
    </testcase>
  </testsuite>
</testsuites>

Mis-detected failures with `gb test -v`

In general, gb test seems to be compatible (it's supposed to be), but a couple of random things seem to be failing.

Plain gb test -v results:
testresults.txt

After go-junit-report:
testresults.xml.txt

The TestStop and TestLaunch failures might be #18, but I'm not sure why others wouldn't have also failed. I'm not sure at all why the others are getting marked failed, however, since there's no output of any sort that suggests it should.

Any ideas? I'd love to be able to have jenkins parse these results for me.

Nested TestSuites

Does this current implementation allow for nested suites? Say, if my package structure looks like:

.
└── parent
    ├── childA
    │   └── grandChildA
    └── childB

Which test suites would be created? Will the grandChildA suite be nested within childA suite?

Json Output

Would it be possible to have this support either xml or json?

Packages that panic/fail before running any test are not reported

If a package has tests but fails in an init(), the package in the report has no indication of this failure.

In case of a panic in an init(), the output will look like:

panic: the description of the panic here

goroutine 1 [running]:
panic(....)
   /usr/local/go/src/runtime/panic.go:500....
...more stack frames...
...more goroutines...
FAIL  package/name 0.01s

Fail in the TestMain not recognized

Hey.
We have a file tests.go which does some initialization in the func TestMain(m *testing.M).

If initialization fails, go-junit-report does not recognize it as a failure (probably because the test name is missing in the output, since there is no test name yet). Example:

?   	my_pacakge/file1	[no test files]
?   	my_pacakge/file2	[no test files]
time="2018-03-28T12:47:03Z" level=fatal msg="cannot connect to mysql: ..."
FAIL	my_packege/tests	0.156s
=== RUN   Test1
--- PASS: Test1 (0.00s)
=== RUN   Test2
--- PASS: Test2 (0.00s)

How we start tests:
go test -v ./... 2>&1 | tee /dev/stderr | go-junit-report -set-exit-code > test-reports/report.xml

Since the failure was not recognized, the command exits with a 0 exit code.

We figured out the workaround, running:
set -o pipefail; go test -v ./... 2>&1 | tee /dev/stderr | go-junit-report -set-exit-code > test-reports/report.xml this way command exits with non-0 exitcode. But would be nice to have it natively handled with proper report.

Thanks
