
json-report


JSON reporting plugin for Gauge

Installation

Install through Gauge

gauge install json-report

Installing a specific version:

gauge install json-report --version 0.1.0

Offline installation

gauge install json-report --file <path_to_plugin_zip_file>

Usage

Add this plugin to your Gauge project by registering it in the manifest.json file. You can also do this by running:

gauge install json-report

By default, reports are generated in the reports/json-report directory of your Gauge project. You can set a custom location by setting the property below in the default.properties file of the env/default directory.

#The path to the gauge reports directory. Should be either relative to the project directory or an absolute path
gauge_reports_dir = reports

You can also choose to overwrite the reports after each execution or retain all of them, as follows.

#Set as false if gauge reports should not be overwritten on each execution. A new time-stamped directory will be created on each execution.
overwrite_reports = true

Build from Source

Requirements

Compiling

go run build/make.go

For cross-platform compilation

go run build/make.go --all-platforms

Installing

After compilation

go run build/make.go --install

Installing to a CUSTOM_LOCATION

go run build/make.go --install --plugin-prefix CUSTOM_LOCATION

Creating distributable

Note: Run after compiling

go run build/make.go --distro

For distributables across platforms (Windows and Linux, both x86 and x86_64):

go run build/make.go --distro --all-platforms

Contributing

  • Identify/pick an issue
  • Raise a pull request
  • One of the maintainers will review and merge

Release

GitHub Actions is set up for testing and deploying this project. Tests run for all pull requests; however, to make a release, a deployment has to be triggered.

To do a release:

json-report's People

Contributors

apoorvam, brudibanani, debashis9012, dependabot[bot], henriheimann, nehashri, osandadeshan, shubhamsc, sriv

json-report's Issues

"failedScenariosCount" is invalid in the json-report

This is the json report for my execution.
result.zip
At the end of the file, it shows:
"beforeSuiteHookFailure": null, "afterSuiteHookFailure": null, "passedSpecsCount": 14, "failedSpecsCount": 1, "skippedSpecsCount": 0, "passedScenariosCount": 45, "failedScenariosCount": 0, "skippedScenariosCount": 0
Here "failedSpecsCount" is 1 but "failedScenariosCount" is 0,
even though the actual number of failed scenarios is 1. This is the error for that failure:
Errors:

        [Parse Error] C:\Program Files (x86)\Jenkins\workspace\MaxSoft IntelliAPI\specs\Other\Form-Data Example.spec:20 Dynamic param  could not be resolved, Missing file: C:/client_id.txt => '   |client_id    |    |'
      
        [Parse Error] C:\Program Files (x86)\Jenkins\workspace\MaxSoft IntelliAPI\specs\Other\Form-Data Example.spec:21 Dynamic param  could not be resolved, Missing file: C:/client_secret.txt => '   |client_secret||'
      
        [Parse Error] C:\Program Files (x86)\Jenkins\workspace\MaxSoft IntelliAPI\specs\Other\Form-Data Example.spec:22 Dynamic param  could not be resolved, Missing file: C:/refresh_token.txt => '   |refresh_token||'


This is the html-report screenshot
image

The HTML report also did not identify the failed scenario; it too shows the failed scenario count as 0.

export json report types to allow unmarshaling

Hi,

I would like to process the json report, and I need the types so I could unmarshal it.

Currently the types are not exported; could you please export them (e.g. suiteResult)?

Thanks.

Scenario Heading is the same for each row of a CSV or Table execution

When running a scenario with a table with many rows, the report shows the same Scenario Heading for each row of the table. This causes confusion as to which row had a failure in order to troubleshoot the issue.

We sometimes have over 100 rows in our CSV files. Troubleshooting can take a very long time narrowing down which row had the failure.

In IntelliJ, when the test runs, the scenario heading has a suffix _<row-number> added at the end. The XML report does the same. It would be nice if we could get the same functionality.

Sample:

XML Report
Json Report

Link does not allow us to download in our company the JSON Report Plugin

All,

We are unable to download the JSON Report plugin because our security team suspects that it is a dangerous link: https://github.com/**apoorvam**/json-report/releases/download/0.2.2/json-report-0.2.2-windows.x86_64.zip.

I am able to download the XML Report plugin as its URL does not contain a person's name. Can you please have the repo link corrected for the latest version of JSON Report, so that it is in line with the standards of the other report plugins and the security team in my company can download this plugin for me?

Thanks,
Hari

Execution status should be in past tense

Steps to recreate:

  1. Run a sample project in which all the tests pass
  2. Open the json-report file
  3. The execution status shows as "pass". It should be "Passed"
  4. Run a sample project in which all the tests fail
  5. Open the json-report file
  6. The execution status shows as "fail". It should be "Failed"
  7. Run a sample project in which all the tests are skipped
  8. Open the json-report file
  9. The execution status shows as "skip". It should be "Skipped"

Gauge Version
Gauge version: 1.0.4
Commit Hash: 3a9a647

Plugins

flash (0.0.1)
html-report (4.0.6)
java (0.7.2.nightly-2019-03-07)
js (2.3.2)
json-report (0.2.2)
python (0.3.4)
screenshot (0.0.1)
spectacle (0.1.3)
xml-report (0.2.1)

[Feature] Ability to add user data to the report

The Json report makes it possible to take the results of an execution, parse them and use them in various ways, or create other types of reports. However, this report currently lacks the ability to add custom user data. The use case for this would be something as follows.

  • Add custom Test Execution ID to the report
  • Add context to the tests. For example,
    -- which system or service was being tested
    -- which user executed the test
    -- whether the test was manually executed, part of CI/CD or scheduled etc
    -- many others

One way to do this is to use Gauge.message() in Java. However, this only works at the step level for the json-report. The HTML Report is able to show messages at the suite, scenario and step level.

The best outcome would be the ability to create your own key/value data in the json directly.

JSON Report Snippet

	"projectName": "defects-example",
	"timestamp": "May 8, 2020 at 2:02pm",
	"successRate": 100,
	"myData": "Some Value",
	"specResults": [
	    ....

failure screenshot is missing in json report

In recent updates to Gauge, we are not able to see the screenshot data in the json report. Please see the screenshot.
image

Gauge version details:
Gauge version: 1.1.1
Plugins

html-report (4.0.10)
java (0.7.7)
json-report (0.3.3)
screenshot (0.0.1)
xml-report (0.2.3)

Note: the html report shows the failure screenshot; only the json report has the issue.

Concept arguments not showing up in step text.

When using a concept the parameters that are passed in don't show up anywhere in the json report. For example

## Login with concept
* Log into github as "TacoTestAccount" using password "testAcountPassword".

shows up in the report as is with no substitution

"itemType": "concept",
"conceptStep": {
    "itemType": "step",
    "stepText": "Log into github as \u003cusername\u003e using password \u003cpassword\u003e.",
    "table": null,
    "beforeStepHookFailure": null,
    "afterStepHookFailure": null,
    "result": {
        "status": "pass",
        "stackTrace": "",
        "screenshot": "",
        "errorMessage": "",
        "executionTime": 9317,
        "skippedReason": "",
        "messages": [],
        "errorType": "assertion"
    }
}

Expected Behaviour

Have some way to determine what was passed into the concept. Either through the step text or as another json property.

Version

Gauge version: 1.0.3
Commit Hash: ff6c0c3

Plugins
-------
flash (0.0.1)
html-report (4.0.6)
js (2.3.2)
json-report (0.2.2)
python (0.3.4)
screenshot (0.0.1)
xml-report (0.2.1)

Json plugin reports are not consistently the same

The Json plugin, for the same test, sometimes shows step details and sometimes shows details only at the spec level.

I ran my test twice and the two reports are different. In one report all the step details are present, but the second report does not have any step details. The behavior is inconsistent.

It happens with both 0.3.0 and 0.3.2.

Attached are the files for your reference.
Json Results.zip


gauge -v
Gauge version: 1.0.4
Commit Hash: 3a9a647

Plugins

dotnet (0.1.2.nightly-2019-02-14)
html-report (4.0.7)
json-report (0.3.2)
screenshot (0.0.1)
xml-report (0.2.1)

ProtoScenario.preHookMessages not visible in Json-Report

Hello,

As a report analyst, I want to see the preHookMessages of the scenario in the json report.
In the html-report, the preHookMessages of all stages are visible, e.g. here for the scenario stage.

Currently only messages of the step stage are shown in the json report.

Context array details are overwritten by teardown data

The teardown array object in the JSON is always empty. Additionally, the context array is getting overwritten by the teardown array data.

Steps to reproduce-

Create a spec file as below-
Specification Heading

Setup of contexts

  • Context - setup

Scenario Heading

  • Step1
  • Step2

Teardown of contexts

  • Context - teardown

Create context and Fixture classes as below-
In Fixture.java, we have
@com.thoughtworks.gauge.Step("Step1")
public void step1() throws Exception {
}

@com.thoughtworks.gauge.Step("Step2")
public void step2() throws Exception {
}

In Context.java, we can have
@com.thoughtworks.gauge.Step("Context - setup")
public void setUp() throws Exception {
}

@com.thoughtworks.gauge.Step("Context - teardown")
public void tearDown()throws Exception {
}

Now try throwing an exception from context or teardown and observe

Expected Results:
teardown failures should be rendered to the 'teardown' array of the JSON

Actual Result:

  • teardown array of JSON is always blank
  • context array is overwritten by teardown array data

Gauge version: 0.8.1

Plugins

html-report (3.1.0)
java (0.6.2.nightly-2017-03-22)
json-report (0.2.0)

0.3.1 doesn't ship the right binary for linux

Hi there 👋

It seems that the binary of json-report@0.3.0 has mistakenly been released as json-report@0.3.1 too.

How to reproduce:

wget https://github.com/getgauge-contrib/json-report/releases/download/v0.3.1/json-report-0.3.1-linux.x86_64.zip
unzip -d 0.3.1 json-report-0.3.1-linux.x86_64.zip

wget https://github.com/getgauge-contrib/json-report/releases/download/0.3.0/json-report-0.3.0-linux.x86_64.zip
unzip -d 0.3.0 json-report-0.3.0-linux.x86_64.zip

sha1sum 0.3.1/bin/json-report 0.3.0/bin/json-report

Actual result:

f74c6f148458e12f806c2f0fe26cce6d46876143  0.3.1/bin/json-report
f74c6f148458e12f806c2f0fe26cce6d46876143  0.3.0/bin/json-report

Expected result:

As the execution statuses have changed to past tense, I am expecting the binaries to have a different hash.

There should be the total number of passed/failed/skipped scenario counts in the json-report

Expected behavior

The total number of passed/failed/skipped scenario counts should be inside the json-report

Actual behavior

The total number of passed/failed/skipped scenario counts are not in the json-report

Steps to reproduce

  1. Run a few spec files.
  2. Go to the json-report.
  3. There are passed/failed/skipped scenario counts only spec-wise. There is no place to find the total passed/failed/skipped scenario counts.

Note: For Quality Engineers, the passed/failed/skipped scenario counts are a very important factor in measuring the health of a given project, so it would be very helpful if you could provide this feature.

Gauge version

Gauge version: 0.9.8
Commit Hash: c23df9f

Plugins
-------
csharp (0.10.3)
flash (0.0.1)
html-report (4.0.4)
java (0.6.7)
json-report (0.2.1)
ruby (0.5.0)
screenshot (0.0.1)
spectacle (0.1.2)
xml-report (0.2.0)

Alpine: Error starting plugin JSON Report 0.3.7. fork/exec bin/json-report: no such file or directory

Hi all

I am trying to make a small change to the JSON Report plugin, so I have downloaded this repo and done my changes.
This builds and runs fine locally, however I have a problem when running my tests in a Docker container.
I get the error mentioned in the title:
Error starting plugin JSON Report 0.3.7. fork/exec bin/json-report: no such file or directory.

The thing is, this does not happen if I download the linux.x86_64 release files from here and install them directly from the .zip.
If I build from source the error occurs even if I have not done any changes to the code.

Any ideas as to what might be wrong?

I'm building with go version go1.20.3 linux/amd64 on Debian and am trying to run the plugin in an Alpine container.
It works in another Debian container, but I am trying to make my Gauge image as small as possible ...
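One thing worth checking (an assumption, not a confirmed fix for this plugin): a Go binary built with cgo enabled is dynamically linked against glibc, which musl-based Alpine does not ship, and a missing dynamic loader surfaces as exactly this `fork/exec ...: no such file or directory` error. Building with cgo disabled produces a statically linked binary:

```shell
# Assumption: build/make.go shells out to `go build`, which honors the
# CGO_ENABLED environment variable. With cgo disabled the resulting
# binary is statically linked and runs on musl-based images like Alpine.
CGO_ENABLED=0 go run build/make.go

# Optional sanity check on the built binary (path is illustrative):
# file bin/json-report   # should report "statically linked"
```

This would also explain why the published release zip works: release binaries are typically built without cgo.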
