logzio / logz-docs
Logz.io user documentation
Home Page: https://docs.logz.io
The problematic page:
The default 'winlogbeat.yml' from Elastic is a verbose and lengthy template. Often, not everything is commented out, and simply adding the configuration from steps 2 & 3 conflicts with settings hiding somewhere in the noise of the template.
In multiple cases, I've seen index failures in Logz.io because a conflicting 'processor' in the configuration prevents the 'agent' field from being renamed to 'beat_agent'. This conflict usually comes from the default values in Elastic's 'winlogbeat.yml'.
Given that the complete configuration for shipping with winlogbeat to Logz.io is very brief, maybe providing a complete example of an entire winlogbeat.yml file along with the suggestion to replace the default one entirely could remedy this issue before it occurs.
I usually send the following example configuration to people who have this issue and recommend they replace the default file with it, which has resolved the problem every time: https://github.com/zdhamilton/logzio-sample-configs/blob/master/logzio-winlogbeat/winlogbeat.yml
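For reference, a complete replacement file is brief. The sketch below is illustrative only (the event-log list, listener host and port, certificate path, and the rename processor are modeled on the linked sample and are assumptions, not the official config):

```yaml
# Hypothetical minimal winlogbeat.yml for shipping to Logz.io.
# Verify the listener host, port, and SSL details against the official docs.
winlogbeat.event_logs:
  - name: Application
  - name: Security
  - name: System

fields:
  logzio_codec: json
  token: <<LOG-SHIPPING-TOKEN>>
  type: wineventlog
fields_under_root: true

processors:
  # Logz.io expects the Beats 'agent' field under 'beat_agent';
  # a stray processor elsewhere in the template can break this rename.
  - rename:
      fields:
        - from: "agent"
          to: "beat_agent"
      ignore_missing: true

output.logstash:
  hosts: ["listener.logz.io:5015"]
  ssl:
    certificate_authorities: ['<path-to-logzio-cert>']
```

Replacing the whole file with something this small avoids conflicts with the template's commented and half-commented defaults.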
The problematic page: https://github.com/logzio/logz-docs/blob/master/_source/api/logzio-public-api.yml#L3955
The API spec defined for the TimeBasedAccountCreateRequest refers to an "isFlexible" field in the description for the "reservedDailyGB" field. This "isFlexible" field is not itself in the API spec's properties.
Add an "isFlexible" field under the properties
field of TimeBasedAccountCreateRequest.
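A sketch of what the added property might look like (the boolean type and the description wording are assumptions inferred from the reservedDailyGB description that references it):

```yaml
TimeBasedAccountCreateRequest:
  properties:
    # ...existing properties...
    isFlexible:
      type: boolean          # assumed type
      description: >
        (Assumed wording) Whether the account's reserved daily volume
        is flexible, as referenced by the reservedDailyGB description.
```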
The problematic page:
https://docs.logz.io/shipping/shippers/logstash.html
The certificate referenced on this page is incorrect. It should be the certificate found in "Step 3" of the following page: https://app.logz.io/#/dashboard/data-sources/logstash-overssl
Please update the cert in the documentation to match the cert that is referenced in the "logstash over ssl" section of "Ship Your Data" in Logz.io
The problematic page: https://docs.logz.io/shipping/log-sources/hashicorp-vault.html
In the very first section, you recommend setting "log_raw" to true so that Filebeat can read the file. This is neither advisable nor accurate. If you let log_raw default to false, secret values are replaced by a string containing an HMAC-SHA256 hash of that secret value. If users do set this to true, they are logging all secret values in plaintext.
Here is an example against a local dev cluster, which you should be able to replicate:
# you can run the server in another window or just background it
$ vault server -dev -dev-root-token-id=dev &
$ export VAULT_ADDR=http://localhost:8200
$ export VAULT_TOKEN=dev
$ vault audit list
No audit devices are enabled.
$ vault audit enable file file_path=/tmp/logzio.example log_raw=true
Success! Enabled the file audit device at: file/
$ vault write secret/test key="A SECRET VALUE THAT SHOULD NOT BE LOGGED IN RAW"
Success! Data written to: secret/test
$ cat /tmp/logzio.example
{"time":"2020-05-26T18:19:13.991984Z","type":"response","auth":{"client_token":"dev","accessor":"ReoWnERzrXZDShBfghKc2G10","display_name":"token","policies":["root"],"token_policies":["root"],"token_type":"service"},"request":{"id":"546cf1ae-4b9a-b2c2-2675-803f942cf3c1","operation":"update","client_token":"dev","client_token_accessor":"ReoWnERzrXZDShBfghKc2G10","namespace":{"id":"root"},"path":"sys/audit/file","data":{"description":"","local":false,"options":{"file_path":"/tmp/logzio.example","log_raw":"true"},"type":"file"},"remote_address":"127.0.0.1"},"response":{}}
{"time":"2020-05-26T18:19:39.24518Z","type":"request","auth":{"client_token":"dev","accessor":"ReoWnERzrXZDShBfghKc2G10","display_name":"token","policies":["root"],"token_policies":["root"],"token_type":"service"},"request":{"id":"0df16e21-4e22-339b-2db2-a67b1b7345dd","operation":"update","client_token":"dev","client_token_accessor":"ReoWnERzrXZDShBfghKc2G10","namespace":{"id":"root"},"path":"secret/test","data":{"key":"A SECRET VALUE THAT SHOULD NOT BE LOGGED IN RAW"},"remote_address":"127.0.0.1"}}
{"time":"2020-05-26T18:19:39.245431Z","type":"response","auth":{"client_token":"dev","accessor":"ReoWnERzrXZDShBfghKc2G10","display_name":"token","policies":["root"],"token_policies":["root"],"token_type":"service"},"request":{"id":"0df16e21-4e22-339b-2db2-a67b1b7345dd","operation":"update","client_token":"dev","client_token_accessor":"ReoWnERzrXZDShBfghKc2G10","namespace":{"id":"root"},"path":"secret/test","data":{"key":"A SECRET VALUE THAT SHOULD NOT BE LOGGED IN RAW"},"remote_address":"127.0.0.1"},"response":{}}
Notice that the secret I wrote, "A SECRET VALUE THAT SHOULD NOT BE LOGGED IN RAW", is logged directly to the log file. Similarly, my dev root token, "dev", is logged in plaintext in the "auth.client_token" field.
I believe this is just a misunderstanding of how the audit backend works in Vault, so I would remove setting log_raw to true by default in the documentation. Your example will still be valid and functional, and your users will not be logging sensitive data directly to a file nor transmitting it to your servers. For context, to make use of your audit log in Vault, you should use /sys/audit-hash to generate the hash for a plaintext value, then search your audit log for that hash.
At the very least, you should add a disclaimer that sensitive data may be transmitted in plaintext if log_raw is set to true, but this remains a less secure default.
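To illustrate what the secure default does, the hashed form Vault writes when log_raw stays false can be sketched as an HMAC-SHA256 over the secret. This is a simplification: Vault manages a per-audit-device salt internally, so the salt here is purely hypothetical.

```python
import hashlib
import hmac

def vault_style_hash(salt: bytes, value: str) -> str:
    """Approximate the hmac-sha256:<hex> string Vault logs in place of raw secrets."""
    digest = hmac.new(salt, value.encode("utf-8"), hashlib.sha256).hexdigest()
    return f"hmac-sha256:{digest}"

# Hypothetical salt; Vault derives the real one per audit device.
hashed = vault_style_hash(b"example-salt", "A SECRET VALUE THAT SHOULD NOT BE LOGGED IN RAW")
print(hashed[:12])  # → hmac-sha256:
```

With the default in place, the audit log contains only strings of this shape, and /sys/audit-hash lets you compute the matching hash for a known plaintext when you need to search the log.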
The problematic page: https://docs.logz.io/user-guide/accounts/account-region.html
Text currently reads:
"To find your region, sign in to Logz.io and look at the URL in the address bar. If you see app.logz.io, then your account is in the . All other regions have a hypen and then a two-letter region code. For example, if you see app-eu.logz.io, then your account is in the ."
Looks like some CSS broke somewhere: the region names are missing.
The problematic page:
This image of how to find your region code here is wrong.
Image link: https://dytvr9ot2sszz.cloudfront.net/logz-docs/distributed-tracing/general-settings1.png
The region code should be taken from here:
https://docs.logz.io/user-guide/accounts/account-region.html#available-regions
Issue resolved in #1215
The problematic page: https://docs.logz.io/shipping/log-sources/stackdriver.html#build-your-pubsub-input-yaml-file
Link is broken:
https://www.elastic.co/guide/en/beats/filebeat/master/filebeat-input-google-pubsub.html#filebeat-input-google-pubsub
Update the link to point to the correct location.
When we add Filebeat Windows instructions, add information on permissions:
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
(requested by Matt)
URL:
https://docs.logz.io/api/index.html#operation/createUser
https://docs.logz.io/api/index.html#operation/updateUser
There's an error with the roles field description.
Role 2 is for user access and role 3 is for admin access.
The problematic page: https://docs.logz.io/user-guide/distributed-tracing/k8s-deployment
Errors when applying the configuration
The apiVersion extensions/v1beta1 needs to be changed to apps/v1
Second, a selector block needs to be added under spec in the Deployment config, e.g.:
selector:
  matchLabels:
    app: jaeger
    app.kubernetes.io/name: jaeger
    app.kubernetes.io/component: agent
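Putting both fixes together, the top of the Deployment manifest would look roughly like this (the metadata name and anything beyond the selector labels are assumptions; keep whatever the manifest already uses):

```yaml
apiVersion: apps/v1          # was: extensions/v1beta1
kind: Deployment
metadata:
  name: jaeger-agent         # hypothetical; keep the manifest's own name
spec:
  selector:                  # required field in apps/v1
    matchLabels:
      app: jaeger
      app.kubernetes.io/name: jaeger
      app.kubernetes.io/component: agent
  template:
    metadata:
      labels:                # must match the selector
        app: jaeger
        app.kubernetes.io/name: jaeger
        app.kubernetes.io/component: agent
    # ...pod spec unchanged...
```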
The problematic page:
https://docs.logz.io/shipping/log-sources/docker.html#run-the-docker-image
Not enough information for properly running the Docker container. The required param LOGZIO_URL is missing.
The docker run script is missing the LOGZIO_URL variable (... --env LOGZIO_URL="<LISTENER-URL>:5015" ...), according to the logzio/docker-collector-logs page on Docker Hub.
Before fix:
docker run --name docker-collector-logs \
--env LOGZIO_TOKEN="<<LOG-SHIPPING-TOKEN>>" \
-v /var/run/docker.sock:/var/run/docker.sock:ro \
-v /var/lib/docker/containers:/var/lib/docker/containers \
logzio/docker-collector-logs
Add --env LOGZIO_URL="<LISTENER-URL>:5015"
After fix:
docker run --name docker-collector-logs \
--env LOGZIO_TOKEN="<ACCOUNT-TOKEN>" \
--env LOGZIO_URL="<LISTENER-URL>:5015" \
-v /var/run/docker.sock:/var/run/docker.sock:ro \
-v /var/lib/docker/containers:/var/lib/docker/containers \
logzio/docker-collector-logs
The Slack button is on most of the pages.
Whenever I try to use the Slack button, I have to fill out an "I am not a robot!" challenge, which appears in a small box that I cannot fill out correctly. It looks like:
My browser is Firefox 80.0.1.
DOMPurify is a DOM-only, super-fast, uber-tolerant XSS sanitizer for HTML, MathML and SVG. It's written in JavaScript and works in all modern browsers (Safari, Opera (15+), Internet Explorer (10+), Firefox and Chrome - as well as almost anything else using Blink or WebKit).
Library home page: https://registry.npmjs.org/dompurify/-/dompurify-1.0.10.tgz
Path to dependency file: /tmp/ws-scm/logz-docs/package.json
Path to vulnerable library: /tmp/ws-scm/logz-docs/node_modules/dompurify/package.json
Dependency Hierarchy:
Found in HEAD commit: fb7bb0f50f982132eca8f7a88b515f9578dea8e0
DOMPurify before 2.0.1 allows XSS because of innerHTML mutation XSS (mXSS) for an SVG element or a MATH element, as demonstrated by Chrome and Safari.
Publish Date: 2019-09-24
URL: CVE-2019-16728
Base Score Metrics:
Type: Upgrade version
Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16728
Release Date: 2019-09-24
Fix Resolution: 2.0.1
The problematic page: https://docs.logz.io/shipping/log-sources/stackdriver.html
It's unclear if this Docker container, logzio/logzio-pubsub, has to run forever or if it's a one-off task.
In the docs it says:
"You can use Google Cloud Pub/Sub to forward your logs from Stackdriver to Logz.io."
Will Pub/Sub forward to logz.io directly or is the docker container involved in that?
Describe what the docker container is for.
A markdown parser built for speed
Library home page: https://registry.npmjs.org/marked/-/marked-0.6.2.tgz
Path to dependency file: /tmp/ws-scm/logz-docs/package.json
Path to vulnerable library: /tmp/ws-scm/logz-docs/node_modules/marked/package.json
Dependency Hierarchy:
Found in HEAD commit: fb7bb0f50f982132eca8f7a88b515f9578dea8e0
marked before 0.7.0 is vulnerable to a ReDoS attack via the _label subrule, which may significantly degrade parsing performance of malformed input.
Publish Date: 2019-09-05
URL: WS-2019-0209
Type: Upgrade version
Origin: https://www.npmjs.com/advisories/1076
Release Date: 2019-09-05
Fix Resolution: 0.7.0
The problematic page: docs.logz.io/api
Today, we say "Account administrator's email address" - this is incomplete.
Update to include this information:
Hi there, basically I just tried out all the code samples on nodejs
and none of them worked. I tried both http
and https
but no luck.
I just copied and pasted sample codes from the official docs, (of course with my personal token and etc):
https://docs.logz.io/shipping/log-sources/nodejs.html
https://www.npmjs.com/package/winston-logzio
however, it never works. I have been trying to debug this for over 2 weeks. I can even provide my token if you need it. I checked the config line by line, but without a working sample I really had no idea what was wrong.
Can you please help me out here? I think Logz.io is a great product and I'd like to give it a try, but with none of the code working, I'm on the edge of giving up and moving to a similar product. It's really frustrating.
Here is my simple code (copied directly from docs above):
const winston = require('winston');
const LogzioWinstonTransport = require('winston-logzio');
const logzioWinstonTransport = new LogzioWinstonTransport({
level: 'info',
name: 'winston_logzio',
token: '************************',
host: 'app.logz.io',
});
const logger = winston.createLogger({
format: winston.format.simple(),
transports: [logzioWinstonTransport],
});
// setInterval(() => {
// logger.log('warn', 'Just a test message');
// console.log(11);
// }, 1000);
var logger2 = require('logzio-nodejs').createLogger({
token: '************************',
protocol: 'https',
host: 'app.logz.io',
port: '8071',
type: 'YourLogType'
});
setInterval(() => {
logger2.log('This is a log message');
console.log(11);
}, 1000);
Here is the error message:
logzio-logger error: Error: Failed after 3 retries on error = RequestError: Error: connect ETIMEDOUT 104.124.1.33:8071 Error: Failed after 3 retries on error = RequestError: Error: connect ETIMEDOUT 104.124.1.33:8071
at /Users/xxxxx/Sites/xxx/xxx-library/node_modules/logzio-nodejs/lib/logzio-nodejs.js:294:46
at tryCatcher (/Users/xxxxx/Sites/xxx/xxx-library/node_modules/bluebird/js/release/util.js:16:23)
at Promise._settlePromiseFromHandler (/Users/xxxxx/Sites/xxx/xxx-library/node_modules/bluebird/js/release/promise.js:547:31)
at Promise._settlePromise (/Users/xxxxx/Sites/xxx/xxx-library/node_modules/bluebird/js/release/promise.js:604:18)
at Promise._settlePromise0 (/Users/xxxxx/Sites/xxx/xxx-library/node_modules/bluebird/js/release/promise.js:649:10)
at Promise._settlePromises (/Users/xxxxx/Sites/xxx/xxx-library/node_modules/bluebird/js/release/promise.js:725:18)
at _drainQueueStep (/Users/xxxxx/Sites/xxx/xxx-library/node_modules/bluebird/js/release/async.js:93:12)
at _drainQueue (/Users/xxxxx/Sites/xxx/xxx-library/node_modules/bluebird/js/release/async.js:86:9)
at Async._drainQueues (/Users/xxxxx/Sites/xxx/xxx-library/node_modules/bluebird/js/release/async.js:102:5)
at Immediate.Async.drainQueues [as _onImmediate] (/Users/xxxxx/Sites/xxx/xxx-library/node_modules/bluebird/js/release/async.js:15:14)
at proces
Can you please tell me what is wrong? Also, if possible, I'd like to see a working real-world sample, as I didn't find any resources about it.
node -v
: v14.8.0
"express": "^4.17.1",
"logzio-nodejs": "^2.0.2",
"winston": "^3.3.3",
"winston-logzio": "^5.1.2"
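For what it's worth, a connect ETIMEDOUT on port 8071 usually means the shipper is pointed at the web-app host rather than a listener. A hedged sketch of the likely fix (listener.logz.io is the US listener host; other regions differ, so check your account's region):

```javascript
// Sketch, not an official fix: logzio-nodejs ships to a *listener* host;
// app.logz.io is the web UI, which would explain the ETIMEDOUT above.
const options = {
  token: '************************', // your shipping token
  protocol: 'https',
  host: 'listener.logz.io', // was 'app.logz.io' in the failing config
  port: '8071',
  type: 'YourLogType',
};
console.log(options.host); // prints "listener.logz.io"
```

The same host swap would apply to the winston-logzio transport config shown earlier.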
Thank you so much.
We are using your enterprise account. When we share the snapshot as a public long URL, it works. However, when we use a public short URL, the dashboard permalink generated with "Saved as Snapshot" or "Saved as Object" gives a 502 - Bad Gateway.
Only for the account's enterprise users, who have SSO login, does the public short URL work.
Image will be updated in #1211
The problematic page: https://docs.logz.io/user-guide/lookups/
In the following image:
https://dytvr9ot2sszz.cloudfront.net/logz-docs/siem-lookups/add-record-lookup.png
The name of the table is "IP ranges table". But the actual values are specific IPs, not ranges. IP ranges are a specific format - it looks like this "192.168.0.0 – 192.168.255.255", or also in CIDR format: 192.168.0.0/16
This is confusing as it sort of hints that lookup lists can match IPs in an IP range. And as far as I can tell - they can't.
The problematic page:
JavaScript library for DOM operations
Library home page: https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js
Path to vulnerable library: /logz-docs/_source/js/jquery-3.3.1.min.js
Dependency Hierarchy:
Found in HEAD commit: 216bfec304dd1b51e96e55cb9d31be55a01b656b
jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype.
Publish Date: 2019-04-20
URL: CVE-2019-11358
Base Score Metrics:
Type: Upgrade version
Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358
Release Date: 2019-04-20
Fix Resolution: 3.4.0
The problematic page:
Would it make sense to have documentation on the different actions in the account an admin user can do versus a regular user? Apart from tagging features with "admin"
The problematic page:
Docs page: https://docs.logz.io/shipping/log-sources/guardduty.html
In the table with the required environment variables, for "URL" it gives you the list of Logz.io listeners by region via a hyperlink. The formatted string in the Kinesis Lambda's Python is: "logzio_url = "{0}/?token={1}".format(os.environ['URL'], os.environ['TOKEN'])". If someone provides only "listener.logz.io" (for example), as it is displayed in the list of listeners by region, for the URL environment variable, the endpoint for the Lambda will be missing port 8071.
Python Lambda: https://github.com/logzio/logzio_aws_serverless/blob/master/python3/kinesis/src/lambda_function.py
Adding something that makes it clear the URL variable must be the listener URL plus port 8071, such as "find your listener URL from the following hyperlink, and then format it as <LISTENER_URL>:8071", would hopefully keep the logzio_url variable in the Python script from failing.
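To illustrate (the environment values below are hypothetical), the Lambda's format string only produces a working endpoint when the port is included in URL:

```python
import os

# Hypothetical values; the second form mirrors the failure described above.
os.environ["TOKEN"] = "my-token"

os.environ["URL"] = "https://listener.logz.io:8071"  # listener URL with port
with_port = "{0}/?token={1}".format(os.environ["URL"], os.environ["TOKEN"])

os.environ["URL"] = "https://listener.logz.io"       # port omitted
without_port = "{0}/?token={1}".format(os.environ["URL"], os.environ["TOKEN"])

print(with_port)                # → https://listener.logz.io:8071/?token=my-token
print(":8071" in without_port)  # → False
```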
testing slack integration
The problematic page:
Not sure if this is necessary, but along with "add_field { token => ... }", should the docs also include "type => ..."? I don't see type anywhere. Not sure if type is necessary here, but I'm not sure how else we would set it.
Change:
add_field { token => <<SHIPPING_TOKEN>> }
to
add_field {
  token => <<SHIPPING_TOKEN>>
  type => <<LOG_TYPE>>
}
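For context, in Logstash's actual config syntax the suggested change would look roughly like this (the surrounding mutate filter is an assumption about how the docs' snippet is embedded in the pipeline):

```
filter {
  mutate {
    # Hypothetical sketch of the suggested docs change
    add_field => {
      "token" => "<<SHIPPING_TOKEN>>"
      "type"  => "<<LOG_TYPE>>"
    }
  }
}
```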
Hello
We are using your enterprise account. When we share the public link to our Logz dashboards, stakeholders are not able to search data using queries with the shared link. Not sure if this is the way it's configured. Could you help us with this?
Regards,
Aniruddha
The problematic page:
https://docs.logz.io/user-guide/integrations/custom-endpoints.html
The sentence "The Logz.io IP range is used by all customers and should be treated with caution." appears twice.
Fixed in PR #711: https://deploy-preview-711--logz-docs.netlify.app/user-guide/integrations/custom-endpoints.html
The problematic page:
In the API doc, we show email as part of response to https://docs.logz.io/api/#tag/Manage-sub-accounts
Actually, we return null.
Change the email field in the response from a string to null.
parse argument options
Library home page: https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz
Path to dependency file: /tmp/ws-scm/logz-docs/package.json
Path to vulnerable library: /tmp/ws-scm/logz-docs/node_modules/minimist/package.json
Dependency Hierarchy:
Found in HEAD commit: 3dbcb99f96f16bfbb3d428ec00eeca5eab585b8b
minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload.
Publish Date: 2020-03-11
URL: CVE-2020-7598
Base Score Metrics:
Type: Upgrade version
Origin: https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94
Release Date: 2020-03-11
Fix Resolution: minimist - 0.2.1,1.2.2
The problematic page:
https://docs.logz.io/user-guide/archive-and-restore/
Current text:
Each account (or sub-account) should archive to a separate S3 bucket.
If the restore process exceeds the max, the process will fail.
The max data to restore is equivalent to your account’s daily reserved volume, and no more than 100 GB.
Restore processes are capped at 100 GB. This maximum applies to accounts with a daily reserved volume greater than 100 GB.
The text here seems confusing. I am honestly not sure what the limitations here are.
logz-docs/_source/api/logzio-public-api.yml
Line 731 in e6a5b46
The problematic page: https://docs.logz.io/user-guide/accounts/account-region.html
Current text says: "You can find your account’s region by selecting > Settings > General from the top menu."
But no such information is available.
see:
logz-docs/_source/api/logzio-public-api.yml
Lines 704 to 731 in 622a5df
should contain a payload of:
{
  "query": {
    "bool": {
      "must": [{
        "range": {
          "@timestamp": {
            "gte": "now-5m",
            "lte": "now"
          }
        }
      }]
    }
  },
  "from": 10,
  "size": 50,
  "sort": [{}],
  "_source": {
    "includes": "message"
  },
  "post_filter": {},
  "docvalue_fields": {},
  "version": {},
  "store": null,
  "highlight": {},
  "aggregations": {
    "byType": {
      "terms": {
        "field": "type",
        "size": 5
      }
    }
  }
}
Make a Docker container for easier local testing
Nomenclature!
Is the SIEM log data coming from Microsoft 365 or Office 365? There is a difference.
If the integration is valid for both, the logo text and the topic should clarify this.
Also, use of the abbreviation "M365" is not Microsoft standard, and may be ambiguous. Should be corrected throughout the topic.
The problematic page: Ship logs from Microsoft Office 365
The problematic page:
Given what we have right now, we don't have people run "install-service-winlogbeat.ps1", which comes with winlogbeat when you download it. Since we're telling people to run it as a service from the command prompt, we should have them register the executable as a service first. If they don't, "Restart-Service winlogbeat" or "Start-Service winlogbeat" won't work.
An additional caveat: on some machines, it won't allow you to run "install-service-winlogbeat.ps1" because it is unsigned. The command "PowerShell.exe -ExecutionPolicy UnRestricted -File .\install-service-winlogbeat.ps1" has helped me bypass this in the past.
I would just add those as gotchas or potential steps in the docs.
The problematic page: https://docs.logz.io/user-guide/accounts/account-region.html#available-regions
There is no region code for us-east. Is that intentional?
Add the region code to the table, or explain there why it doesn't exist.
The problematic page:
https://docs.logz.io/shipping/log-sources/java.html#logback-config
Incorrect Maven version: 1.0.25
Correct Maven version: v1.0.25
https://mvnrepository.com/artifact/io.logz.logback/logzio-logback-appender/v1.0.25
The problematic page: https://docs.logz.io/shipping/log-sources/kubernetes.html
Issue is described best here: https://kubernetes.io/blog/2019/07/18/api-deprecations-in-1-16/
Ran into this when trying to ship Kubernetes logs on the newest version. I had to make a couple of small changes to the logzio-daemonset.yml file to ship successfully: change
apiVersion: extensions/v1beta1
to apiVersion: apps/v1
and add the following object to spec:
selector:
  matchLabels:
    k8s-app: fluentd-logzio
    version: v1
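Combined, the top of logzio-daemonset.yml would look roughly like this (the metadata values are assumptions; keep whatever the shipped manifest uses):

```yaml
apiVersion: apps/v1          # was: extensions/v1beta1
kind: DaemonSet
metadata:
  name: fluentd-logzio       # hypothetical; keep the original name
spec:
  selector:                  # required field in apps/v1
    matchLabels:
      k8s-app: fluentd-logzio
      version: v1
  template:
    metadata:
      labels:                # must match the selector
        k8s-app: fluentd-logzio
        version: v1
    # ...pod spec unchanged...
```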
Get the native type of a value.
Library home page: https://registry.npmjs.org/kind-of/-/kind-of-4.0.0.tgz
Path to dependency file: /tmp/ws-scm/logz-docs/package.json
Path to vulnerable library: /tmp/ws-scm/logz-docs/node_modules/has-values/node_modules/kind-of/package.json
Dependency Hierarchy:
Get the native type of a value.
Library home page: https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz
Path to dependency file: /tmp/ws-scm/logz-docs/package.json
Path to vulnerable library: /tmp/ws-scm/logz-docs/node_modules/kind-of/package.json
Dependency Hierarchy:
Get the native type of a value.
Library home page: https://registry.npmjs.org/kind-of/-/kind-of-3.2.2.tgz
Path to dependency file: /tmp/ws-scm/logz-docs/package.json
Path to vulnerable library: /tmp/ws-scm/logz-docs/node_modules/snapdragon-util/node_modules/kind-of/package.json
Dependency Hierarchy:
Get the native type of a value.
Library home page: https://registry.npmjs.org/kind-of/-/kind-of-5.1.0.tgz
Path to dependency file: /tmp/ws-scm/logz-docs/package.json
Path to vulnerable library: /tmp/ws-scm/logz-docs/node_modules/default-compare/node_modules/kind-of/package.json
Dependency Hierarchy:
ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.
Publish Date: 2019-12-30
URL: CVE-2019-20149
Resolved in PR #1220
The problematic page:
https://docs.logz.io/user-guide/integrations/custom-endpoints.html
Custom notifications are sent from Logz.io servers, with the actual request originating from one of our IP addresses. Secure integrations must open their firewalls to allow traffic from these IPs. The list of IPs should be available (or linked) from this page.
Add the list of source IP addresses, or link to where they can be found.
S3 Archiving page is out of date
The problematic page:
The above approach shows how to set this up with access key/secret key. Our platform recommends using roles now.
https://docs.logz.io/user-guide/give-aws-access-with-iam-roles/
This page is unclear. Step 1 "Copy Logz.io details" needs an 'Account ID'. Presumably this is Logz.io's AWS Account ID which I need to paste into the role creation in my own AWS account in step 2 ("Another AWS Account" -> "Account ID"). But where do I get the Logz.io AWS Account ID from? I can't find it anywhere.
Explain where to get the Logz.io AWS Account ID, or, if that's not what is required, which piece of information is required.
The problematic page:
https://github.com/logzio/logz-docs/blob/master/_source/api/logzio-public-api.yml#L364
https://github.com/logzio/logz-docs/blob/master/_source/api/logzio-public-api.yml#L3236
https://github.com/logzio/logz-docs/blob/master/_source/api/logzio-public-api.yml#L448
I am attempting to generate a golang client of your API, using https://github.com/go-swagger/go-swagger v0.27.0.
When running the command
$ swagger generate client -f ./logzio_api.yml --template=stratoscale
I get the following spec validation errors, and am unable to generate a golang client.
(The 'stratoscale' template style is being used so that the generated client returns errors instead of panicking, and it allows for easier generation of mocks because of the way that endpoints are encoded.)
The swagger spec at "...../logzio_api.yml" is invalid against swagger specification 2.0. see errors :
- "paths./v2/security/rules/search.post.parameters" must validate one and only one schema (oneOf). Found none valid
- paths./v2/security/rules/search.post.parameters.schema.oneOf in body is a forbidden property
- "paths./v1/alerts/triggered-alerts.post.parameters" must validate one and only one schema (oneOf). Found none valid
- paths./v1/alerts/triggered-alerts.post.parameters.in in body should be one of [header]
- paths./v1/alerts/triggered-alerts.post.parameters.in in body should be one of [body]
- "paths./v1/account-management/time-based-accounts/{id}.put.parameters" must validate one and only one schema (oneOf). Found none valid
- paths./v1/account-management/time-based-accounts/{id}.put.parameters.schema.oneOf in body is a forbidden property
- "paths./v1/account-management/time-based-accounts.post.parameters" must validate one and only one schema (oneOf). Found none valid
- paths./v1/account-management/time-based-accounts.post.parameters.schema.oneOf in body is a forbidden property
The issue seems to stem from the use of the
parameters:
  - in: body
    name: foo
    schema:
      oneOf:
        - $ref: '#/definitions/A'
        - $ref: '#/definitions/B'
schema style, where 'oneOf' is not valid in the OpenAPI 2.0 spec.
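One Swagger-2.0-compatible workaround (a sketch; 'AOrB' is a placeholder definition name) is to collapse the alternatives into a single body schema, since Swagger 2.0 schema objects support allOf but not oneOf or anyOf:

```yaml
parameters:
  - in: body
    name: foo
    schema:
      # 'AOrB' is hypothetical: a single definition covering both shapes,
      # with the mutually exclusive fields documented in its description.
      $ref: '#/definitions/AOrB'
```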
The problematic page: https://docs.logz.io/shipping/log-sources/jenkins.html
The listener URL is missing the port number.
Add the port number after the listener URL.
The problematic page:
The problematic page:
We're missing the documentation for setting this up with an IAM role configuration.
We should add it since I believe it's the recommended approach now.
testing integration