yokawasa / azure-log-analytics-data-collector
Azure Log Analytics Data Collector API Client libraries
License: MIT License
This is needed to use Resource-Based Permissions.
Without it, it is not possible to restrict custom log access.
If implemented, it is also needed in https://github.com/yokawasa/fluent-plugin-azure-loganalytics
`res` can be nil here, leading to a scenario where the following is logged from the logstash-output-azure_loganalytics plugin when `Azure::Loganalytics::Datacollectorapi::Client.is_success(res)` is called and `is_success` attempts to evaluate `res.code`:
[ERROR][logstash.outputs.azureloganalytics][pipeline-common][pipeline_common_output_azure_log_analytics_ingest] Exception occured in posting to DataCollector API as log type ExampleLogType: 'undefined method `code' for nil:NilClass', data=>[{"original_timestamp":"2021-04-15T10:36:21.315Z","event_uuid":"ede25ed2-0d3e-4898-8c5d-3...
There appears to be a related issue logged at rest-client/rest-client#655
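A defensive guard would treat a missing response as a failure instead of dereferencing it. A minimal sketch in Python for illustration (the library itself is Ruby; the response object here is a stand-in with the same `code` attribute):

```python
from collections import namedtuple

# Minimal stand-in for an HTTP response; the real object comes from
# rest-client and exposes the status via `code` the same way.
Response = namedtuple("Response", ["code"])

def is_success(res):
    # Guard first: the HTTP layer can hand back no response at all when
    # the request itself fails, so never touch res.code unconditionally.
    if res is None:
        return False
    return 200 <= res.code < 300
```

With this guard, a nil/None response is reported as an ordinary failure rather than crashing the output plugin with `undefined method 'code'`.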
Exception reported from @jahidakhtargit
I'm getting the exception below while creating the signature:
string argument without an encoding
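For reference, the Data Collector API's documented authorization signature is an HMAC-SHA256 over a canonical string, keyed with the Base64-decoded shared key. Working with explicit byte strings at every step sidesteps "string without an encoding" ambiguity. A Python sketch (the failing code above is Ruby, so this is illustrative only; the canonical string format follows the public API docs):

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id, shared_key, rfc1123_date, content_length):
    # Canonical string defined by the Data Collector API documentation.
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{rfc1123_date}\n/api/logs"
    )
    # Decode the shared key to raw bytes and sign explicit UTF-8 bytes,
    # so no implicit string encoding is ever involved.
    key_bytes = base64.b64decode(shared_key)
    digest = hmac.new(key_bytes, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode("ascii")
    return f"SharedKey {workspace_id}:{signature}"
```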
As said here, the Log Analytics Data Collector API's response code has changed: it is no longer 202 but 200 for an accepted response. So the library's helper method `is_success` also needs to be modified.
MicrosoftDocs/azure-docs@1fbe1d2#diff-028a3692a6defd9d1b4b700d33903102
moderate severity
Vulnerable versions: <= 12.3.2
Patched version: 12.3.3
There is an OS command injection vulnerability in Ruby Rake before 12.3.3, in Rake::FileList, when supplying a filename that begins with the pipe character `|`.
I am able to write log data into Azure Log Analytics, but this repo only has a post method; there is no read method.
According to the Data Collector API documentation, the log type can only contain letters, numbers, and the underscore character, and may not exceed 100 characters. However, the client.py code is testing with isalpha(), which only checks for alphabetic characters and so excludes underscores (and digits). Could you change the test to include '_' and also check the length?
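The documented rule (letters, numbers, underscore, at most 100 characters) maps naturally onto a regex. A Python sketch of the suggested fix (function and constant names are illustrative, not the library's API):

```python
import re

# Per the Data Collector API docs: letters, numbers, and underscores only,
# and no more than 100 characters. str.isalpha() rejects both digits and
# underscores, so a regex check is needed instead.
LOG_TYPE_RE = re.compile(r"^[A-Za-z0-9_]{1,100}$")

def valid_log_type(log_type: str) -> bool:
    return bool(LOG_TYPE_RE.match(log_type))
```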
Add a body size check so a post does not exceed the maximum of 30 MB.
The Azure Monitor Data Collector API accepts a maximum of 30 MB per post; this is a size limit for a single post. If the data for a single post exceeds 30 MB, you should split it into smaller chunks and send them concurrently.
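A chunking sketch along those lines (Python, illustrative only; it assumes no single record exceeds the limit by itself):

```python
import json

MAX_BODY_BYTES = 30 * 1024 * 1024  # Data Collector API per-post limit

def chunk_records(records, max_bytes=MAX_BODY_BYTES):
    """Split records into batches whose serialized JSON stays under max_bytes.

    Sketch only: sizes are estimated from each record's JSON encoding plus
    one byte for its separating comma, inside a surrounding "[...]" array.
    """
    batches, current, size = [], [], 2  # 2 bytes for the enclosing "[]"
    for rec in records:
        rec_size = len(json.dumps(rec).encode("utf-8")) + 1  # +1 for comma
        if current and size + rec_size > max_bytes:
            batches.append(current)
            current, size = [], 2
        current.append(rec)
        size += rec_size
    if current:
        batches.append(current)
    return batches
```

Each resulting batch can then be posted independently (and, as the limits page suggests, concurrently).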
Relevant issue
Hi there,
The collector works great, at least on my computer it does. The problem is that our production environment requires me to use a proxy. I have not yet been able to get this working.
I've tried hacking it into net/http/persistent.rb (by setting proxy=:ENV) but this does not seem to work. I'm not even sure this is used by the collector.
Any information that would point me in the right direction is appreciated!
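For what it's worth, one common pattern is to build the proxy configuration explicitly from the standard environment variables and hand it to the HTTP layer, rather than patching the library internals. A Python sketch (names and shape are illustrative, not this library's API; the mapping matches what `requests` accepts as its `proxies` argument):

```python
import os

def proxy_config(env=os.environ):
    # Build an HTTP/HTTPS proxy mapping from the conventional environment
    # variables. Sketch only: the real client is Ruby (rest-client /
    # net-http-persistent), which has its own proxy configuration.
    proxies = {}
    if env.get("HTTP_PROXY"):
        proxies["http"] = env["HTTP_PROXY"]
    if env.get("HTTPS_PROXY"):
        proxies["https"] = env["HTTPS_PROXY"]
    return proxies
```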
Hi, thank you for creating & maintaining this useful plugin!
We are using this product via LogStash plugin.
https://github.com/yokawasa/azure-log-analytics-data-collector
I believe this is not a Logstash plugin issue but an issue with this repository,
so let me raise the retry issue here. Please correct me (and this issue) if that is wrong.
https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
According to the manual, the client is required to retry when receiving the following error codes:
Code | Status | Error code | Description |
---|---|---|---|
429 | Too Many Requests | | The service is experiencing a high volume of data from your account. Please retry the request later. |
500 | Internal Server Error | UnspecifiedError | The service encountered an internal error. Please retry the request. |
503 | Service Unavailable | ServiceUnavailable | The service currently is unavailable to receive requests. Please retry your request. |
But currently there is no such check, and due to rest-client's behavior a specific exception class will be thrown when we receive these error codes.
https://github.com/yokawasa/azure-log-analytics-data-collector/blob/master/lib/azure/loganalytics/datacollectorapi/client.rb#L43
But, as one can easily imagine, a retry function may have undesirable side effects, such as consuming memory or piling up threads, so it should be configurable for each use case.
So, how do you think we should implement a retry function for this HTTP Data Collector client?
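One possible shape for such a retry, sketched in Python: retry only on the three documented codes, with exponential backoff and a configurable attempt cap (`post_fn`, `max_retries`, and `base_delay` are illustrative names, not the library's API):

```python
import time

# The three status codes the Data Collector API docs ask clients to retry on.
RETRYABLE = {429, 500, 503}

def post_with_retry(post_fn, max_retries=3, base_delay=1.0):
    """Call post_fn, retrying on 429/500/503 with exponential backoff.

    post_fn is a caller-supplied zero-argument function returning an object
    with a `code` attribute (the real client wraps rest-client's response).
    Capping max_retries bounds the memory/thread cost the issue worries about.
    """
    attempt = 0
    while True:
        res = post_fn()
        if res.code not in RETRYABLE or attempt >= max_retries:
            return res
        time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
        attempt += 1
```

Making `max_retries` and `base_delay` caller-configurable (including `max_retries=0` to disable retries entirely) addresses the concern that a fixed retry policy cannot fit every deployment.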