This project creates an AWS Lambda function that sends logs from files stored in an S3 bucket to Logz.io.
To deploy this project, click the button that matches the region you wish to deploy your Stack to:
Keep the default settings in the Create stack screen and select Next.
Specify the stack details as per the table below and select Next.
| Parameter | Description | Required/Default |
|---|---|---|
| bucketName | Name of the bucket you wish to fetch logs from. Will be used for the IAM policy. | Required |
| logzioListener | The Logz.io listener URL for your region. (For more details, see the regions page.) | Required |
| logzioToken | Your Logz.io log shipping token. | Required |
| logLevel | Log level for the Lambda function. Can be one of: `debug`, `info`, `warn`, `error`, `fatal`, `panic`. | Default: `info` |
| logType | The log type you'll use with this Lambda. This is shown in your logs under the `type` field in Kibana. Logz.io applies parsing based on type. | Default: `s3_hook` |
Specify the Key and Value parameters for the Tags (optional) and select Next.
Confirm that you acknowledge that AWS CloudFormation might create IAM resources and select Create stack.
Give the stack a few minutes to be deployed.
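If you prefer the AWS CLI, the console steps above roughly correspond to a single `aws cloudformation deploy` call. This is only a sketch: the template file name, stack name, and all parameter values below are placeholders you'd replace with your own.

```shell
# Deploy the stack from a local copy of the template.
# Template file name, stack name, and parameter values are placeholders.
aws cloudformation deploy \
  --template-file s3-hook-template.yaml \
  --stack-name logzio-s3-hook \
  --capabilities CAPABILITY_IAM \
  --parameter-overrides \
    bucketName=my-log-bucket \
    logzioListener=https://listener.logz.io:8071 \
    logzioToken=MY-LOG-SHIPPING-TOKEN \
    logLevel=info \
    logType=s3_hook
```

Here `--capabilities CAPABILITY_IAM` plays the same role as the IAM-resources acknowledgment checkbox in the console.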
Once your Lambda function is ready, you'll need to manually add a trigger, due to a CloudFormation limitation.
Go to the function's page, and click on Add trigger.
Then, choose S3 as the trigger type, and fill in:
- Bucket: Your bucket name.
- Event type: Choose the option **All object create events**.
- Prefix and Suffix should be left empty.
Confirm the checkbox, and click **Add**.
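The same trigger can also be set up from the AWS CLI; the function name, account ID, region, and bucket name below are placeholders. Note that S3 needs permission to invoke the function, which the console's Add trigger flow grants automatically, so the CLI version takes two calls:

```shell
# Allow the bucket to invoke the Lambda function (names and ARNs are placeholders).
aws lambda add-permission \
  --function-name logzio-s3-hook \
  --statement-id s3-invoke \
  --action lambda:InvokeFunction \
  --principal s3.amazonaws.com \
  --source-arn arn:aws:s3:::my-log-bucket

# Register the bucket notification for all object-create events.
aws s3api put-bucket-notification-configuration \
  --bucket my-log-bucket \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [{
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:logzio-s3-hook",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'
```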
That's it. Your function is configured. New files uploaded to your bucket will trigger the function, and their logs will be sent to your Logz.io account.
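To verify the setup end to end, you can upload a sample log file and then look for its lines in Logz.io; the bucket name and file path here are placeholders:

```shell
# Upload a test file; this should invoke the function and ship its lines to Logz.io.
aws s3 cp ./sample.log s3://my-log-bucket/sample.log
```

Within a minute or two the lines should appear in Kibana under the `type` you configured.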
- 0.0.2:
- Bug fix: Decodes folder names, for folders with special characters.
- 0.0.1: Initial release.