
Comments (13)

JordanP commented on May 25, 2024

For the record, I managed to have log insights working in a Kubernetes cluster, with what you suggested: Run the pganalyze-collector in a Docker container. Have the Postgres container write the log output into a file that sits on a shared volume that the pganalyze-collector container has access to. Then use the LOG_LOCATION setting to specify that log file.

All good, thanks!


rauanmayemir commented on May 25, 2024

Is there any follow-up on this?

I still think it would be very convenient to have a way for Kubernetes-managed log collectors like fluentd or vector to push logs into pganalyze (either to the collector or directly to the centralized ingest endpoint).


lfittl commented on May 25, 2024

See #503 for an implementation that should work in the typical Kubernetes environment using fluentbit for log routing (tested with CloudNativePG, need to do a bit more testing with other operators).


JordanP commented on May 25, 2024

I have the same question: how does log collection work under Kubernetes?


lfittl commented on May 25, 2024

@JordanP We don't support log collection in Kubernetes today, though we're looking to add this in the future.

For @sebasmagri's initial question - Sebastian, were you looking to monitor Kubernetes pods, or were you targeting something external, like a cloud provider database?

(generally log collection is enabled by default in recent collector releases - if supported for the current platform, that is)


sebasmagri commented on May 25, 2024

@lfittl it's for running the collector on ECS and monitoring RDS. We had to go with a custom Docker image, but it's still pretty annoying that the server names are now forced to be the RDS instance ID: we create a lot of RDS instances dynamically and discard the old ones, so it's really a single server instance the whole time.


lfittl commented on May 25, 2024

@sebasmagri Ah - makes sense. You can use the following settings to override the system identifiers:

PGA_API_SYSTEM_ID
PGA_API_SYSTEM_TYPE
PGA_API_SYSTEM_SCOPE

You can see the defaults that get assigned here: https://github.com/pganalyze/collector/blob/master/config/identify_system.go#L15 (note you could e.g. choose to override PGA_API_SYSTEM_ID in your setup to have a stable server record)
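As a sketch of what that could look like (the variable names come from the comment above; the values, including the system type and scope, are made up for illustration):

```shell
# Hypothetical: pin the pganalyze server record to a stable logical name,
# so replacing the underlying RDS instance doesn't create a new server.
export PGA_API_SYSTEM_ID="analytics-primary"   # stable identifier (made up)
export PGA_API_SYSTEM_TYPE="amazon_rds"        # illustrative value
export PGA_API_SYSTEM_SCOPE="us-east-1"        # illustrative value

# These would then be passed through to the collector container, e.g.:
# docker run --env PGA_API_SYSTEM_ID --env PGA_API_SYSTEM_TYPE \
#   --env PGA_API_SYSTEM_SCOPE ... pganalyze/collector
```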


sebasmagri commented on May 25, 2024

Thanks for the pointer @lfittl!

We'd still like to be able to use the Docker image as provided, though. Any chance log collection can be enabled by default for this use case?


lfittl commented on May 25, 2024

@sebasmagri In the case where you are not using the instance ID in the hostname, utilizing the AWS_INSTANCE_ID variable should work (that, or the hostname, is what the collector checks against to determine which code path to take)
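A minimal sketch of that (the variable name is from the comment above; the instance identifier is made up):

```shell
# Hypothetical: tell the collector which RDS instance this server is,
# instead of relying on the hostname containing the instance ID.
export AWS_INSTANCE_ID="app-db-prod"   # RDS instance identifier (made up)

# Passed through to the collector container, e.g.:
# docker run --env AWS_INSTANCE_ID ... pganalyze/collector
```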


JordanP commented on May 25, 2024

@lfittl besides Kubernetes, does the collector support monitoring a PG instance running in a container?


lfittl commented on May 25, 2024

@JordanP There are two ways to do that, assuming you are referring to Postgres running inside a Docker container on a VM that you manage:

  1. Run the pganalyze-collector on the Docker host, and use the db_log_docker_tail setting - this will directly fetch the logs from the Docker engine. Note that this setting is mostly intended for our own development purposes, and is considered experimental.

  2. Run the pganalyze-collector in a Docker container. Have the Postgres container write the log output into a file that sits on a shared volume that the pganalyze-collector container has access to. Then use the LOG_LOCATION setting to specify that log file.
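Option 2 could be sketched roughly as follows. This is a deployment sketch, not an official recipe: the paths, image tags, and credentials are illustrative, and the collector would additionally need its API key and database connection settings (omitted here). Only the `LOG_LOCATION` setting is taken from the comment above.

```shell
# Create a named volume shared by both containers.
docker volume create pg_logs

# Postgres writes its log to a file on the shared volume
# (paths and the log file name are illustrative).
docker run -d --name postgres \
  -e POSTGRES_PASSWORD=secret \
  -v pg_logs:/var/log/postgresql \
  postgres:16 \
  -c logging_collector=on \
  -c log_directory=/var/log/postgresql \
  -c log_filename=postgresql.log

# The collector mounts the same volume and reads that file via LOG_LOCATION.
# (API key, monitored DB credentials, etc. omitted.)
docker run -d --name pganalyze-collector \
  -v pg_logs:/var/log/postgresql \
  -e LOG_LOCATION=/var/log/postgresql/postgresql.log \
  pganalyze/collector
```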


JordanP commented on May 25, 2024

Excellent! Option 2 seems doable for us, nice!


ianstanton commented on May 25, 2024

@lfittl How realistic would it be to allow pganalyze-collector to stream logs from a WebSocket? We're in a similar situation here, and our logs end up in Loki. We read these logs in other areas of our application via WebSocket.

