Custom Logs Package

<div class="condensed-table">
| | |
| --- | --- |
| Version | 2.3.3 |
| Compatible Kibana version(s) | 8.8.0 or higher |
| Supported Serverless project types | Security, Observability |
| Subscription level | Basic |
| Level of support | Elastic |

</div>
The Custom Logs package is used to ingest arbitrary log files and parse their contents using Ingest Pipelines. Follow the steps below to set up and use this package.

  1. **Install Elastic Agent.** Install an Elastic Agent on the machine from which you want to collect logs.

  2. **Identify the Log Location.** Identify the log location on that machine, for example, /tmp/custom.log.

    • If you need to include multiple log files or an entire directory, consider using wildcard patterns such as /tmp/*.log to capture all .log files, or /tmp/* to include all file types.
    • Note that the System integration ingests /var/log/*.log. You do not need to add this path if the System integration is in use.
  3. **Enroll the Custom Logs Integration.**

    • Add the Custom Logs integration to your installed Elastic Agent.
    • Provide an Integration name. A descriptive name will make managing this integration in the Kibana UI more intuitive.
    • Configure the path to match the location(s) identified in the previous step.
    • Provide a dataset name that reflects the purpose of your logs (for example, python for Python application logs).
  4. **Verify Data in Discover.**

    • Open Discover in Kibana and filter the logs-* indices to your dataset name (e.g., logs-python) to confirm that the raw log data is being ingested. (An equivalent Dev Tools query is sketched after this list.)
  5. **Configure Parsing Rules.**

    • Define parsing rules in an ingest pipeline so that raw log lines are split into structured fields. (A minimal pipeline sketch appears after this list.)

  6. **Create a Custom Dashboard.**

    • Use Kibana to build a dashboard for analyzing incoming log data based on your specific needs.
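As an alternative to the Discover check in step 4, you can run a quick search from Dev Tools against the data stream. This sketch assumes the example dataset name `python`; adjust the index pattern to match your own dataset:

```console
// Fetch the most recent document ingested into the example data stream
GET logs-python-*/_search
{
  "size": 1,
  "sort": [{ "@timestamp": "desc" }]
}
```

A non-empty `hits` array confirms that documents are arriving.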

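For step 5, parsing rules are expressed as an ingest pipeline. The following is a minimal sketch rather than a definitive implementation: it assumes the example dataset name `python` and a hypothetical `TIMESTAMP LEVEL MESSAGE` log format, and it uses the `logs-<dataset>@custom` pipeline name that recent Elastic Stack versions invoke automatically when such a pipeline exists; on other versions you may need to reference the pipeline explicitly in the integration's advanced settings.

```console
// Example parsing rules for a hypothetical 'TIMESTAMP LEVEL MESSAGE' log format
PUT _ingest/pipeline/logs-python@custom
{
  "description": "Split raw log lines into timestamp, level, and message (example sketch)",
  "processors": [
    {
      "dissect": {
        "field": "message",
        "pattern": "%{ts} %{log.level} %{message}"
      }
    },
    {
      "date": {
        "field": "ts",
        "formats": ["ISO8601"]
      }
    },
    {
      "remove": {
        "field": "ts"
      }
    }
  ]
}
```

You can dry-run the pipeline against a sample line before relying on it:

```console
// Simulate the pipeline with one sample document
POST _ingest/pipeline/logs-python@custom/_simulate
{
  "docs": [
    { "_source": { "message": "2025-05-01T12:00:00.000Z INFO request handled" } }
  ]
}
```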
This integration includes an ECS Dynamic Template, so any fields following the ECS schema will automatically receive the correct index field mappings without additional manual configuration.
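For example, once your parsing rules populate an ECS field such as `log.level`, you can confirm the mapping it received; this request assumes the example `logs-python-*` pattern from the steps above:

```console
// Check the index mapping assigned to an ECS field
GET logs-python-*/_mapping/field/log.level
```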