Ingest CloudWatch Logs

This feature is only available on the Enterprise plan and, at this stage, requires a public ingress endpoint to be enabled.

groundcover allows ingesting logs from CloudWatch by setting up an Amazon Firehose stream.

Setup a Firehose stream

  1. Go to Amazon Data Firehose.

  2. Click on Create Firehose stream

    1. Source: Direct PUT

    2. Destination: HTTP Endpoint

    3. Create a name for your stream, for example PUT-Groundcover-logs

    4. Destination settings:

      1. HTTP endpoint URL: the Firehose logs endpoint, fetched using these docs

      2. Access key: your groundcover API token, fetched using these docs

      3. Content encoding: GZIP

      4. Parameters:

        1. env_name - Specify your environment name; ingested logs will show up under this environment in the application

    5. Backup settings:

      1. Choose a backup bucket, or create a new one.

  3. Click Create Firehose stream
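If you prefer scripting this setup, the console steps above can be approximated with the AWS CLI. This is an untested sketch: the stream name, all `<...>` placeholders, and the S3 backup role are assumptions you need to replace with values from your own AWS and groundcover accounts.

```shell
# Sketch: create a Direct PUT Firehose stream that forwards to the
# groundcover HTTP endpoint with GZIP encoding and an env_name parameter.
# All <...> placeholders are assumptions -- fill them in as in the console steps.
aws firehose create-delivery-stream \
  --delivery-stream-name PUT-Groundcover-logs \
  --delivery-stream-type DirectPut \
  --http-endpoint-destination-configuration '{
    "EndpointConfiguration": {
      "Url": "<FIREHOSE_LOGS_ENDPOINT>",
      "Name": "groundcover",
      "AccessKey": "<GROUNDCOVER_API_TOKEN>"
    },
    "RequestConfiguration": {
      "ContentEncoding": "GZIP",
      "CommonAttributes": [
        { "AttributeName": "env_name", "AttributeValue": "<ENV_NAME>" }
      ]
    },
    "S3BackupMode": "FailedDataOnly",
    "S3Configuration": {
      "RoleARN": "<S3_BACKUP_ROLE_ARN>",
      "BucketARN": "arn:aws:s3:::<BACKUP_BUCKET_NAME>"
    }
  }'
```

Once created, the stream ARN (needed for the IAM permissions policy in the next section) can be retrieved with `aws firehose describe-delivery-stream --delivery-stream-name PUT-Groundcover-logs --query 'DeliveryStreamDescription.DeliveryStreamARN'`.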

Create an IAM role and policy

  1. Go to Amazon IAM and click on Roles in the sidebar

  2. Click on Create Role

    1. Select Custom trust policy

    2. Paste the following policy:

      {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Sid": "Statement1",
            "Effect": "Allow",
            "Principal": {
              "Service": "logs.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
          }
        ]
      }
    3. Click on Next twice (we'll attach permissions later)

    4. Provide a name for the role

    5. Click on Create Role

  3. Go to your newly created role

    1. In the Permissions section, click on Add permissions and then Create inline policy

    2. Click on JSON and paste the following:

      {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Effect": "Allow",
            "Action": [
              "firehose:PutRecord",
              "firehose:PutRecordBatch"
            ],
            "Resource": "<YOUR_FIREHOSE_STREAM_ARN>"
          }
        ]
      }
    3. Click on Next

    4. Give the policy a name

    5. Click on Create Policy
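The same role setup can be done from the AWS CLI. In this sketch the role and policy names are illustrative assumptions, and `trust-policy.json` / `permissions-policy.json` are local files containing the two JSON documents shown above.

```shell
# Create the role with the trust policy that lets CloudWatch Logs assume it.
aws iam create-role \
  --role-name cloudwatch-to-firehose \
  --assume-role-policy-document file://trust-policy.json

# Attach the inline policy granting PutRecord/PutRecordBatch on your stream.
aws iam put-role-policy \
  --role-name cloudwatch-to-firehose \
  --policy-name firehose-put-records \
  --policy-document file://permissions-policy.json
```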

Create a subscription filter

Now that we're all set up, we can add a subscription filter to the desired log group in CloudWatch.

Using CLI

The following is an example of how to create a subscription filter through the AWS CLI:

  aws logs put-subscription-filter \
    --log-group-name "<GROUP_NAME>" \
    --filter-name "<FILTERNAME>" \
    --filter-pattern "" \
    --destination-arn "<DESTINATIONARN>" \
    --role-arn "<ROLEARN>"
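After creating the filter, you can confirm it is attached to the log group (the log group name below is a placeholder, as in the command above):

```shell
# Optional check: list subscription filters on the log group to verify
# the new filter exists and points at your Firehose stream.
aws logs describe-subscription-filters --log-group-name "<GROUP_NAME>"
```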

Using AWS Console

  1. Go to the specific log group in CloudWatch and click on the Subscription filters tab.

  2. Click on Create and select Create Amazon Data Firehose subscription filter.

  3. Select the Firehose delivery stream created in the previous steps, as well as the IAM role.

  4. Fill in Configure log format and filters as needed.

  5. Choose a name for the subscription filter, then click Start streaming.