Enable Slow Query Logging on an RDS Instance

To Nha Notes | Nov. 27, 2024, 1:53 p.m.

  • Modify the Parameter Group:

    • Go to your RDS Parameter Groups in the AWS Console.
    • Modify the parameters:
      • slow_query_log = 1 (Enabled)
      • long_query_time = <desired_threshold_in_seconds> (e.g., 1 for queries taking longer than 1 second)
      • log_output = TABLE (or FILE if you prefer file-based logging)
  • Apply Parameter Group:

    • Associate the parameter group with your RDS instance and reboot it if necessary.
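The two steps above can be scripted with boto3. A minimal sketch, assuming a custom parameter group named `my-mysql-params` (a placeholder; pass in a real client from `boto3.client("rds")`):

```python
# Hedged sketch: parameter changes mirroring the console steps above.
# The group name is a placeholder -- substitute your own.

SLOW_LOG_PARAMS = [
    {"ParameterName": "slow_query_log", "ParameterValue": "1",
     "ApplyMethod": "immediate"},
    {"ParameterName": "long_query_time", "ParameterValue": "1",
     "ApplyMethod": "immediate"},
    # log_output = TABLE writes to mysql.slow_log; use FILE for file logging.
    {"ParameterName": "log_output", "ParameterValue": "TABLE",
     "ApplyMethod": "immediate"},
]

def enable_slow_query_log(rds_client, group_name="my-mysql-params"):
    """Apply the slow-query parameters to a custom DB parameter group."""
    return rds_client.modify_db_parameter_group(
        DBParameterGroupName=group_name,
        Parameters=SLOW_LOG_PARAMS,
    )
```

These three parameters are dynamic, so `ApplyMethod="immediate"` takes effect without a reboot; static parameters would need `"pending-reboot"` instead.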
  • Verify Logs:

    • Use the query below to check that slow queries are being logged (the mysql.slow_log table is only populated when log_output = TABLE):

      SELECT * FROM mysql.slow_log LIMIT 10;

  • Export CloudWatch Logs to S3 (requires the slow query log to be published to CloudWatch Logs via the instance's log exports setting):

    aws logs create-export-task \
      --task-name "ExportSlowQueryLogs" \
      --log-group-name "/aws/rds/instance/your-instance-name/slowquery" \
      --from <start-time-in-epoch-ms> \
      --to <end-time-in-epoch-ms> \
      --destination "your-s3-bucket-name" \
      --destination-prefix "rds-slow-query-logs/"
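The `--from`/`--to` values are epoch timestamps in milliseconds. A small helper for computing the window (the default of the last 24 hours is an assumption; adjust to taste):

```python
from datetime import datetime, timedelta, timezone

def export_window_ms(hours=24, now=None):
    """Return (from, to) in epoch milliseconds covering the last `hours`
    hours -- the format create-export-task expects."""
    now = now or datetime.now(timezone.utc)
    start = now - timedelta(hours=hours)
    to_ms = lambda dt: int(dt.timestamp() * 1000)
    return to_ms(start), to_ms(now)
```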

  • Download Slow Logs:

aws rds download-db-log-file-portion \
  --db-instance-identifier <db-instance-name> \
  --log-file-name slowquery/mysql-slowquery.log \
  --starting-token <marker-token> \
  --output text >> mysql-slowquery.log

  • Automate the Entire Process

Use Amazon EventBridge to schedule a Lambda function like the one below to run periodically (e.g., daily or hourly).

import boto3

def handler(event, context):
    rds_client = boto3.client('rds')
    s3_client = boto3.client('s3')

    db_instance_id = "your-instance-id"
    log_file_name = "slowquery/mysql-slowquery.log"
    bucket_name = "your-bucket-name"
    s3_key = "logs/mysql-slowquery.log"

    # Download the log file in portions: a single call returns at most
    # 1 MB, so keep paging while AdditionalDataPending is true.
    log_content = ""
    marker = "0"
    while True:
        response = rds_client.download_db_log_file_portion(
            DBInstanceIdentifier=db_instance_id,
            LogFileName=log_file_name,
            Marker=marker
        )
        log_content += response.get('LogFileData', '')
        if not response.get('AdditionalDataPending'):
            break
        marker = response['Marker']

    # Upload the assembled log to S3
    s3_client.put_object(
        Bucket=bucket_name,
        Key=s3_key,
        Body=log_content
    )

    return {"status": "success"}
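The EventBridge half of the automation can also be scripted. A hedged sketch (rule name and schedule are assumptions; pass a client from `boto3.client("events")`, and note that EventBridge still needs invoke permission on the Lambda, e.g. via `lambda add-permission`):

```python
def schedule_daily_export(events_client, lambda_arn,
                          rule_name="rds-slow-log-export"):
    """Create or refresh an EventBridge rule that triggers the Lambda daily."""
    events_client.put_rule(
        Name=rule_name,
        ScheduleExpression="rate(1 day)",  # or e.g. cron(0 2 * * ? *) for 02:00 UTC
        State="ENABLED",
    )
    events_client.put_targets(
        Rule=rule_name,
        Targets=[{"Id": "slow-log-lambda", "Arn": lambda_arn}],
    )
    return rule_name
```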