To Nha Notes | Nov. 27, 2024, 1:53 p.m.
Modify the Parameter Group: enable slow_query_log = 1 and set long_query_time (in seconds) in a custom DB parameter group; log_output controls whether entries go to a FILE (needed for CloudWatch/download) or to the mysql.slow_log TABLE.
Apply Parameter Group: attach the group to the DB instance; dynamic parameters take effect immediately, static ones only after a reboot.
Verify Logs: with log_output = TABLE, query the table directly:
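A minimal sketch of the parameter-group change, assuming a custom group named my-mysql-params (the name and thresholds are placeholders):

```shell
# Hypothetical group name; slow_query_log, long_query_time and log_output
# are real RDS MySQL parameters. log_output=FILE writes to the file-based
# log (needed for CloudWatch export); use TABLE to query mysql.slow_log.
aws rds modify-db-parameter-group \
  --db-parameter-group-name my-mysql-params \
  --parameters \
    "ParameterName=slow_query_log,ParameterValue=1,ApplyMethod=immediate" \
    "ParameterName=long_query_time,ParameterValue=2,ApplyMethod=immediate" \
    "ParameterName=log_output,ParameterValue=FILE,ApplyMethod=immediate"
```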
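Attaching the group to the instance can be sketched as follows (instance and group names are placeholders):

```shell
# Hypothetical names; associates the custom parameter group with the instance.
# Dynamic parameters apply without a reboot; static ones need one.
aws rds modify-db-instance \
  --db-instance-identifier my-db-instance \
  --db-parameter-group-name my-mysql-params
```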
SELECT * FROM mysql.slow_log LIMIT 10;
Export CloudWatch Logs to S3:
aws logs create-export-task \
  --task-name "ExportSlowQueryLogs" \
  --log-group-name "/aws/rds/instance/your-instance-name/slowquery" \
  --from <start-time-in-epoch-ms> \
  --to <end-time-in-epoch-ms> \
  --destination "your-s3-bucket-name" \
  --destination-prefix "rds-slow-query-logs/"
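--from and --to take epoch milliseconds; a quick way to compute them (the one-day window here is just an example):

```python
from datetime import datetime, timedelta, timezone

def epoch_ms(dt):
    # Epoch milliseconds, the unit create-export-task's --from/--to expect
    return int(dt.timestamp() * 1000)

end = datetime.now(timezone.utc)
start = end - timedelta(days=1)
print(epoch_ms(start), epoch_ms(end))
```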
Download slow logs directly from the instance:
aws rds download-db-log-file-portion \
  --db-instance-identifier <db-instance-name> \
  --log-file-name slowquery/mysql-slowquery.log \
  --starting-token <marker-token> \
  --output text >> mysql-slowquery.log
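download-db-log-file-portion returns one portion at a time; to fetch a whole file you page with Marker until AdditionalDataPending is false. A sketch of that loop, which takes an already-built boto3 RDS client (the function name is mine):

```python
def download_full_log(rds, db_instance_id, log_file_name):
    """Fetch an entire RDS log file by paging with Marker until
    AdditionalDataPending comes back false. `rds` is a boto3 RDS client."""
    marker = "0"  # "0" starts at the beginning of the file
    chunks = []
    while True:
        resp = rds.download_db_log_file_portion(
            DBInstanceIdentifier=db_instance_id,
            LogFileName=log_file_name,
            Marker=marker,
        )
        chunks.append(resp.get("LogFileData") or "")
        if not resp.get("AdditionalDataPending"):
            break
        marker = resp["Marker"]  # resume where this portion ended
    return "".join(chunks)
```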
Use Amazon EventBridge to schedule the Lambda function to run periodically (e.g., daily or hourly).
import boto3

def handler(event, context):
    rds_client = boto3.client('rds')
    s3_client = boto3.client('s3')

    db_instance_id = "your-instance-id"
    log_file_name = "slowquery/mysql-slowquery.log"
    bucket_name = "your-bucket-name"
    s3_key = "logs/mysql-slowquery.log"

    # Download a portion of the log file. Marker="0" starts at the
    # beginning; for files larger than one portion, check
    # AdditionalDataPending in the response and page with Marker.
    response = rds_client.download_db_log_file_portion(
        DBInstanceIdentifier=db_instance_id,
        LogFileName=log_file_name,
        Marker="0"
    )
    log_content = response['LogFileData']

    # Upload the retrieved log content to S3
    s3_client.put_object(
        Bucket=bucket_name,
        Key=s3_key,
        Body=log_content
    )
    return {"status": "success"}
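The EventBridge schedule mentioned above can be sketched with the CLI; all names, region, and account/function ARNs below are placeholders:

```shell
# Hypothetical names/ARNs: create a daily rule, allow EventBridge to
# invoke the Lambda, then attach the function as the rule's target.
aws events put-rule \
  --name export-slow-query-logs-daily \
  --schedule-expression "rate(1 day)"

aws lambda add-permission \
  --function-name export-slow-query-logs \
  --statement-id eventbridge-invoke \
  --action lambda:InvokeFunction \
  --principal events.amazonaws.com \
  --source-arn arn:aws:events:us-east-1:123456789012:rule/export-slow-query-logs-daily

aws events put-targets \
  --rule export-slow-query-logs-daily \
  --targets "Id"="1","Arn"="arn:aws:lambda:us-east-1:123456789012:function:export-slow-query-logs"
```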