Google Cloud SQL MySQL: how to convert JSON to slow log
Data volumes are growing quickly these days: a database of a few hundred gigabytes is no longer considered a huge one. Cloud services like Amazon Relational Database Service (RDS) or Google Cloud SQL can be used successfully for such cases. They are very handy, and you don’t even need to spend time tuning server-side settings, but as the volume grows, performance usually starts to degrade. At that point you need to start optimizing the database and the way you work with it.
We’ve been using Google Cloud SQL for MySQL quite a lot recently, and it is indeed a powerful MySQL solution. One of the most common places to start an analysis is the slow query log. The catch is that Google stores it as JSON entries with timestamps, and JSON is convenient for robots, not for humans. That is why we wrote a simple Python 3 script to parse the JSON files back into the classic slow log format.
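For illustration only: each line of the exported log is a JSON object, and the two fields the script below relies on are textPayload (the original slow log text) and timestamp. A single entry looks roughly like this (other fields omitted, exact content varies):

{"timestamp": "2019-02-11T10:15:30.123456Z", "textPayload": "SELECT ...;"}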
First, copy the logs to some Linux box:
gsutil cp gs://slow_log/cloudsql.googleapis.com/mysql-slow.log/2019/02/11/* .
Then concatenate all the files into one big log:
cat *.json >> json_log
To get the classic, familiar format, save the following script as js_sl.py:
#!/usr/bin/env python3
# Convert Google Cloud SQL JSON log entries back into plain slow log text.
import json

with open("json_log") as infile:
    for line in infile:
        # Skip anything that is not a JSON object (e.g. blank lines).
        if not line.strip().startswith("{"):
            continue
        k = json.loads(line)
        # textPayload carries the original mysql-slow.log content for this entry.
        sql = k['textPayload']
        print(sql)
        # The entry's timestamp is also available as k['timestamp'] if you need it.
And run it:
python3 js_sl.py >> slowlog
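If you’d rather skip the intermediate json_log file, a small variant of the same idea can read the downloaded files directly. This is just a sketch, assuming the *.json files copied with gsutil sit in the current directory and use the same textPayload field:

#!/usr/bin/env python3
# Sketch: same conversion, but reading the downloaded *.json files directly,
# so the "cat *.json >> json_log" step is not needed.
import glob
import json

# Assumption: the files copied with gsutil are in the current directory.
for path in sorted(glob.glob("*.json")):
    with open(path) as infile:
        for line in infile:
            if not line.strip().startswith("{"):
                continue
            entry = json.loads(line)
            # Print the original slow log text carried in textPayload.
            print(entry['textPayload'])

Redirect its output the same way (>> slowlog) and you end up with the same plain-text slow log.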
References:
https://supportex.net (Cross post)