MySQL slow query log analysis script
Posted by yujiaershao
This is a slow-log analysis script built on top of pt-query-digest; it reworks the digest report into a more readable format.
The directory structure is:

./mysql_data/log        (raw slow query logs)
./mysql_data/log/tmp    (reports produced by pt-query-digest)
./slow_query            (final formatted results)
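These three directories must exist before the script runs. A minimal sketch to create them, assuming the script sits in the project root next to these directories:

import os

dirname = os.path.dirname(os.path.abspath(__file__))
# Create the working directories used by the script if they do not exist yet.
for sub in ('slow_query', os.path.join('mysql_data', 'log'), os.path.join('mysql_data', 'log', 'tmp')):
    os.makedirs(os.path.join(dirname, sub), exist_ok=True)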
# coding: utf-8
__author__ = 'T_two'

import datetime
import os

IP = '111'
dirname = os.path.dirname(os.path.abspath(__file__))
# Directory for the final formatted reports
slow_query = os.path.join(dirname, 'slow_query')
# Directory holding the raw slow logs (input to pt-query-digest)
mysql_data = os.path.join(os.path.join(dirname, 'mysql_data'), 'log')
# Directory holding the pt-query-digest reports
tmp = os.path.join(mysql_data, 'tmp')
# Web directory to publish the report to; not defined in the original post, placeholder path, adjust to your environment
web_dir = '/var/www/html/slow_query'


def getYesterday():
    today = datetime.date.today()
    yesterday = str(today - datetime.timedelta(days=1))
    return yesterday


def getLog(yes_time, slow_query):
    # Build the three file names used around the pt-query-digest analysis
    before_name = yes_time.replace('-', '') + '-' + 'slow-query.log'
    # Raw slow log before pt-query-digest: b_filename
    b_filename = os.path.join(mysql_data, before_name)
    # print(b_filename)
    # pt-query-digest report: a_filename
    after_name = yes_time.replace('-', '') + '-' + IP + '-' + 'slow-query.log'
    a_filename = os.path.join(tmp, after_name)
    # print(a_filename)
    # Final formatted report: e_filename
    end_name = IP + '-slow-log-' + yes_time + '.txt'
    e_filename = os.path.join(slow_query, end_name)
    # print(e_filename)
    return b_filename, a_filename, e_filename


def getSlowquery(b_filename, a_filename, e_filename):
    print('File format starting...')
    # os.system('pt-query-digest ' + b_filename + '>' + a_filename)
    a_slow_query = open(a_filename, 'r', encoding='utf8')
    e_slow_query = open(e_filename, 'w', encoding='utf8')
    _line = ''
    lines = a_slow_query.readlines()[20:]  # Skip the first 20 lines of the report, which are not needed
    for line in lines:
        line = line.strip()
        # Keep only the comment lines we care about
        if (line.startswith('#') and '# Hosts' not in line and '# Users' not in line
                and '# Databases' not in line and 'byte' not in line
                and '# Count' not in line and '# Exec time' not in line):
            pass
        elif line == '':
            pass
        else:
            # Query number
            if '# Query' in line:
                line = (' NO.%s' % line.split()[2])
            # Execution count
            elif '# Count' in line:
                line = ('执行次数: %s' % line.split()[3])
            # Execution time
            elif '# Exec time' in line:
                line = ('执行时间 Total: %s min: %s max: %s' % (line.split()[4], line.split()[5], line.split()[6]))
            # Database name
            elif '# Databases' in line:
                line = ('库名: %s' % line.split()[2])
            # Source IP
            elif '# Host' in line:
                line = ('源IP: %s' % line.split()[2])
            # User name
            elif '# User' in line:
                line = ('用户名: %s' % line.split()[2])
            _line = _line + line + '\n'
    e_slow_query.write(_line)
    a_slow_query.close()
    e_slow_query.close()
    # Copy the formatted report to the web directory
    os.system('cp ' + e_filename + ' ' + web_dir)
    # Remove files older than 10 days
    os.system('find ' + str(slow_query) + ' -mtime +10 | xargs rm -rf ')
    os.system('find ' + mysql_data + ' -mtime +10 | xargs rm -rf ')
    os.system('find ' + tmp + ' -mtime +10 | xargs rm -rf ')
    print('File format end...')


if __name__ == '__main__':
    yes_time = getYesterday()
    b_filename, a_filename, e_filename = getLog(yes_time, slow_query)
    getSlowquery(b_filename, a_filename, e_filename)
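For reference, here is a worked example of the naming convention the script uses (the date below is hypothetical, not a value from the original post). Note that the pt-query-digest call itself is commented out in the script, so the report at a_filename must already exist, or that line must be re-enabled, before the formatting step can run.

# Worked example of the file naming convention (the date is a made-up value).
yes_time = '2023-08-01'
IP = '111'

before_name = yes_time.replace('-', '') + '-' + 'slow-query.log'            # 20230801-slow-query.log (raw log in ./mysql_data/log)
after_name = yes_time.replace('-', '') + '-' + IP + '-' + 'slow-query.log'  # 20230801-111-slow-query.log (report in ./mysql_data/log/tmp)
end_name = IP + '-slow-log-' + yes_time + '.txt'                            # 111-slow-log-2023-08-01.txt (result in ./slow_query)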
The parsed output looks like this:
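The screenshot from the original post is not reproduced here. Based on the format strings in the script, each query block in the final text file would look roughly like the following; every value shown is a placeholder, not real output:

 NO.1:
执行次数: 120
执行时间 Total: 95s min: 200ms max: 3s
库名: testdb
源IP: 10.0.0.1
用户名: app_user
SELECT ... ;   (the query text itself is passed through unchanged)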