gplogfilter
Searches through HAWQ log files for specified entries.
Synopsis
gplogfilter [<timestamp_options>] [<pattern_matching_options>]
[<output_options>] [<input_options>]
gplogfilter --help
gplogfilter --version
where:
<timestamp_options> =
[-b <datetime> | --begin <datetime>]
[-e <datetime> | --end <datetime>]
[-d <time> | --duration <time>]
<pattern_matching_options> =
[-c i[gnore] | r[espect] | --case i[gnore] | r[espect]]
[-C '<string>' | --columns '<string>']
[-f '<string>' | --find '<string>']
[-F '<string>' | --nofind '<string>']
[-m <regex> | --match <regex>]
[-M <regex> | --nomatch <regex>]
[-t | --trouble]
<output_options> =
[-n <integer> | --tail <integer>]
[-s <offset> [<limit>] | --slice <offset> [<limit>]]
[-o <output_file> | --out <output_file>]
[-z <0..9> | --zip <0..9>]
[-a | --append]
<input_options> =
[<input_file> [-u | --unzip]]
Description
The gplogfilter utility can be used to search through a HAWQ log file for entries matching the specified criteria. To read from standard input, use a dash (-) as the input file name. Input files may be compressed using gzip. In an input file, a log entry is identified by its timestamp in YYYY-MM-DD [hh:mm[:ss]] format.
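For example, a gzip-compressed log can be streamed to gplogfilter on standard input using a dash as the input file name (the log file name here is only illustrative):
$ gunzip -c hawq-2016-09-01_134934.csv.gz | gplogfilter -t -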
You can also use gplogfilter to search through all segment log files at once by running it through the hawq ssh utility. For example, to display the last three lines of each segment log file:
$ hawq ssh -f seg_hostfile_hawqssh
=> source /usr/local/hawq/greenplum_path.sh
=> gplogfilter -n 3 /data/hawq-install-path/segmentdd/pg_log/hawq*.csv
By default, the output of gplogfilter is sent to standard output. Use the -o option to send the output to a file or a directory. If you supply an output file name ending in .gz, the output file is compressed by default using maximum compression. If the output destination is a directory, the output file is given the same name as the input file.
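For example, because the output file name below ends in .gz, the result is written with maximum compression (the paths are illustrative):
$ gplogfilter -t -o /tmp/hawq_errors.csv.gz /data/hawq/master/pg_log/hawq-2016-09-01_134934.csv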
Options
<input_options>
<input_file>
The name of the input log file(s) to search through. To read from standard input, use a dash (-) as the input file name.
-u | --unzip
Uncompress the input file using gunzip. If the input file name ends in .gz, it will be uncompressed by default.
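For example, a log compressed with gzip can be searched directly; the .gz suffix triggers automatic uncompression (the file name is illustrative):
$ gplogfilter -f 'FATAL' hawq-2016-09-01_134934.csv.gz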
<output_options>
-n <integer> | --tail <integer>
Limits the output to the last <integer> of qualifying log entries found.
-s <offset> [<limit>] | --slice <offset> [<limit>]
From the list of qualifying log entries, returns the <limit> number of entries starting at the <offset> entry number, where an <offset> of zero (0) denotes the first entry in the result set and an <offset> of any number greater than zero counts back from the end of the result set.
-o <output_file> | --out <output_file>
Writes the output to the specified file or directory location instead of STDOUT.
-z <0..9> | --zip <0..9>
Compresses the output file to the specified compression level using gzip, where 0 is no compression and 9 is maximum compression. If you supply an output file name ending in .gz, the output file will be compressed by default using maximum compression.
-a | --append
If the output file already exists, appends to the file instead of overwriting it.
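For example, to return the first ten qualifying entries from a log (an illustrative use of --slice; the file name is hypothetical):
$ gplogfilter -s 0 10 hawq-2016-09-01_134934.csv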
<pattern_matching_options>
-c i[gnore] | r[espect] | --case i[gnore] | r[espect]
Matching of alphanumeric characters is case sensitive by default unless preceded by the --case=ignore option.
-C '<string>' | --columns '<string>'
Selects specific columns from the log file. Specify the desired columns as a comma-delimited string of column numbers beginning with 1, where the second column from the left is 2, the third is 3, and so on.
-f '<string>' | --find '<string>'
Finds the log entries containing the specified string.
-F '<string>' | --nofind '<string>'
Rejects the log entries containing the specified string.
-m <regex> | --match <regex>
Finds the log entries matching the specified Python regular expression.
-M <regex> | --nomatch <regex>
Rejects the log entries matching the specified Python regular expression.
-t | --trouble
Finds only the log entries that have ERROR:, FATAL:, or PANIC: in the first line.
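For example, to keep entries containing con6 while rejecting those that also contain cmd11 (the file name is illustrative):
$ gplogfilter -f 'con6' -F 'cmd11' hawq-2016-09-01_134934.csv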
<timestamp_options>
-b <datetime> | --begin <datetime>
Specifies a starting date and time to begin searching for log entries, in the format of YYYY-MM-DD [hh:mm[:ss]].
If a time is specified, the date and time must be enclosed in either single or double quotes. This example encloses the date and time in single quotes:
$ gplogfilter -b '2016-02-13 14:23'
-e <datetime> | --end <datetime>
Specifies an ending date and time to stop searching for log entries, in the format of YYYY-MM-DD [hh:mm[:ss]].
If a time is specified, the date and time must be enclosed in either single or double quotes. This example encloses the date and time in single quotes:
$ gplogfilter -e '2016-02-13 14:23'
-d <time> | --duration <time>
Specifies a time duration to search for log entries, in the format of [hh][:mm[:ss]]. If used without either the -b or -e option, the current time is used as the basis.
Other Options
--help
Displays the online help.
--version
Displays the version of this utility.
Examples
Display the last three error messages in the identified log file:
$ gplogfilter -t -n 3 "/data/hawq/master/pg_log/hawq-2016-09-01_134934.csv"
Display the last five error messages in a date-specified log file:
$ gplogfilter -t -n 5 "/data/hawq-file-path/hawq-yyyy-mm-dd*.csv"
Display all log messages in the date-specified log file timestamped in the last 10 minutes:
$ gplogfilter -d :10 "/data/hawq-file-path/hawq-yyyy-mm-dd*.csv"
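Display log messages in the date-specified log file timestamped within the hour after a given start time (this combines the -b and -d options; the path is illustrative):
$ gplogfilter -b '2016-09-01 13:00' -d 1:00 "/data/hawq-file-path/hawq-yyyy-mm-dd*.csv"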
Display log messages in the identified log file containing the string |con6 cmd11|:
$ gplogfilter -f '|con6 cmd11|' "/data/hawq/master/pg_log/hawq-2016-09-01_134934.csv"
Using hawq ssh, run gplogfilter on the segment hosts to search the segment log files for messages containing the string con6, and save the output to a file:
$ hawq ssh -f seg_hostfile_hawqssh -e 'source /usr/local/hawq/greenplum_path.sh ;
gplogfilter -f con6 /data/hawq-2.x/pg_log/hawq*.csv' > seglog.out
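Display only the first column (the timestamp) of entries containing the string con6; column numbers depend on the log format, so '1' is shown only for illustration:
$ gplogfilter -f 'con6' -C '1' "/data/hawq/master/pg_log/hawq-2016-09-01_134934.csv"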