Using CSV Log Output
Prerequisites
- The log_destination parameter is set to csvlog.
- The logging_collector parameter is set to on.
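As a sketch, these prerequisites correspond to the following entries in the server configuration file (typically postgresql.conf in the data directory; a server restart is usually required for a change to logging_collector to take effect):

log_destination = 'csvlog'    # write server log lines in CSV format
logging_collector = on        # capture log output and write it to files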
Definition of csvlog
Log lines are emitted in comma-separated values (CSV) format.
An example table definition for storing CSV-format log output is shown as follows:
CREATE TABLE gaussdb_log
(
log_time timestamp(3) with time zone,
node_name text,
user_name text,
database_name text,
process_id bigint,
connection_from text,
session_id text,
session_line_num bigint,
command_tag text,
session_start_time timestamp with time zone,
virtual_transaction_id text,
transaction_id bigint,
query_id bigint,
module text,
error_severity text,
sql_state_code text,
message text,
detail text,
hint text,
internal_query text,
internal_query_pos integer,
context text,
query text,
query_pos integer,
location text,
application_name text
);
For details, see Table 1.
Table 1 Meaning of each csvlog field
Field | Meaning
internal_query | Internal query that led to the error, if any.
location | Position in the openGauss source code where the error occurred, reported when log_error_verbosity is set to verbose.
Run the following command to import a log file to this table:
COPY gaussdb_log FROM '/opt/data/pg_log/logfile.csv' WITH csv;
NOTE: Replace the log file name (logfile.csv) with the name of a log file that has actually been generated.
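Once the file is imported, the log can be queried with ordinary SQL. For example, the following sketch (assuming the gaussdb_log table above has been populated) counts the log entries of each severity recorded in the last day:

SELECT error_severity, count(*) AS entries
FROM gaussdb_log
WHERE log_time > now() - interval '1 day'
GROUP BY error_severity
ORDER BY entries DESC;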
Simplifying Input
Simplify importing CSV log files by performing the following operations:
- Set log_filename and log_rotation_age to provide a consistent, predictable naming scheme for log files. This lets you predict when an individual log file is complete and ready to be imported.
- Set log_rotation_size to 0 to disable size-based log rotation, because size-based rotation makes log file names difficult to predict.
- Set log_truncate_on_rotation to on so that old log data is not mixed with new data in the same file.
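Taken together, the settings above might look like this in postgresql.conf (a sketch; the filename pattern and rotation interval are example values to be adjusted for your environment):

log_filename = 'logfile-%Y-%m-%d.log'   # predictable daily name; the CSV file is written with a .csv suffix
log_rotation_age = 1d                   # rotate once per day
log_rotation_size = 0                   # disable size-based rotation
log_truncate_on_rotation = on           # overwrite old data on rotation instead of appending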