How To: Configure a custom log format

From burstekwiki

Using the LASE Custom Log Format Option

bt-LogAnalyzer SE™ v.1.61 now has the ability to configure custom log formats, allowing users to report on log file formats not previously supported by the software. To use this feature, the custom log format must first be defined and tested to ensure proper log file field mapping.

To define a custom log format:

  1. Navigate to the Settings | Properties menu item in the bt-LogAnalyzer SE™ web interface.

    Properties settings menu.png
  2. Select the Custom Log Formats tab.

    Note:

    Modifying the format of log files that use a custom log format, or modifying the custom log format mapping after data has been loaded, can cause invalid data to be loaded into the LogAnalyzer databases.
    CLF main.png

  3. Select the New Log Format button to configure and create your new custom log format.

    CLF format options.png

  4. Name and describe the new custom log source as desired, then configure the options needed (see descriptions below):

    Text qualifier – a string parameter used to wrap text containing special characters that would otherwise be treated as delimiters.

    For example, if no text qualifier is specified for a custom log source that uses a comma (,) as its column delimiter, the following record may not be loaded as desired: 2012-05-05,“Lorem ipsum dolor sit amet, consectetur adipiscing elit.”

    Result without a text qualifier specified (the embedded comma is treated as a column delimiter, splitting the quoted text across two columns): 2012-05-05 | “Lorem ipsum dolor sit amet | consectetur adipiscing elit.”

    Result with the quotation mark (“) specified as the text qualifier (the quoted text is loaded as a single column): 2012-05-05 | Lorem ipsum dolor sit amet, consectetur adipiscing elit.
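    The effect of the text qualifier can be illustrated with Python's standard csv module (a sketch of the general parsing behavior, not LASE's actual loader):

```python
import csv
from io import StringIO

record = '2012-05-05,"Lorem ipsum dolor sit amet, consectetur adipiscing elit."'

# Without a text qualifier the embedded comma acts as a column
# delimiter, splitting the quoted sentence across two columns.
no_qualifier = next(csv.reader(StringIO(record), delimiter=",",
                               quoting=csv.QUOTE_NONE))

# With the quotation mark configured as the text qualifier,
# the quoted sentence survives as a single column.
with_qualifier = next(csv.reader(StringIO(record), delimiter=",",
                                 quotechar='"'))

print(len(no_qualifier))    # 3 columns
print(len(with_qualifier))  # 2 columns
```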

    Has header – (check box) – Select this option if the log files contain header information.

    • Header rows count - (option) – the number of rows before the first log entry in the file.

    • Header rows prefix - (option) – an alternative to the Header rows count parameter. Instead of identifying header rows by a preset count, rows are identified as header rows when they begin with the configured string prefix.

    Has column names row – (check box): if selected, LASE can automatically detect column meanings from their names.
    • Column names row number - (option) – numeric value. LASE assumes that the row at this number (default is 1) contains the delimited column names. The user must still specify column meanings on the Column Mapping tab (via the auto-mapping feature, or manually).

    • Column names row prefix - (option) – an alternative to the Column names row number parameter. Instead of locating the column names row by number, LASE finds the first row that starts with this string value and reads the rest of the row as column names. Example value - “# Fields: ”

    • Column names delimiter – String parameter which specifies which delimiter will be used to separate column names in header row. Available options are:

      • Semicolon {;}

      • Colon {:}

      • Comma {,}

      • Tab {t}

      • Space {s}

      • Vertical Bar {|}
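    The column names row options above can be sketched in Python. This is an illustrative reading of the prefix-based detection, assuming a W3C-style "# Fields: " header and a space delimiter; it is not LASE's actual parser:

```python
# Hypothetical sketch: find the column-names row by its configured
# prefix and split the remainder on the column names delimiter.
COLUMN_NAMES_PREFIX = "# Fields: "

def read_column_names(lines, prefix=COLUMN_NAMES_PREFIX, delimiter=" "):
    """Return column names from the first row bearing the prefix, else None."""
    for line in lines:
        if line.startswith(prefix):
            return line[len(prefix):].strip().split(delimiter)
    return None

log_header = [
    "#Software: sample proxy",
    "# Fields: date time c-ip cs-username sc-status cs-uri",
]
print(read_column_names(log_header))
# ['date', 'time', 'c-ip', 'cs-username', 'sc-status', 'cs-uri']
```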

    Row delimiter – determines which symbol is used to split rows in the log file. Available options are:
    • {CR}{LF}

    • {CR}

    • {LF}

    Column delimiter – determines which symbol is used to split columns in the log file. Available options are:
    • Semicolon {;}

    • Colon {:}

    • Comma {,}

    • Tab {t}

    • Space {s}

    • Vertical Bar {|}
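    Together, the row and column delimiters determine how raw log data is sliced into records. The sketch below assumes a {CR}{LF} row delimiter and a Tab {t} column delimiter; it illustrates the splitting behavior only, not LASE's internal implementation:

```python
# Sample raw log data: rows separated by CRLF, columns by tabs.
raw = "2012-05-05\t10.0.0.1\t200\r\n2012-05-06\t10.0.0.2\t404\r\n"

ROW_DELIMITER = "\r\n"   # {CR}{LF}
COLUMN_DELIMITER = "\t"  # Tab {t}

# Split into rows, drop the trailing empty string, then split columns.
rows = [r for r in raw.split(ROW_DELIMITER) if r]
records = [row.split(COLUMN_DELIMITER) for row in rows]

print(records)
# [['2012-05-05', '10.0.0.1', '200'], ['2012-05-06', '10.0.0.2', '404']]
```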

  5. Once the Format Options tab has been configured, select the Column Mapping tab and configure the correct column mapping for the custom log format.

    • If the Has header and Has column names row checkboxes were selected on the Format Options tab, then the Auto mapping feature will be available. This feature allows a user to have LogAnalyzer automatically suggest the column types by testing a valid log data file.

      • Included below is a table listing the header column names that will auto-map to their respective fields in LASE:

    Column name in log data file header | Field in LASE
    ClientIP, src_ip, %Ses->client.ip%, src, c-ip | Client IP
    ClientUserName, src_user, %Req->vars.pauth-user%, user, usr, cs-username | Client username
    logDate, date | Date
    logTime, time | Time
    GmtLogTime, [%SYSDATE%] | Date and Time
    Processingtime, %Req->vars.xfer-time%, time-taken | Processing time
    Bytessent, sent_bytes, %Req->vars.p2r-cl%, sent, cs-bytes | Bytes sent
    Bytesrecvd, rcvd_bytes, %Req->vars.p2c-cl%, %Req->headers.content-length%, %Req->vars.r2p-cl%, rcvd, sc-bytes | Bytes received
    protocol, proto, cs-protocol | Protocol name
    Operation, s-operation | Operation name
    uri, %Req->reqpb.uri%, cs-uri, cs-url | Object name
    "%Req->reqpb.proxy-request%" | Proxy request
    mimetype, cs-mime-type | Object mime
    Resultcode, rc, %Req->srvhdrs.clf-status%, result, sc-http-status, sc-status | Result code
    s-hostname, cs-host, dstname | Uri host
    cs-uri-stem | Uri stem
    cs-uri-path | Uri path
    cs-uri-query, arg | Uri query
    cs-uri-scheme | Uri scheme

    Note:

    The Object name field is synonymous with URL, which is a valid field in LASE. URL is not included in the table above because there is no auto-mapping pointer for that field. Given the correct header column name, the Object name LASE field will be mapped for a column containing a full URL.
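    The auto-mapping table above amounts to a lookup from known header names to LASE fields. The sketch below models that idea with a small subset of the table; the case-insensitive matching and the "Ignore" fallback for unrecognized names are assumptions for illustration, not documented LASE behavior:

```python
# Subset of the auto-mapping table: header column name -> LASE field.
AUTO_MAP = {
    "c-ip": "Client IP",
    "cs-username": "Client username",
    "date": "Date",
    "time": "Time",
    "sc-status": "Result code",
    "cs-uri": "Object name",
}

def auto_map(column_names):
    """Suggest a LASE field for each header column name."""
    return [AUTO_MAP.get(name.lower(), "Ignore") for name in column_names]

print(auto_map(["date", "time", "c-ip", "sc-status", "cs-uri"]))
# ['Date', 'Time', 'Client IP', 'Result code', 'Object name']
```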
    • To run the auto mapping feature:

      1. Select the log file to be tested by clicking the Browse... button, then selecting the desired file, and clicking OK.

      2. Once the sample log file has been selected, click the Determine button. The Column mapping section should then display the resulting columns of the auto map feature.

    CLF column mapping auto man.png

    Note:

    The auto-mapping results can be modified by manually configuring the column mapping after the auto-map feature has been used.
    • If the Auto-mapping feature will not be used, the log file columns can be manually configured. To manually configure the column mapping:

      1. Click the add... button, located under the Column mapping section.

        CLF log format properties field selection.png

      2. Select the necessary column type which is included in the log file, then click OK.

      3. Add the additional columns until all the log file columns have been added to the Column mapping section.

      4. The position of a column mapping can be changed by selecting the desired column items, then clicking the Move Up or Move Down arrow buttons.

    CLF Column Mapping.png

    Note:

    There are 4-6 required fields for any log file (excluding Exchange logs) that LogAnalyzer can successfully report on. The minimum required fields (in no particular order) are:
    • Client IP
    • Date and Time, OR
      • Date AND Time (if date and time are not combined into a single field, two separate fields are required for accurate reporting)
    • Result Code
    • URL (or Object name), OR
      • Uri host AND Uri path AND Uri query, OR
      • Uri scheme AND Uri stem AND Uri query
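    The minimum-field rule above can be expressed as a simple check. This is a hedged sketch of the rule as stated in the note, using the LASE field names from the mapping table; it is not part of the product itself:

```python
def has_required_fields(mapped):
    """Return True if a column mapping meets the minimum-field rule."""
    fields = set(mapped)
    # Date/time: either a combined field, or separate Date and Time fields.
    has_datetime = "Date and Time" in fields or {"Date", "Time"} <= fields
    # URL: a full URL/Object name, or one of the two component sets.
    has_url = (bool(fields & {"URL", "Object name"})
               or {"Uri host", "Uri path", "Uri query"} <= fields
               or {"Uri scheme", "Uri stem", "Uri query"} <= fields)
    return ("Client IP" in fields and has_datetime
            and "Result code" in fields and has_url)

print(has_required_fields(
    ["Client IP", "Date", "Time", "Result code", "URL"]))  # True
print(has_required_fields(
    ["Client IP", "Date", "Result code", "URL"]))          # False (Time missing)
```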
  6. Once the column mapping has been configured for the custom log format, navigate to the Preview tab to verify the log format configuration. To test a log file against the newly configured custom log format:

      1. Select your log file by using the Browse... button, then click OK.
      2. Once the desired log file has been selected, click the Preview button.

      CLF Preview.png

      3. Verify that the correct content is included in each column, and that the content is complete.
      4. If the log file column data is mapped correctly, click OK to save and close the custom log format properties page.

    Once the Custom Log Format has been properly configured and tested, it can then be used as a log format for the related log source. The custom log format can be found in the Log files format: drop-down option of the log source properties page.
    LASE Custom Log Format Selection.png
