You are reading Edgio v6 docs. Check out our latest docs for Edgio v7.

Azure Blob Storage Log Delivery

RTLD may automatically deliver compressed log data to an Azure Blob Storage container by submitting HTTPS PUT requests to it. Each request creates a block blob within the container. This block blob contains a compressed JSON or CSV document that uniquely identifies a set of log data and describes one or more log entries.

Key information:

  • The set of available log fields varies by RTLD module: RTLD CDN | RTLD Rate Limiting | RTLD WAF

  • RTLD applies gzip compression to log data. Azure Blob Storage stores compressed log data as an object with a gz file extension.

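As a sketch of what gzip-compressed log delivery looks like on the consuming side, the following Python snippet builds a small JSON log document, compresses it, and names the resulting object with a .gz extension. The log fields and file name below are illustrative placeholders, not RTLD's actual schema.

```python
import gzip
import json

# Hypothetical log entries; field names are illustrative only.
log_entries = [
    {"timestamp": "2022-01-11T10:00:00Z", "status_code": 200, "host": "cdn.example.com"},
    {"timestamp": "2022-01-11T10:00:01Z", "status_code": 404, "host": "cdn.example.com"},
]

payload = json.dumps({"logs": log_entries}).encode("utf-8")
compressed = gzip.compress(payload)

# The uploaded object carries a gz file extension.
object_name = "siteA_wpc_0001_123_20220111_50550000F98AB95B_1.json.gz"

# Round-trip check: decompressing recovers the original document.
assert json.loads(gzip.decompress(compressed)) == {"logs": log_entries}
```

This mirrors what a downstream consumer must do with each delivered object: strip the gz compression before parsing the JSON or CSV document inside.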

  • Setting up log delivery to Azure Blob Storage requires:

    • An existing Azure Blob Storage account.


    • A container to which log data will be uploaded.

    • A base URL that points to your container.

      Blob Container URL: https://<STORAGE ACCOUNT>.blob.core.windows.net/<CONTAINER>

      Sample Blob Container URL: https://myaccount.blob.core.windows.net/mycontainer

    • Either a SAS token or an access key through which our service will authorize requests to upload content to your Azure Blob Storage account.

      If you plan on providing a SAS token, make sure that the token has permission to write to the blob/container. Additionally, it should start with sv= and it should not include a ?.

      Sample SAS token:

      sv=2018-03-28&sr=c&si=myblobReadWritekey1_123456789012345678&sig=a1bCDefghijklMnOpqrsTuv2wXYzABc3d34efGHIjkL%5M
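The two formatting rules above (the token must start with sv= and must not contain a ?) can be checked with a trivial helper before you paste the token into a profile. This is just a sanity check on the string's shape, not a validation of the token's signature or permissions.

```python
def looks_like_valid_sas_token(token: str) -> bool:
    """Check the two formatting rules for SAS tokens:
    the token must start with 'sv=' and must not contain '?'."""
    return token.startswith("sv=") and "?" not in token

sample = ("sv=2018-03-28&sr=c&si=myblobReadWritekey1_123456789012345678"
          "&sig=a1bCDefghijklMnOpqrsTuv2wXYzABc3d34efGHIjkL%5M")

assert looks_like_valid_sas_token(sample)
# A token copied with a leading '?' (as it appears in a URL) fails the check.
assert not looks_like_valid_sas_token("?sv=2018-03-28&sr=c")
```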

  • You may define a prefix when setting up a log delivery profile. This prefix defines a virtual log file storage location and/or a prefix that will be prepended to the name of each object added to your container. Use the following guidelines when setting this prefix:

    • A prefix should not start with a forward slash.
    • A forward slash within the specified prefix is interpreted as a delimiter for a virtual directory.
    • A trailing forward slash means that the specified value only defines a virtual directory path within your container where logs will be stored. If the specified value ends in a character other than a forward slash, then the characters after the last forward slash will be prepended to the file name of each log file uploaded to your destination.

    Sample prefix: logs/CDN/siteA_

    The above prefix will store log files in the following virtual directory: /logs/CDN

    The file name for each log file uploaded to your destination will start with siteA_.

    Sample log file name: siteA_wpc_0001_123_20220111_50550000F98AB95B_1.json
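The prefix rules above reduce to a split on the last forward slash: everything before it is a virtual directory path, and anything after it becomes a file name prefix. The helper below is a sketch of that interpretation, not part of any RTLD API.

```python
def split_prefix(prefix: str):
    """Interpret a log delivery profile prefix:
    everything up to the last '/' is a virtual directory path;
    anything after it is prepended to each log file's name."""
    if prefix.startswith("/"):
        raise ValueError("A prefix should not start with a forward slash.")
    directory, _, name_prefix = prefix.rpartition("/")
    return directory, name_prefix

# The sample prefix from above: logs stored under logs/CDN,
# file names starting with siteA_.
assert split_prefix("logs/CDN/siteA_") == ("logs/CDN", "siteA_")

# A trailing slash defines a virtual directory only.
assert split_prefix("logs/CDN/") == ("logs/CDN", "")

# No slash at all: the whole value is a file name prefix.
assert split_prefix("siteA_") == ("", "siteA_")
```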

To prepare for log delivery

  1. Create or identify an Azure storage account and a container to which log data will be posted.

    View Microsoft Azure documentation on how to create a storage account.

  2. Identify or configure how requests submitted by RTLD will be authorized.

    RTLD supports authorization through a SAS token or an access key.

    If you plan on providing a SAS token, make sure that the token has permission to write to the blob/container. Additionally, it should start with sv= and it should not include a ?.

  3. Upon completing the above steps, create a log delivery profile for Azure Blob Storage.
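RTLD performs the uploads itself; however, as a rough illustration of the kind of request involved, the sketch below constructs (without sending) an Azure Put Blob request in which the SAS token is appended as the query string and the x-ms-blob-type: BlockBlob header marks the upload as a block blob. The account, container, blob name, and token are placeholders.

```python
from urllib.request import Request

container_url = "https://myaccount.blob.core.windows.net/mycontainer"  # sample values
sas_token = "sv=2018-03-28&sr=c&sig=EXAMPLE"  # placeholder, not a real signature
blob_name = "logs/CDN/siteA_example.json.gz"

# Azure's Put Blob REST operation: a PUT to <container URL>/<blob name>,
# authorized by the SAS token in the query string, with the block blob
# type declared via the x-ms-blob-type header.
request = Request(
    url=f"{container_url}/{blob_name}?{sas_token}",
    data=b"...gzip-compressed log document...",
    method="PUT",
    headers={
        "x-ms-blob-type": "BlockBlob",
        "Content-Type": "application/octet-stream",
    },
)

# urllib.request.urlopen(request) would perform the upload; omitted here.
```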

To set up a log delivery profile

  1. From the Real-Time Log Delivery CDN page, click + New Log Delivery Profile.

    1. Open the desired property.

      1. Select either your private space or a team space.
      2. Click on the desired property.
    2. From the left pane, click on the desired environment.

    3. From the left pane, click Realtime Log Delivery.

  2. From the Profile Name option, assign a name to this log delivery profile.

  3. From the Log Delivery Method option, select Azure Blob Storage.

  4. Define how RTLD will communicate with Azure Blob Storage.

    1. Set the Blob Container URL option to a URL that points to the container to which log data will be posted.

    2. Optional. Set the Prefix option to a value that defines a virtual log file storage location and/or a prefix that will be added to each log file added to your container.


    3. From the Access Type option, select whether log data uploads will be authorized via a SAS token or an access key, and then paste the corresponding token or key in the field below it.

      If you plan on providing a SAS token, make sure that the token has permission to write to the blob/container. Additionally, it should start with sv= and it should not include a ?.

  5. From the Log Format option, select whether to format log data using our standard JSON format, as a JSON array, as JSON lines, or as a CSV (RTLD CDN only).

    Learn more: RTLD CDN | RTLD Rate Limiting | RTLD WAF
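The difference between these formats is in how the same entries are serialized. The snippet below contrasts a JSON array, JSON lines (one object per line), and CSV using made-up fields; the actual field sets and the standard JSON envelope vary by RTLD module, as described in the references above.

```python
import json

# Illustrative entries only; real field sets vary by RTLD module.
entries = [{"status_code": 200}, {"status_code": 404}]

# JSON array: one document containing all entries.
as_array = json.dumps(entries)

# JSON lines: one self-contained JSON object per line.
as_lines = "\n".join(json.dumps(e) for e in entries)

# CSV (RTLD CDN only): header row followed by one row per entry.
as_csv = "status_code\n" + "\n".join(str(e["status_code"]) for e in entries)

assert json.loads(as_array) == entries
assert [json.loads(line) for line in as_lines.splitlines()] == entries
```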

  6. From the Downsample the Logs option, determine whether to reduce the amount of log data that will be delivered. For example, you may choose to only deliver 1% of your log data.

    • All Log Data: Verify that the Downsample the Logs option is cleared.

    • Downsampled Log Data: Downsample logs to 0.1%, 1%, 25%, 50%, or 75% of total log data by enabling the Downsample the Logs option and then selecting the desired rate from the Downsampling Rate option.

      Use this capability to reduce the amount of log data that must be processed or stored at your destination.

      RTLD CDN Only: Downsampling log data also reduces usage charges for this service.
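One way to picture downsampling is deterministic sampling by request identifier: hash an identifier to a uniform value and keep the entry when that value falls below the chosen rate. This is a sketch of the general idea, not RTLD's internal algorithm.

```python
import hashlib

def keep_entry(entry_id: str, rate: float) -> bool:
    """Deterministically keep roughly `rate` of entries by hashing an
    identifier into [0, 1) and comparing against the rate."""
    digest = hashlib.sha256(entry_id.encode()).digest()
    bucket = int.from_bytes(digest[:4], "big") / 2**32  # uniform in [0, 1)
    return bucket < rate

# At a 1% rate, roughly 1,000 of 100,000 requests are kept.
kept = sum(keep_entry(f"request-{i}", 0.01) for i in range(100_000))
assert 700 < kept < 1300
```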

  7. Determine whether log data will be filtered. Filtering options are described in the Filtering Log Data section below.

  8. From within the Fields section, adjust the set of log fields that will be included within this log delivery profile. By default, all log fields are enabled on a new log delivery profile. Clear each field for which log data should not be reported.

    Log fields are categorized. You may add or remove individual fields by expanding a category and then marking or clearing specific log fields. Alternatively, add or remove all of the log fields associated with a category by marking or clearing the desired category.

    RTLD CDN Only: You may also log request headers, response headers, and cookies by adding them through the Custom Request Headers, Custom Response Headers, and Custom Cookies options.

    You may either select the name of the desired header or cookie, or type its name and then press ENTER. Click on the list to add additional headers or cookies. Remove a header or cookie by clicking on its x.

    Although other settings take effect quickly, it may take up to 90 minutes before data for custom request/response headers and cookies is logged.


  9. Click Create Log Delivery Profile.

Filtering Log Data

Filter log data to only include relevant information and to reduce the amount of data being ingested. Filtering options vary by RTLD module.

An alternative method for reducing the amount of log data sent to your destination is downsampling. However, downsampling log data is indiscriminate, while filtering allows you to target the set of traffic that is most relevant to your business needs.

Filtering RTLD CDN Log Data

You may filter by:

  • Hostname: Filter log data to only include traffic directed to the desired hostname(s). Set up hostname filtering within the Filter by Hostname section.

    • Filter log data by one or more hostnames:

      1. Determine whether log data will be filtered to include or exclude requests to the selected hostname(s) by selecting either Matches or Does Not Match, respectively.
      2. Click within the Hostnames option and select the desired hostname(s).

      Filter the list by typing the entire or partial hostname. For example, typing co will filter the list to include all hostnames that contain co (e.g., cdn.example.com and corp.example.org).

    • Upload all log data regardless of hostname: Verify that a hostname has not been defined within the Hostnames option.

      Remove a hostname by clicking on its x.
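The Matches / Does Not Match choice above amounts to an include or exclude filter over the selected hostnames. The helper below is a sketch of that behavior with made-up entries, not part of any RTLD API.

```python
def filter_by_hostname(entries, hostnames, mode="Matches"):
    """Keep entries whose host is (or is not) among the selected
    hostnames, mirroring the Matches / Does Not Match options."""
    selected = set(hostnames)
    if mode == "Matches":
        return [e for e in entries if e["host"] in selected]
    return [e for e in entries if e["host"] not in selected]

entries = [{"host": "cdn.example.com"}, {"host": "corp.example.org"}]

assert filter_by_hostname(entries, ["cdn.example.com"]) == [
    {"host": "cdn.example.com"}
]
assert filter_by_hostname(entries, ["cdn.example.com"], mode="Does Not Match") == [
    {"host": "corp.example.org"}
]
```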

  • User Agent: Filter log data to only include traffic that was requested by a client whose user agent matches an RE2-compatible regular expression pattern. Set up user agent filtering within the Filter by User Agent option.

    • Filter log data by user agent: Type an RE2-compatible regular expression pattern that identifies the set of user agents by which log data will be filtered.

    • Upload all log data regardless of user agent: Leave the Filter by User Agent option blank.
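A simple alternation such as the pattern below is valid in both RE2 and Python's re module (note that Python's re is not RE2; RE2 omits features like backreferences, so keep patterns to the shared subset when testing locally). The user agent strings are illustrative.

```python
import re

# Match traffic from curl or python-requests clients; this simple
# alternation is accepted by both RE2 and Python's re module.
pattern = re.compile(r"curl/.*|python-requests/.*")

assert pattern.fullmatch("curl/7.88.1")
assert pattern.fullmatch("python-requests/2.31.0")
assert not pattern.fullmatch("Mozilla/5.0")
```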

  • Status Code: Filter log data to only include traffic for specific status code(s). Set up status code filtering within the Filter by Status Code section.

    • Filter log data by status code: Select each status code class (e.g., 2xx or 3xx) for which log data will be delivered.

    • Upload all log data regardless of status code: Verify that a status code class (e.g., 2xx and 3xx) has not been defined within this option.

      Remove a status code class by clicking on its x.
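Status code class filtering groups responses by their first digit (e.g., 200 and 204 both fall under 2xx). The sketch below shows how selecting the 2xx and 3xx classes would narrow a set of illustrative status codes.

```python
def status_class(code: int) -> str:
    """Map a status code to its class, e.g. 200 -> '2xx'."""
    return f"{code // 100}xx"

# Deliver log data only for the 2xx and 3xx classes.
selected_classes = {"2xx", "3xx"}
codes = [200, 301, 404, 503]
delivered = [c for c in codes if status_class(c) in selected_classes]

assert delivered == [200, 301]
```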