diff --git a/docs/configuration.md b/docs/configuration.md index 390e85b..53af10e 100644 --- a/docs/configuration.md +++ b/docs/configuration.md @@ -147,7 +147,7 @@ TA in Splunk. The administrator will need to ensure all recommended indexes be are not changed. It is understood that default values will need to be changed in many installations. To accommodate this, each filter consults -a lookup file that is mounted to the container (by default `/opt/sc4s/local/context/splunk_index.csv`) and is populated with +a lookup file that is mounted to the container (by default `/opt/sc4s/local/context/splunk_metadata.csv`) and is populated with defaults on the first run of SC4S after being set up according to the "getting started" runtime documents. This is a CSV file containing a "key" that is referenced in the log path for each data source. These keys are documented in the individual source files in this section, and allow one to override Splunk metadata either in whole or part. The use of this file is best @@ -158,7 +158,7 @@ page in this section: |------------------------|---------------------|----------------|---------------| | juniper_netscreen | netscreen:firewall | netfw | none | -Here is a snippet from the `splunk_indexes.csv` file: +Here is a snippet from the `splunk_metadata.csv` file: ```bash juniper_netscreen,index,ns_index @@ -185,7 +185,7 @@ In general, for most deployments the index should be the only change needed; oth never be overridden (particularly for the "Out of the Box" data sources). Even then, care should be taken when considering any alternates, as the defaults for SC4S were chosen with best practices in mind. -The `splunk_indexes.csv` file should also be appended to (with a "commented out" default for the index) when building custom SC4S log paths +The `splunk_metadata.csv` file should also be appended to (with a "commented out" default for the index) when building custom SC4S log paths (filters). 
Care should be taken during filter design to choose appropriate index, sourcetype and template defaults, so that admins are not compelled to override them. @@ -198,7 +198,7 @@ which maps to an associated lookup of alternate indexes, sources, or other metad added to further classify the data. * The `conf` and `csv` files referenced below will be populated into the `/opt/sc4s/local/context` directory when SC4S is run for the first -time after being set up according to the "getting started" runtime documents, in a similar fashion to `splunk_indexes.csv`. +time after being set up according to the "getting started" runtime documents, in a similar fashion to `splunk_metadata.csv`. After this first-time population of the files takes place, they can be edited (and SC4S restarted) for the changes to take effect. To get started: * Edit the file ``compliance_meta_by_source.conf`` to supply uniquely named filters to identify events subject to override. diff --git a/docs/gettingstarted/docker-swarm-general.md b/docs/gettingstarted/docker-swarm-general.md index dea3f12..b2f68ae 100644 --- a/docs/gettingstarted/docker-swarm-general.md +++ b/docs/gettingstarted/docker-swarm-general.md @@ -85,9 +85,8 @@ that are not provided out of the box in SC4S. To get you started, there is an e and a filter (`example.conf`) in the `log_paths` and `filters` subdirectories, respectively. These should _not_ be used directly, but copied as templates for your own log path development. They _will_ get overwritten at each SC4S start. - * In the `local/context` directory, if you change the "non-example" version of a file (e.g. 
`splunk_metadata.csv`) the changes +will be preserved on a restart. * Create the subdirectory ``/opt/sc4s/archive``. This will be used as a mount point for local storage of syslog events (if the optional mount is uncommented above). The events will be written in the syslog-ng EWMM format. See the "configuration" @@ -174,7 +173,7 @@ can be amended with additional ``target`` stanzas in the ``ports`` section of t Log paths are preconfigured to utilize a convention of index destinations that are suitable for most customers. * If changes need to be made to index destinations, navigate to the ``/opt/sc4s/local/context`` directory to start. -* Edit `splunk_index.csv` to review or change the index configuration and revise as required for the data sources utilized in your +* Edit `splunk_metadata.csv` to review or change the index configuration and revise as required for the data sources utilized in your environment. Simply uncomment the relevant line and enter the desired index. The "Sources" document details the specific entries in this table that pertain to the individual data source filters that are included with SC4S. * Other Splunk metadata (e.g. source and sourcetype) can be overridden via this file as well. This is an advanced topic, and further diff --git a/docs/gettingstarted/docker-swarm-rhel7.md b/docs/gettingstarted/docker-swarm-rhel7.md index 8c1d6f1..7288430 100644 --- a/docs/gettingstarted/docker-swarm-rhel7.md +++ b/docs/gettingstarted/docker-swarm-rhel7.md @@ -93,9 +93,8 @@ that are not provided out of the box in SC4S. To get you started, there is an e and a filter (`example.conf`) in the `log_paths` and `filters` subdirectories, respectively. These should _not_ be used directly, but copied as templates for your own log path development. They _will_ get overwritten at each SC4S start. - * In the `local/context` directory, if you change the "non-example" version of a file (e.g. `splunk_index.csv`) the changes -will be preserved on a restart. 
However, the "example" files _themselves_ (e.g. `splunk_index.csv.example`) will be updated -regularly, and should be used as a template to merge new/changed functionality into existing context files. + * In the `local/context` directory, if you change the "non-example" version of a file (e.g. `splunk_metadata.csv`) the changes +will be preserved on a restart. * Create the subdirectory ``/opt/sc4s/archive``. This will be used as a mount point for local storage of syslog events (if the optional mount is uncommented above). The events will be written in the syslog-ng EWMM format. See the "configuration" @@ -182,7 +181,7 @@ can be amended with additional ``target`` stanzas in the ``ports`` section of t Log paths are preconfigured to utilize a convention of index destinations that are suitable for most customers. * If changes need to be made to index destinations, navigate to the ``/opt/sc4s/local/context`` directory to start. -* Edit `splunk_index.csv` to review or change the index configuration and revise as required for the data sources utilized in your +* Edit `splunk_metadata.csv` to review or change the index configuration and revise as required for the data sources utilized in your environment. Simply uncomment the relevant line and enter the desired index. The "Sources" document details the specific entries in this table that pertain to the individual data source filters that are included with SC4S. * Other Splunk metadata (e.g. source and sourcetype) can be overridden via this file as well. This is an advanced topic, and further diff --git a/docs/gettingstarted/docker-systemd-general.md b/docs/gettingstarted/docker-systemd-general.md index 70f2128..8281efb 100644 --- a/docs/gettingstarted/docker-systemd-general.md +++ b/docs/gettingstarted/docker-systemd-general.md @@ -89,9 +89,8 @@ that are not provided out of the box in SC4S. To get you started, there is an e and a filter (`example.conf`) in the `log_paths` and `filters` subdirectories, respectively. 
These should _not_ be used directly, but copied as templates for your own log path development. They _will_ get overwritten at each SC4S start. - * In the `local/context` directory, if you change the "non-example" version of a file (e.g. `splunk_index.csv`) the changes -will be preserved on a restart. However, the "example" files _themselves_ (e.g. `splunk_index.csv.example`) will be updated -regularly, and should be used as a template to merge new/changed functionality into existing context files. + * In the `local/context` directory, if you change the "non-example" version of a file (e.g. `splunk_metadata.csv`) the changes +will be preserved on a restart. * Create the subdirectory ``/opt/sc4s/archive``. This will be used as a mount point for local storage of syslog events (if the optional mount is uncommented above). The events will be written in the syslog-ng EWMM format. See the "configuration" @@ -164,7 +163,7 @@ ExecStart=/usr/bin/docker run -p 514:514 -p 514:514/udp -p 6514:6514 -p 5000-502 Log paths are preconfigured to utilize a convention of index destinations that are suitable for most customers. * If changes need to be made to index destinations, navigate to the ``/opt/sc4s/local/context`` directory to start. -* Edit `splunk_index.csv` to review or change the index configuration and revise as required for the data sources utilized in your +* Edit `splunk_metadata.csv` to review or change the index configuration and revise as required for the data sources utilized in your environment. Simply uncomment the relevant line and enter the desired index. The "Sources" document details the specific entries in this table that pertain to the individual data source filters that are included with SC4S. * Other Splunk metadata (e.g. source and sourcetype) can be overridden via this file as well. 
This is an advanced topic, and further diff --git a/docs/gettingstarted/podman-systemd-general.md b/docs/gettingstarted/podman-systemd-general.md index fedde53..80f1429 100644 --- a/docs/gettingstarted/podman-systemd-general.md +++ b/docs/gettingstarted/podman-systemd-general.md @@ -108,9 +108,8 @@ that are not provided out of the box in SC4S. To get you started, there is an e and a filter (`example.conf`) in the `log_paths` and `filters` subdirectories, respectively. These should _not_ be used directly, but copied as templates for your own log path development. They _will_ get overwritten at each SC4S start. - * In the `local/context` directory, if you change the "non-example" version of a file (e.g. `splunk_index.csv`) the changes -will be preserved on a restart. However, the "example" files _themselves_ (e.g. `splunk_index.csv.example`) will be updated -regularly, and should be used as a template to merge new/changed functionality into existing context files. + * In the `local/context` directory, if you change the "non-example" version of a file (e.g. `splunk_metadata.csv`) the changes +will be preserved on a restart. * Create the subdirectory ``/opt/sc4s/archive``. This will be used as a mount point for local storage of syslog events (if the optional mount is uncommented above). The events will be written in the syslog-ng EWMM format. See the "configuration" @@ -183,7 +182,7 @@ ExecStart=/usr/bin/podman run -p 514:514 -p 514:514/udp -p 6514:6514 -p 5000-502 Log paths are preconfigured to utilize a convention of index destinations that are suitable for most customers. * If changes need to be made to index destinations, navigate to the ``/opt/sc4s/local/context`` directory to start. -* Edit `splunk_index.csv` to review or change the index configuration and revise as required for the data sources utilized in your +* Edit `splunk_metadata.csv` to review or change the index configuration and revise as required for the data sources utilized in your environment. 
Simply uncomment the relevant line and enter the desired index. The "Sources" document details the specific entries in this table that pertain to the individual data source filters that are included with SC4S. * Other Splunk metadata (e.g. source and sourcetype) can be overridden via this file as well. This is an advanced topic, and further diff --git a/docs/sources/Checkpoint/index.md b/docs/sources/Checkpoint/index.md index 3bb9027..ffe44c5 100644 --- a/docs/sources/Checkpoint/index.md +++ b/docs/sources/Checkpoint/index.md @@ -48,7 +48,7 @@ The Splunk `host` field will be derived as follows ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Follow vendor configuration steps per Product Manual above ### Options diff --git a/docs/sources/Cisco/index.md b/docs/sources/Cisco/index.md index 187f28d..53a22aa 100644 --- a/docs/sources/Cisco/index.md +++ b/docs/sources/Cisco/index.md @@ -128,7 +128,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. 
* Follow vendor configuration steps per Product Manual above ensure: * Log Level is 6 "Informational" * Protocol is TCP/IP @@ -200,7 +200,7 @@ Cisco Network Products of multiple types share common logging characteristics th ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * IOS Follow vendor configuration steps per Product Manual above ensure: * Ensure a reliable NTP server is set and synced * Log Level is 6 "Informational" @@ -315,7 +315,7 @@ IP, Netmask, Host or Port ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Follow vendor configuration steps per Product Manual above ### Options diff --git a/docs/sources/Citrix/index.md b/docs/sources/Citrix/index.md index 6bf7d4c..af78aa2 100644 --- a/docs/sources/Citrix/index.md +++ b/docs/sources/Citrix/index.md @@ -28,7 +28,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. 
+* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Follow vendor configuration steps per Product Manual above. Ensure the data format selected is "DDMMYYYY" ### Options diff --git a/docs/sources/Dell_RSA/index.md b/docs/sources/Dell_RSA/index.md index 5c34d1e..0070e8f 100644 --- a/docs/sources/Dell_RSA/index.md +++ b/docs/sources/Dell_RSA/index.md @@ -34,7 +34,7 @@ NOTE: Java trace and exception will default to sc4s:fallback if the host/ip filt ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Refer to the admin manual for specific details of configuration ### Options diff --git a/docs/sources/F5/index.md b/docs/sources/F5/index.md index b063a17..c80f847 100644 --- a/docs/sources/F5/index.md +++ b/docs/sources/F5/index.md @@ -41,7 +41,7 @@ Must be identified by host or ip assignment. Update the filter `f_f5_bigip` or c ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. 
* Refer to the admin manual for specific details of configuration ### Options diff --git a/docs/sources/Forcepoint/index.md b/docs/sources/Forcepoint/index.md index e5fdeff..61c4bcd 100644 --- a/docs/sources/Forcepoint/index.md +++ b/docs/sources/Forcepoint/index.md @@ -28,7 +28,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Refer to the admin manual for specific details of configuration to send Reliable syslog using RFC 3195 format, a typical logging configuration will include the following features. diff --git a/docs/sources/Fortinet/index.md b/docs/sources/Fortinet/index.md index 147d1d6..2b3d1c8 100644 --- a/docs/sources/Fortinet/index.md +++ b/docs/sources/Fortinet/index.md @@ -71,7 +71,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Refer to the admin manual for specific details of configuration to send Reliable syslog using RFC 3195 format, a typical logging configuration will include the following features. 
``` @@ -181,7 +181,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Refer to the admin manual for specific details of configuration to send Reliable syslog using RFC 3195 format, a typical logging configuration will include the following features. ``` diff --git a/docs/sources/InfoBlox/index.md b/docs/sources/InfoBlox/index.md index 627d8fc..61b52ba 100644 --- a/docs/sources/InfoBlox/index.md +++ b/docs/sources/InfoBlox/index.md @@ -35,7 +35,7 @@ Must be identified by host or ip assignment. Update the filter `f_infoblox` or c ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Refer to the admin manual for specific details of configuration ### Options diff --git a/docs/sources/Juniper/index.md b/docs/sources/Juniper/index.md index a94aedf..8110f6b 100644 --- a/docs/sources/Juniper/index.md +++ b/docs/sources/Juniper/index.md @@ -34,7 +34,7 @@ ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index as required. 
+* Review and update the splunk_metadata.csv file and set the index as required. * Follow vendor configuration steps per referenced Product Manual ### Options @@ -86,7 +86,7 @@ Verify timestamp, and host values match as expected ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index as required. +* Review and update the splunk_metadata.csv file and set the index as required. * Follow vendor configuration steps per Product Manual ### Options diff --git a/docs/sources/PaloaltoNetworks/index.md b/docs/sources/PaloaltoNetworks/index.md index 2fae016..71297ce 100644 --- a/docs/sources/PaloaltoNetworks/index.md +++ b/docs/sources/PaloaltoNetworks/index.md @@ -39,7 +39,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Refer to the admin manual for specific details of configuration * Select TCP or SSL transport option * Select IETF Format diff --git a/docs/sources/Pfsense/index.md b/docs/sources/Pfsense/index.md index 46e1af4..09e3fcb 100644 --- a/docs/sources/Pfsense/index.md +++ b/docs/sources/Pfsense/index.md @@ -33,7 +33,7 @@ Source does not provide a hostname, port or IP based filter is required ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. 
-* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Configure a dedicated SC4S port OR configure IP filter * Refer to the Splunk TA documentation for the specific customer format required for proxy configuration * Select TCP or SSL transport option diff --git a/docs/sources/Proofpoint/index.md b/docs/sources/Proofpoint/index.md index 1fac35c..f06407b 100644 --- a/docs/sources/Proofpoint/index.md +++ b/docs/sources/Proofpoint/index.md @@ -32,7 +32,7 @@ messages to create meaningful final output. This will require follow-on process ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Follow vendor configuration steps per referenced Product Manual ### Options diff --git a/docs/sources/Symantec/index.md b/docs/sources/Symantec/index.md index 797ee62..47ec0f1 100644 --- a/docs/sources/Symantec/index.md +++ b/docs/sources/Symantec/index.md @@ -83,7 +83,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. 
* Refer to the Splunk TA documentation for the specific customer format required for proxy configuration * Select TCP or SSL transport option * Ensure the format of the event is customized as follows @@ -138,7 +138,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * No TA available -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Refer to the Splunk TA documentation for the specific customer format required for proxy configuration * Select TCP or SSL transport option * Ensure the format of the event is customized per Splunk documentation diff --git a/docs/sources/Ubiquiti/index.md b/docs/sources/Ubiquiti/index.md index 1769377..e671337 100644 --- a/docs/sources/Ubiquiti/index.md +++ b/docs/sources/Ubiquiti/index.md @@ -52,7 +52,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. 
* Refer to the Splunk TA documentation for the specific customer format required for proxy configuration * Select TCP or SSL transport option * Ensure the format of the event is customized per Splunk documentation diff --git a/docs/sources/VMWare/index.md b/docs/sources/VMWare/index.md index fa7cd76..e4a820f 100644 --- a/docs/sources/VMWare/index.md +++ b/docs/sources/VMWare/index.md @@ -31,7 +31,7 @@ MSG Parse: This filter parses message content when using the default configurati ### Setup and Configuration -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Refer to the Splunk TA documentation for the specific customer format required for proxy configuration * Select TCP or SSL transport option * Ensure the format of the event is customized per Splunk documentation diff --git a/docs/sources/Zscaler/index.md b/docs/sources/Zscaler/index.md index c087e3b..504a531 100644 --- a/docs/sources/Zscaler/index.md +++ b/docs/sources/Zscaler/index.md @@ -41,7 +41,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. 
* Refer to the Splunk TA documentation for the specific customer format required for proxy configuration * Select TCP or SSL transport option * Ensure the format of the event is customized per Splunk documentation @@ -103,7 +103,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. * Refer to the Splunk TA documentation for the specific customer format required for proxy configuration * Select TCP or SSL transport option * Ensure the format of the event is customized per Splunk documentation diff --git a/docs/sources/nix/index.md b/docs/sources/nix/index.md index 7be21b0..47d42bd 100644 --- a/docs/sources/nix/index.md +++ b/docs/sources/nix/index.md @@ -36,7 +36,7 @@ MSG Parse: This filter parses message content ### Setup and Configuration * Install the Splunk Add-on on the search head(s) for the user communities interested in this data source. If SC4S is exclusively used the addon is not required on the indexer. -* Review and update the splunk_index.csv file and set the index and sourcetype as required for the data source. +* Review and update the splunk_metadata.csv file and set the index and sourcetype as required for the data source. 
### Options diff --git a/package/etc/conf.d/conflib/_splunk/splunk_context.conf b/package/etc/conf.d/conflib/_splunk/splunk_context.conf index 6fb181d..13a04bf 100644 --- a/package/etc/conf.d/conflib/_splunk/splunk_context.conf +++ b/package/etc/conf.d/conflib/_splunk/splunk_context.conf @@ -1,7 +1,7 @@ block parser p_add_context_splunk(key("syslogng-fallback")) { add-contextual-data( selector("`key`"), - database("conf.d/local/context/splunk_index.csv"), + database("conf.d/local/context/splunk_metadata.csv"), prefix(".splunk.") ); }; diff --git a/package/etc/context_templates/splunk_index.csv.example b/package/etc/context_templates/splunk_metadata.csv.example similarity index 100% rename from package/etc/context_templates/splunk_index.csv.example rename to package/etc/context_templates/splunk_metadata.csv.example diff --git a/package/sbin/entrypoint.sh b/package/sbin/entrypoint.sh index 0faf2db..0ae034e 100755 --- a/package/sbin/entrypoint.sh +++ b/package/sbin/entrypoint.sh @@ -47,15 +47,21 @@ mkdir -p /opt/syslog-ng/etc/conf.d/local/config/ cp /opt/syslog-ng/etc/context_templates/* /opt/syslog-ng/etc/conf.d/local/context for file in /opt/syslog-ng/etc/conf.d/local/context/*.example ; do cp --verbose -n $file ${file%.example}; done -#splunk_indexes.csv updates +#splunk_metadata.csv updates #Remove comment headers from existing config -touch /opt/syslog-ng/etc/conf.d/local/context/splunk_index.csv -sed -i 's/^#//' /opt/syslog-ng/etc/conf.d/local/context/splunk_index.csv +touch /opt/syslog-ng/etc/conf.d/local/context/splunk_metadata.csv +if [ -f /opt/syslog-ng/etc/conf.d/local/context/splunk_index.csv ]; then + LEGACY_SPLUNK_INDEX_FILE=/opt/syslog-ng/etc/conf.d/local/context/splunk_index.csv +fi +sed -i 's/^#//' /opt/syslog-ng/etc/conf.d/local/context/splunk_metadata.csv # Add new entries -awk '{print $0}' /opt/syslog-ng/etc/conf.d/local/context/splunk_index.csv /opt/syslog-ng/etc/conf.d/local/context/splunk_index.csv.example | sort -b -t ',' -k1,2 -u +awk '{print $0}' ${LEGACY_SPLUNK_INDEX_FILE} 
/opt/syslog-ng/etc/conf.d/local/context/splunk_metadata.csv /opt/syslog-ng/etc/conf.d/local/context/splunk_metadata.csv.example | sort -b -t ',' -k1,2 -u > /opt/syslog-ng/etc/conf.d/local/context/splunk_metadata.csv.tmp && mv /opt/syslog-ng/etc/conf.d/local/context/splunk_metadata.csv.tmp /opt/syslog-ng/etc/conf.d/local/context/splunk_metadata.csv #We don't need this file any longer -rm -f /opt/syslog-ng/etc/context_templates/splunk_index.csv.example - +rm -f /opt/syslog-ng/etc/context_templates/splunk_index.csv.example || true +rm -f /opt/syslog-ng/etc/context_templates/splunk_metadata.csv.example || true +if [ -f /opt/syslog-ng/etc/conf.d/local/context/splunk_index.csv ]; then + mv /opt/syslog-ng/etc/conf.d/local/context/splunk_index.csv /opt/syslog-ng/etc/conf.d/local/context/splunk_index.deprecated +fi cp --verbose -R /opt/syslog-ng/etc/local_config/* /opt/syslog-ng/etc/conf.d/local/config/ mkdir -p /opt/syslog-ng/var/log
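The `entrypoint.sh` merge above is terse, so here is a standalone sketch of what it is intended to do. The `ctx` directory, file contents, and `LEGACY_FILE` variable are stand-ins invented for illustration (the real script uses `/opt/syslog-ng/etc/conf.d/local/context`), and a GNU userland (`sed -i`, `sort`) is assumed. The key behaviors: commented-out defaults are uncommented so they participate in the merge, the first two CSV fields act as the uniqueness key for `sort -u`, a pre-rename `splunk_index.csv` entry takes precedence over the shipped default, and the merged result must be redirected back into `splunk_metadata.csv` to take effect.

```shell
#!/bin/sh
# Hypothetical sketch of the splunk_metadata.csv merge; "ctx" stands in for
# /opt/syslog-ng/etc/conf.d/local/context.
set -e
CTX=ctx
mkdir -p "$CTX"

# A legacy splunk_index.csv carrying a user override, and a shipped example
# file with a commented-out default plus a new key.
printf 'juniper_netscreen,index,ns_index_custom\n' > "$CTX/splunk_index.csv"
printf '#juniper_netscreen,index,netfw\nnix_syslog,index,osnix\n' > "$CTX/splunk_metadata.csv.example"

touch "$CTX/splunk_metadata.csv"
LEGACY_FILE=""
if [ -f "$CTX/splunk_index.csv" ]; then
    LEGACY_FILE="$CTX/splunk_index.csv"
fi

# Uncomment the shipped defaults so they take part in the merge.
sed -i 's/^#//' "$CTX/splunk_metadata.csv.example"

# Merge legacy entries, current entries, and shipped defaults. Fields 1-2
# (key, metadata type) form the sort key; GNU sort -u emits only the first
# line of each equal run, so the legacy entry listed first wins over the
# shipped default. LEGACY_FILE is intentionally unquoted so an empty value
# drops out of the argument list. Note the redirect: without it the merged
# output would only reach stdout.
awk '{print $0}' $LEGACY_FILE "$CTX/splunk_metadata.csv" "$CTX/splunk_metadata.csv.example" \
    | sort -b -t ',' -k1,2 -u > "$CTX/splunk_metadata.csv.tmp"
mv "$CTX/splunk_metadata.csv.tmp" "$CTX/splunk_metadata.csv"

cat "$CTX/splunk_metadata.csv"
```

The temporary-file-plus-`mv` step avoids a pitfall of redirecting `sort` straight into `splunk_metadata.csv`: the shell would truncate that file before `awk` reads it as an input.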