DataDog directory check - monitor multiple directories on one host

Is it possible to monitor two directories at the same time on one host?
I followed the Directory integration docs (datadog.com) and used the system.disk.directory configuration example for the initial setup.
One directory works fine, but I need to watch two directories.
init_config:

instances:
  - directory: /first/directory/path/
    filegauges: true
  # second one is ignored?
  - directory: /second/directory/path/
    filegauges: true
Is this even possible?

Yes, that will work fine with the syntax you're using. You can add as many instances to that array as you want.

Related

SymmetricDS : File sync on same filename to different path

I have two files with the same name that sync to different paths on the target, but only one file gets synced down. I noticed that in file_snapshot the CRC32_CHECKSUM is the same for both filenames. Is this the correct behavior?
There was a fix for this that will be released in 3.8.28 but here is a link to the patch for 3.8.27. You will also need two routers and two file routers (which you may already have).
https://support.jumpmind.com/downloads/symmetricds/patches/3.8/patch-0003197.jar

Multiple commands on Nagios remote host

I have a machine monitored by nagios that has the following line in nrpe.cfg
include_dir=/etc/nagios/cfgs/
The 'cfgs' dir consists of following files with various commands.
servers.cfg
database.cfg
regular.cfg
However, some commands are repeated among these cfg files but with different arguments. For example, the check_disk command uses one threshold in database.cfg and a different threshold in servers.cfg.
My question is: If there are multiple definitions of a command, which command is picked up by nrpe? Is there some way to identify it?
You generally cannot have duplicate command names across your nrpe.cfg files, though no error is thrown if duplicates exist. I've had this happen before (accidentally) with very large configs, and nrpe has always used the very LAST matching command it found. In my case, with a single config file, it was always easy to spot the last matching command. With many included files this gets harder; I believe include_dir loads the files in alphabetical order.
In any case, I'd pick more unique names for these duplicates, such as 'check_disk_db' for the one checking database disks. The names can be just about anything you want, but remember to change them on the Nagios side as well. Unique command names will make your life as a Nagios admin much easier.
As for figuring out which command is actually being used: you can turn on debugging. The debug output doesn't include the file a command came from, but it does show what options were passed and exactly which commands were executed by the check. That should be enough to see which nrpe command was selected.
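For illustration, the duplicated check could be split into uniquely named commands in the included cfg files (the plugin paths, thresholds, and mount points below are hypothetical):

```
# servers.cfg
command[check_disk_srv]=/usr/lib/nagios/plugins/check_disk -w 20% -c 10% /

# database.cfg
command[check_disk_db]=/usr/lib/nagios/plugins/check_disk -w 10% -c 5% /var/lib/mysql
```

The corresponding Nagios service definitions would then reference check_disk_srv or check_disk_db instead of a shared check_disk name.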

How do I (recursively?) monitor contents of new directories using inotify?

Firstly, I want to start by using inotify to monitor a specific directory (the main directory) for files and sub-directories. If a new directory is added into this main directory, how would I make sure to monitor this sub-directory with inotify? How would I monitor a new directory within this sub-directory of the main directory?
I think adding a directory to the watch list is easy with the inotify_add_watch() function, but I don't know how to get the correct relative path of files and directories inside sub-directories (for Dropbox-like syncing to a different location while maintaining the correct directory tree, for example).
The fastest approach to implement (though not the fastest at runtime) would be to:
1. Create the initial tree of directories by recursively exploring the children. An example in C/Linux can be found here:
http://www.lemoda.net/c/recursive-directory/
2. Add a watch for each subdirectory. When something is modified or changed, you can walk all children recursively and see the differences. Something similar was discussed here:
How to monitor a folder with all subfolders and files inside?
If this solution doesn't appeal to you, you could fall back to a polling mechanism: re-scan the whole structure from a thread at a fixed time interval.
Hope it helps!

How to process only the last file in a directory using Apache Camel's file component

I have a directory with files likes this:
inbox/
data.20130813T1921.json
data.20130818T0123.json
data.20130901T1342.json
I'm using Apache Camel 2.11 and on process start, I only want to process one file: the latest. The other files can actually be ignored. Alternatively, the older files can be deleted once a new file has been processed.
I'm configuring my component using the following, but it obviously doesn't do what I need:
file:inbox/?noop=true
noop does keep the last file, but also all other files. On startup, Camel processes all existing files, which is more than I need.
What is the best way to only process the latest file?
You can use sorting, sort by file name, and possibly reverse it so the latest file comes first or last; try it out to see which order you need. Then set maxMessagesPerPoll=1 to pick up only one file, and set eagerMaxMessagesPerPoll=false so the sort happens before the number of files is limited.
You can find details at: http://camel.apache.org/file2. See the section Sorting using sortBy for the sorting.
An alternative would be to still use sorting to ensure the latest file is last, then use the aggregator EIP to aggregate all the files with org.apache.camel.processor.aggregate.UseLatestAggregationStrategy as the aggregation strategy, so only the last (that is, the latest) file is kept. You can then set delete=true on the file endpoint to delete the files when done. You would also need to configure the aggregator with completionFromBatchConsumer=true.
The aggregator eip is documented here: http://camel.apache.org/aggregator2
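Putting the first suggestion together, the endpoint URI might look like this (a sketch, assuming the timestamped file names sort lexicographically, so reverse name order puts the newest file first):

```
file:inbox/?sortBy=reverse:file:name&maxMessagesPerPoll=1&eagerMaxMessagesPerPoll=false
```

With eagerMaxMessagesPerPoll=false the component sorts the whole directory listing first and only then applies the one-file limit, so the single file picked up is the newest one.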

Copying files multiple times to different destination folders

I want to deploy a plugin for Autodesk 3ds Max with an Inno Setup installer. If the end user has 3ds Max installed multiple times (in different folders, for different languages, since 3ds Max isn't multilingual), I want the setup to copy the plugin files to each of the 3ds Max folders, but without repeating the [Files] entry Source: "plugin.dll" DestDir: "..." . Can this be done programmatically with Pascal scripting, so the files are copied again and again with changing destination folders?
Why don't you want multiple [Files] entries? They will only be included in the setup once.
If you want to copy them in [Code], you can use the FileCopy() function in either the AfterInstall handler function for the file, or the CurStepChanged(ssPostInstall) event function.
Note that if you install them manually, you lose all automatic reference counting, registration, and uninstall handling, all of which you'll need to replicate in code.
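To illustrate the first suggestion: two plain [Files] entries (the install paths below are hypothetical) can share one source file, and the file data is stored in the setup package only once:

```
[Files]
Source: "plugin.dll"; DestDir: "{pf}\Autodesk\3ds Max 2019 EN\plugins"
Source: "plugin.dll"; DestDir: "{pf}\Autodesk\3ds Max 2019 DE\plugins"
```

Each entry gets normal reference counting and uninstall handling, which is exactly what a manual FileCopy() in [Code] would lose.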
