How to use Cacti to monitor remote hosts - Nagios

I have Nagios installed on a server and it's monitoring different remote hosts using different plugins, but I am not able to view each system's performance data in graph form. Is it possible to use Cacti for that purpose? I just installed Cacti on the same machine, but I'm not sure how to install plugins and monitor different servers. I'd also like to know whether I can use Cacti as the frontend for Nagios, and how Cacti works in general.
Can someone help me with this, please?
Thanks

I'm not sure how Cacti interacts with Nagios, but I do have the pnp4nagios plugin/extension installed and configured for one of my Nagios instances, and it gives me a great graph-based overview of the services I monitor (not all of them, only those that are variable and useful to see in a graph). It's a really nice tool and not hard to set up. I compiled it from source, and its install.php gives you great feedback on what to do next in the installation procedure. One thing they didn't mention was that you have to enable Includes in your Nagios instance's apache2 config file. (This is necessary if you want to use the SSI include in the Nagios CGI files; the SSI file contains jQuery JavaScript definitions that enable the pop-up PNG graphs when you hover over a graph in Nagios.)
It also uses RRDtool (Round Robin Database) files, which use fixed-size storage (which can be beneficial if you have little space on your hard drive).

For Nagios there is nagiosgraph, which generates a graph for each service defined in Nagios; you just have to add the nagiosgraph configuration.
As for Cacti, there is a plugin called NPC, which adds a new tab in Cacti containing the services defined in Nagios.

Related

Archer GRC Automated Deployments

I am trying to figure out whether automated deployments are possible for the on-prem version of Archer GRC. Currently it is deployed manually.
The latest versions of Archer (v6.8, v6.9) provide a limited API that allows package deployments, BUT last time I checked they don't allow mapping and partial installs (I could be wrong, so double-check).
The API is there, but its functionality is limited to the point that I don't see how package installation can be automated through it. I hope that in future Archer versions it will be extended to replicate the functionality available with manual package deployment (mapping, partial installs, and other options).
Technically, if you like complex and time-consuming tasks, you can decode/parse the package installation page and then write an application that simulates the HTTP requests sent to the Archer server during a package installation.
I'm not aware of any company doing something like this as of today.
If you write a product to implement proper Code/Configuration Version Control for RSA Archer, then you may be able to sell it as well :)
Good luck!
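If you do go down the HTTP-simulation route described above, the skeleton might look something like the sketch below. This is purely illustrative: the login and package-installation URLs, form fields, file name, and credentials are all hypothetical and would have to be reverse-engineered from your own Archer instance (for example by capturing a manual package install with your browser's developer tools).

    import requests

    BASE_URL = "https://archer.example.com"  # placeholder server

    session = requests.Session()

    # Hypothetical login form; capture the real fields/URLs with your browser's dev tools.
    session.post(f"{BASE_URL}/Default.aspx", data={
        "username": "admin",
        "password": "secret",
    })

    # Hypothetical package-installation request, replayed from a captured manual install.
    with open("MyPackage.zip", "rb") as pkg:
        response = session.post(
            f"{BASE_URL}/PackageInstall.aspx",   # placeholder endpoint
            files={"package": pkg},
            data={"installMode": "full"},        # placeholder form field
        )
    print(response.status_code)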

Is PAA a good candidate for automating WCM library deployment and setup in Portal?

I have created a Web Content Management library for use in WebSphere Portal. At the moment I'm using import-wcm-data to import the library; then I need to add some additional properties to 2-3 files on the server under Resource Environment Providers and restart particular services so those changes are picked up.
Can anyone explain the benefits of using a PAA over writing a simple bash (or similar) script to automate this process?
I don't see what advantages I get from using a PAA; is a PAA even capable of updating properties files and restarting services?
I have been working intensively with PAA files, and I must say it is a very stable way of deploying an app that requires multiple deployment steps and components.
It does need a startup process, but it is well worth it in a multi-server environment.
You can do all the tasks that you can do in an Ant file, as well as use the wsadmin scripting interface. I only update resource environment settings and the like in WAS, and I do not touch any properties files, for the reason that all settings are stored in WAS.
In my experience, a PAA is not a good method if you're merely importing a content library.
I don't understand why you are doing the import manually rather than syndicating, but even if there's a good reason not to syndicate, the PAA process was too involved and required too many precursor actions (deleting libraries, removing the PAA, deploying the PAA and then activating the portlets) to be a viable option for something as simple as importing a WCM library.
Since activating the portlets I was importing with the PAA was an extra step, I don't believe you can restart applications either.
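For comparison, the "simple script" route mentioned in the question might look roughly like the sketch below (written in Python rather than bash). Only the import-wcm-data task name comes from the question; the profile path, the -D arguments, the wsadmin scripts for updating Resource Environment Provider properties, and the restart step are placeholders for whatever your environment actually needs.

    import subprocess

    WP_PROFILE = "/opt/IBM/WebSphere/wp_profile"  # placeholder profile path

    def run(cmd):
        print("running:", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # 1. Import the WCM library (task name from the question; the -D args are placeholders).
    run([
        f"{WP_PROFILE}/ConfigEngine/ConfigEngine.sh",
        "import-wcm-data",
        "-DPortalAdminPwd=secret",
        "-DWasPassword=secret",
    ])

    # 2. Update Resource Environment Provider properties via a wsadmin script (placeholder).
    run([
        f"{WP_PROFILE}/bin/wsadmin.sh",
        "-lang", "jython",
        "-f", "update_rep_properties.py",
    ])

    # 3. Restart the affected applications/services (placeholder script).
    run([f"{WP_PROFILE}/bin/wsadmin.sh", "-lang", "jython", "-f", "restart_services.py"])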

How to "explore" group of servers?

I need to check a group of servers (Unix, Linux) to find out what services and software (including versions) are running on them (check once in a while and store the results in a database).
The idea is to always have fresh information about the whole environment - it's constantly changing. Perhaps you can suggest a solution that already exists?
Currently I am thinking about using Nagios or Cacti plus plugins, but I am not sure whether this would be the optimal solution.
Nagios is a very powerful monitoring solution (the best, in my opinion): open source, compatible with both Linux and Windows, reporting and notifications via email/SMS, a nice interface, many, many plugins, etc. I've already worked with it and was very satisfied.
Check Nico Largo's forum for installation instructions. If you are not familiar with the Linux command line, search for FAN (Fully Automated Nagios), a .iso image with Nagios already included.
If you have any trouble during installation or configuration, post your questions here: https://serverfault.com/
Given that you want to poll for information on systems that can change dynamically, I would look at Check_MK.
It originally started as a plugin for Nagios that would poll a server for running services and generate the necessary configs for monitoring anything it discovered. Since then, it has evolved into a complete monitoring solution that provides its own UI (still based on the Nagios core), so you are safe running it if you are already familiar with Nagios.
See the website: http://mathias-kettner.com/checkmk_monitoring_system.html
You may need to select the "English" version of the site on your first visit.
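If you decide to roll your own collector instead of (or before) adopting one of the tools above, a minimal sketch of the "poll over SSH and store in a database" idea could look like this. The host names and the exact commands are placeholders; it assumes key-based SSH access to each server and uses SQLite just to keep the example self-contained.

    import sqlite3
    import subprocess

    HOSTS = ["web01", "web02", "db01"]  # placeholder host names

    # Commands whose output we want to record per host; adjust per distro.
    COMMANDS = {
        "os": "uname -sr",
        "packages": "dpkg -l 2>/dev/null || rpm -qa",
        "listening_services": "netstat -tlnp 2>/dev/null | tail -n +3",
    }

    def collect(host, command):
        """Run a command on a remote host over ssh (assumes key-based auth)."""
        result = subprocess.run(
            ["ssh", host, command], capture_output=True, text=True, timeout=60
        )
        return result.stdout

    def main():
        db = sqlite3.connect("inventory.db")
        db.execute(
            "CREATE TABLE IF NOT EXISTS inventory "
            "(host TEXT, item TEXT, output TEXT, collected_at TEXT DEFAULT CURRENT_TIMESTAMP)"
        )
        for host in HOSTS:
            for item, command in COMMANDS.items():
                db.execute(
                    "INSERT INTO inventory (host, item, output) VALUES (?, ?, ?)",
                    (host, item, collect(host, command)),
                )
        db.commit()

    if __name__ == "__main__":
        main()

Run it from cron to keep the database reasonably fresh, and query the inventory table whenever you need an overview of the environment.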

Syncing a site with a local machine?

I do my web development and testing on my laptop running an installation of XAMPP. I upload things to my host, but I always go through cPanel's file manager to do it. I realize there's definitely a better way to go about this, but I need to be pointed in the right direction; other tips on how to manage things would also be appreciated.
FTP - can I keep my site's files synced between the server and a local directory under htdocs, so the site stays backed up on my computer while I push whatever changes I make locally? Can anyone recommend a good (preferably free) client to do this?
Database stuff - how do I back up / sync databases in the same way? Ideally I'd like to do the same thing as with my files: merge / upload whatever I've developed with a click or two. Is this possible? Is it wise?
Any help and advice would be appreciated. :)
I do my development in Eclipse, which lets me combine development and FTP sync in one environment. It will also tell you if a file has changed on the server and let you decide whether or not to overwrite it. You can also exclude certain types of files from syncing using pattern matching, and you can use other technologies like WebDAV or SSH to sync (if supported by your host, of course).
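If you'd rather script the whole thing than rely on an IDE or an FTP client, a minimal sketch covering both halves (a mysqldump backup of the database plus an FTP upload of the site files) might look like the following. It assumes MySQL as shipped with XAMPP and plain FTP; the host name, credentials, paths, and database name are placeholders.

    import subprocess
    from ftplib import FTP
    from pathlib import Path

    # Placeholder connection details - replace with your own host/credentials.
    FTP_HOST = "ftp.example.com"
    FTP_USER = "user"
    FTP_PASS = "secret"
    LOCAL_ROOT = Path("C:/xampp/htdocs/mysite")
    REMOTE_ROOT = "public_html"

    def dump_database(db_name, out_file):
        """Back up a MySQL database with mysqldump (assumes it is on PATH; XAMPP's
        default root account has no password, so adjust the options to suit)."""
        with open(out_file, "wb") as fh:
            subprocess.run(["mysqldump", "-u", "root", db_name], stdout=fh, check=True)

    def upload_tree(ftp, local_dir, remote_dir):
        """Recursively upload a local directory over FTP."""
        for item in local_dir.iterdir():
            remote_path = f"{remote_dir}/{item.name}"
            if item.is_dir():
                try:
                    ftp.mkd(remote_path)
                except Exception:
                    pass  # directory probably exists already
                upload_tree(ftp, item, remote_path)
            else:
                with open(item, "rb") as fh:
                    ftp.storbinary(f"STOR {remote_path}", fh)

    if __name__ == "__main__":
        dump_database("mysite_db", LOCAL_ROOT / "backup.sql")
        with FTP(FTP_HOST, FTP_USER, FTP_PASS) as ftp:
            upload_tree(ftp, LOCAL_ROOT, REMOTE_ROOT)

This pushes everything each time; a dedicated client (or rsync over SSH, if your host allows it) will be smarter about only transferring changed files.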

How to integrate Bugzilla and HP Quality Center?

I'm working on integrating Bugzilla with HP QC. Currently I do this with a Perl script that manipulates the database directly using SQL commands, but I want to use Bugzilla's web services instead. I have gone through the Bugzilla WebService API documentation, but that wasn't enough to get started. I'm a beginner and this is the first project of my career. How do I go about this?
Check out the Perl script bz_webservice_demo.pl in Bugzilla's contrib directory; it shows how to talk to Bugzilla via XML-RPC.
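If you would rather start from Python than from the Perl demo script, the same XML-RPC conversation can be sketched roughly as follows. The server URL, credentials, and bug id are placeholders; it assumes the standard /xmlrpc.cgi endpoint and uses Bugzilla's documented option of passing Bugzilla_login / Bugzilla_password along with the call.

    import xmlrpc.client

    # Placeholder URL and credentials; /xmlrpc.cgi is the standard Bugzilla XML-RPC endpoint.
    BUGZILLA_URL = "https://bugzilla.example.com/xmlrpc.cgi"
    LOGIN = "you@example.com"
    PASSWORD = "secret"

    proxy = xmlrpc.client.ServerProxy(BUGZILLA_URL)

    # Bugzilla accepts Bugzilla_login / Bugzilla_password in the arguments of any call.
    result = proxy.Bug.get({
        "ids": [1234],  # placeholder bug id
        "Bugzilla_login": LOGIN,
        "Bugzilla_password": PASSWORD,
    })

    for bug in result["bugs"]:
        print(bug["id"], bug["status"], bug["summary"])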
There are a few things you could do:
Export defects from Bugzilla into a spreadsheet and upload it into Quality Center
Use the Open Test Architecture API (OTAClient.dll) to update defects in Quality Center
Use the HP Synchronization Server and build an adapter
Using the HP Synchronizer is probably the only "real" way to do it, though you could potentially build your own sync mechanism, perhaps using just OTA and a message queue.
There may also be an existing adapter available from proficom-ag, based on a presentation I found via a web search.
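If you do try the build-your-own route using OTA, a rough sketch of driving the OTA COM API from Python is shown below. It assumes a Windows machine with the pywin32 package and the QC OTA client (OTAClient.dll) registered; the server URL, credentials, domain/project, and defect fields are placeholders, and which fields are mandatory depends on your project's workflow.

    # Requires pywin32 and the QC OTA client (OTAClient.dll) registered on Windows.
    import win32com.client

    td = win32com.client.Dispatch("TDApiOle80.TDConnection")
    td.InitConnectionEx("http://qc.example.com/qcbin")   # placeholder server URL
    td.Login("qc_user", "secret")                        # placeholder credentials
    td.Connect("MY_DOMAIN", "MY_PROJECT")                # placeholder domain/project

    # Create a defect; adjust the fields to match your project's workflow.
    bug_factory = td.BugFactory
    bug = bug_factory.AddItem(None)
    bug.Summary = "Synced from Bugzilla #1234"
    bug.DetectedBy = "qc_user"
    bug.Post()

    td.Disconnect()
    td.Logout()
    td.ReleaseConnection()

A real sync tool would read changed defects from Bugzilla (for example via its web services, as above), map the fields, and push them through OTA in both directions, which is essentially what the HP Synchronizer does for you.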
