I am trying to integrate two systems using the SNMP protocol; here is my question:
As you know, when querying a MySQL database, for example, I can add conditions to the query string and get only the required result, something like:
SELECT * FROM table WHERE a='foo' AND b='bar'
Is there any way to run queries with conditions like this using SNMP?
It depends on whether you need to select data based on the MIB instance or the MIB data.
You can select data based on the instance, like:
snmpwalk -v2c -cpublic 1.2.3.4 ifOperStatus
This will give you all interface statuses in a device.
SNMP does not support fetching only the instances where ifOperStatus = "up"; in that case you need to retrieve all instances and make the selection yourself:
snmpwalk -v2c -cpublic 1.2.3.4 ifOperStatus | grep up
A more elaborate solution is to collect the data and store it in a database; then you can use the SQL syntax you mentioned in your question.
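A minimal sketch of that collect-then-query approach, assuming Python 3 (with its bundled sqlite3 module) and the same snmpwalk command as above; the table and column names are made up for illustration:

import sqlite3
import subprocess

# Walk ifOperStatus exactly as in the snmpwalk example above.
output = subprocess.run(
    ["snmpwalk", "-v2c", "-c", "public", "1.2.3.4", "ifOperStatus"],
    capture_output=True, text=True, check=True,
).stdout

# Store one row per interface in a throwaway SQLite table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE ifstatus (oid TEXT, status TEXT)")
for line in output.splitlines():
    # A typical line looks like: IF-MIB::ifOperStatus.3 = INTEGER: up(1)
    oid, _, value = line.partition(" = ")
    db.execute("INSERT INTO ifstatus VALUES (?, ?)", (oid, value))

# The WHERE-style filtering now happens in SQL, not in SNMP.
for row in db.execute("SELECT oid FROM ifstatus WHERE status LIKE '%up%'"):
    print(row[0])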
I run these commands manually
db.employees.update({Name: "Reg Rubio"}, {$set : {ReportingTo: ["Vice-President", "President"]}})
db.employees.update({Name: "Ian Tayao"}, {$set : {ReportingTo: ["Secretary", "Vice-President", "President"]}})
to update my data. Is there another way to combine these into one call?
So, no, there is no way to update two separate documents with two different update inputs in one db call.
To optimize network traffic you can use bulkWrite; it still executes each update separately in the database, but it sends the request over the network only once, which reduces the round-trip overhead. That is the only optimization you can add, and for just two calls I'd say the gain is fairly negligible.
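For what it's worth, a minimal sketch of that bulkWrite call in the mongo shell, assuming MongoDB 3.2 or later where bulkWrite is available (collection and field names taken from your snippets):

db.employees.bulkWrite([
    // One network round trip, two independent updateOne operations.
    {
        updateOne: {
            filter: { Name: "Reg Rubio" },
            update: { $set: { ReportingTo: ["Vice-President", "President"] } }
        }
    },
    {
        updateOne: {
            filter: { Name: "Ian Tayao" },
            update: { $set: { ReportingTo: ["Secretary", "Vice-President", "President"] } }
        }
    }
]);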
I am trying to get the output of a PowerShell command into an array, but it does not seem to work. In fact, I want to address the output with a row and a column index.
e.g.
$a=Get-Service
With output like this (excerpt):
Status Name DisplayName
------ ---- -----------
Stopped AeLookupSvc Application Experience
Stopped ALG Application Layer Gateway Service
Stopped AppIDSvc Application Identity
Running Appinfo Application Information
Stopped AppMgmt Application Management
I want to address the DisplayName of the second line, e.g.
$a[2][2]
And it should give then
Application Layer Gateway Service
But this does not seem to work.
Can anybody help?
This type of question makes me think that you're probably coming from a Unix background and are accustomed to dealing with row and column indices, that sort of thing.
Fundamentally, PowerShell is an object-oriented scripting language. You simply don't need to do what you're asking about here.
For instance, if you want to capture the results and then grab a property from one of the objects, here's how that's done.
First, capture the output.
$a=Get-Service
Now, you want a particular property of a particular entity. To get that, index into the object you want.
>$a[2]
Status Name DisplayName
------ ---- -----------
Stopped AJRouter AllJoyn Router Service
To select the .DisplayName, all you have to do is append that to the end of your previous command.
> $a[2].DisplayName
AllJoyn Router Service
If you want to select multiple values, you could use this approach instead.
#Select multiple values from one entity
$a[2] | select DisplayName, Status
DisplayName Status
----------- ------
Application Layer Gateway Service Stopped
#Select multiple values from each in the array
$a | select DisplayName, Status
DisplayName Status
----------- ------
Adobe Acrobat Update Service Running
AllJoyn Router Service Stopped
Application Layer Gateway Service Stopped
Application Identity Stopped
This is not possible without a mapping from property names to array indices. Note that what you see in the output is just a partial list of properties (defined in an XML file somewhere). So there isn't even an easy way to convert those to array indices.
However, I also don't quite understand your need here. You can get the second service with $a[1], as expected. And then you can get its DisplayName property value with $a[1].DisplayName. PowerShell uses objects throughout. There is simply no need to fall back to text parsing or cryptic column indices just to get your data. There's an easier way.
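If you really do want positional access, a small sketch of building that mapping yourself is below; note that the property order comes from the object itself (via psobject), not from the formatted table, so it will not match the column order you see on screen:

$a = Get-Service

# Object-oriented access: no row/column indices needed.
$a[1].DisplayName

# Positional access, if you insist: derive the name-to-index mapping yourself.
$props = $a[1].psobject.Properties.Name   # property names in object order
$a[1].($props[1])                         # value of the "second" property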
The output from Get-Service that you see in the console may look like an array of rows (it is formatted as a table when sent to the console), but $a is actually an array of 'System.ServiceProcess.ServiceController' objects.
Rather than using row and column designations, you need to use the name of the property to retrieve it, so for your example:
$a[2].DisplayName will return Application Layer Gateway Service
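As a small illustration of working by property value rather than by position (using the same $a as above), this pulls the display names of every stopped service:

($a | Where-Object { $_.Status -eq 'Stopped' }).DisplayName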
I need to aggregate a number of inbound CSV files in memory, resequencing them if necessary, on Mule ESB CE 3.2.1.
How could I implement this kind of logic?
I tried message-chunking-aggregator-router, but it fails on startup because the XSD schema does not allow such a configuration:
<message-chunking-aggregator-router timeout="20000" failOnTimeout="false" >
<expression-message-info-mapping correlationIdExpression="#[header:correlation]"/>
</message-chunking-aggregator-router>
I've also tried attaching my own correlation IDs to the inbound messages and then processing them with a custom-aggregator, but I've found that Mule internally uses a key made up of:
Serializable key=event.getId()+event.getMessage().getCorrelationSequence();//EventGroup:264
The internal ID is different every time (even when the correlation sequence is correct), so Mule does not use only the correlation sequence as I expected, and the same message is processed many times.
Finally, I could write my own custom aggregator, but I would prefer a more established technique.
Thanks in advance,
Gabriele
UPDATE
I've tried message-chunk-aggregator, but it doesn't fit my requirement, as it admits duplicates.
Let me detail the scenario I need to cover:
Mule polls an SFTP location.
File 1 "FIXEDPREFIX_1_of_2.zip" is detected and kept in memory somewhere (as an open SFTPStream, that's fine).
Some correlation info is maintained for grouping: group, sequence, group size.
File 1 "FIXEDPREFIX_1_of_2.zip" is detected again, but must not be inserted because it would be a duplicate.
File 2 "FIXEDPREFIX_2_of_2.zip" is detected and correctly added.
Once the group size has been reached, Mule routes a MessageCollection with the correct set of messages.
Regarding point 2, I'm lucky enough to be able to extract the info from the filename and put it into the MuleMessage::correlation* properties, so that subsequent components can use them.
I did that, but duplicates are still processed.
Thanks again
Gabriele
Here is the right router to use with Mule 3: http://www.mulesoft.org/documentation/display/MULE3USER/Routing+Message+Processors#RoutingMessageProcessors-MessageChunkAggregator
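For reference, a rough, untested fragment of how that aggregator could sit in a Mule 3 flow; the namespace declarations and the SFTP inbound endpoint are omitted, the attribute values are simply carried over from the question, and the element details should be checked against the 3.2.1 schema rather than taken as-is:

<flow name="aggregateZipParts">
    <!-- SFTP inbound endpoint polling FIXEDPREFIX_*_of_*.zip goes here -->
    <!-- correlation id, sequence and group size must already be set on the
         message, e.g. derived from the filename as described in the update -->
    <message-chunk-aggregator timeout="20000" failOnTimeout="false"/>
    <!-- the next processor receives the aggregated MessageCollection -->
</flow>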
We use DBAmp for integrating Salesforce.com with SQL Server (it basically adds a linked server), and we run queries against our SF data using OPENQUERY.
I'm trying to do some reporting against opportunities and want to return the created date of each opportunity in the opportunity owner's local date/time (i.e. the date/time the user will see in Salesforce).
Our DBAmp configuration forces the dates to be UTC.
I stumbled across a date function in the Salesforce documentation that I thought might be of some help, but I get an error when I try to use it, so I can't prove it. Below is the example usage for the convertTimezone function:
SELECT HOUR_IN_DAY(convertTimezone(CreatedDate)), SUM(Amount)
FROM Opportunity
GROUP BY HOUR_IN_DAY(convertTimezone(CreatedDate))
Below is the error returned:
OLE DB provider "DBAmp.DBAmp" for linked server "SALESFORCE" returned message "Error 13005 : Error translating SQL statement: line 1:37: expecting "from", found '('".
Msg 7350, Level 16, State 2, Line 1
Cannot get the column information from OLE DB provider "DBAmp.DBAmp" for linked server "SALESFORCE".
Can you not use SOQL functions in OPENQUERY as below?
SELECT
*
FROM
OPENQUERY(SALESFORCE,'
SELECT HOUR_IN_DAY(convertTimezone(CreatedDate)), SUM(Amount)
FROM Opportunity
GROUP BY HOUR_IN_DAY(convertTimezone(CreatedDate))')
UPDATE:
I've just had some correspondence with Bill Emerson (I believe he is the creator of the DBAmp Integration Tool):
You should be able to use SOQL functions so I am not sure why you are
getting the parsing failure. I'll setup a test case and report back.
I'll update the post again when I hear back. Thanks
A new version of DBAmp (2.14.4) has just been released that fixes the issue with using ConvertTimezone in openquery.
Version 2.14.4
Code modified for better memory utilization
Added support for API 24.0 (SPRING 12)
Fixed issue with embedded question marks in string literals
Fixed issue with using ConvertTimezone in openquery
Fixed issue with "Invalid Numeric" when using aggregate functions in openquery
I'm fairly sure that because DBAmp uses SQL and not SOQL, SOQL functions would not be available, sorry.
You would need to expose this data some other way. Perhaps it's possible with a Salesforce report, a web service, or by compiling the data in the program you are using to access the (DBAmp) SQL Server.
If you were to create a Salesforce web service, the following example might be helpful.
global class MyWebService
{
    // A GROUP BY query returns one AggregateResult row per group,
    // so collect the rows in a list rather than a single record.
    webservice static List<AggregateResult> MyWebServiceMethod()
    {
        List<AggregateResult> results = [
            SELECT
                HOUR_IN_DAY(convertTimezone(CreatedDate)) Hour,
                SUM(Amount) Amount
            FROM Opportunity
            GROUP BY HOUR_IN_DAY(convertTimezone(CreatedDate))];
        system.debug(results);
        return results;
    }
}
How can you find out what the long-running queries are on an Informix database server? I have a query that is using up the CPU and I want to find out what that query is.
If the query is currently running, watch the onstat -g act -r 1 output and look for items with an rstcb that is not 0:
Running threads:
tid tcb rstcb prty status vp-class name
106 c0000000d4860950 0 2 running 107soc soctcppoll
107 c0000000d4881950 0 2 running 108soc soctcppoll
564457 c0000000d7f28250 c0000000d7afcf20 2 running 1cpu CDRD_10
In this example the third row is what is currently running. If you have multiple rows with non-zero rstcb values, then watch for a bit, looking for the one that is always or almost always there. That is most likely the session you're looking for.
c0000000d7afcf20 is the address that we're interested in for this example.
Use onstat -u | grep c0000000d7afcf20 to find the session
c0000000d7afcf20 Y--P--- 22887 informix - c0000000d5b0abd0 0 5 14060 3811
This gives you the session id which in our example is 22887. Use onstat -g ses 22887
to list info about that session. In my example it's a system session so there's nothing to see in the onstat -g ses output.
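If you find yourself doing this often, the two lookups can be chained in a small shell wrapper; a rough sketch, assuming the onstat -u column layout shown above (the awk field numbers are an assumption and may differ between versions):

#!/bin/sh
# Usage: ./find_session.sh c0000000d7afcf20   (the rstcb from "onstat -g act")
RSTCB="$1"
# In "onstat -u", field 1 is the address and field 3 is the session id (22887 above).
SID=$(onstat -u | awk -v addr="$RSTCB" '$1 == addr { print $3 }')
echo "Session id: $SID"
onstat -g ses "$SID"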
That's because the suggested answer is for DB2, not Informix.
The sysmaster database (a virtual relational database of Informix shared memory) will probably contain the information you seek. These pages might help you get started:
http://docs.rinet.ru/InforSmes/ch22/ch22.htm
http://www.informix.com.ua/articles/sysmast/sysmast.htm
Okay, it took me a bit to work out how to connect to sysmaster. The JDBC connection string is:
jdbc:informix-sqli://dbserver.local:1526/sysmaster:INFORMIXSERVER=mydatabase
where the port number is the same as when you are connecting to the actual database. That is, if your connection string is:
jdbc:informix-sqli://database:1541/crm:INFORMIXSERVER=crmlive
Then the sysmaster connection string is:
jdbc:informix-sqli://database:1541/sysmaster:INFORMIXSERVER=crmlive
Also found this wiki page that contains a number of SQL queries for operating on the sysmaster tables.
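As a starting point, something along these lines should list the SQL each session is currently running. This is an untested sketch: the sysmaster table and column names used here (syssessions, syssqexplain, sqx_sessionid, sqx_sqlstatement) can vary between Informix versions, so verify them against your server before relying on it.

SELECT s.sid, s.username, s.hostname, q.sqx_sqlstatement
FROM sysmaster:syssessions s, sysmaster:syssqexplain q
WHERE q.sqx_sessionid = s.sid;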
SELECT ELAPSED_TIME_MIN,SUBSTR(AUTHID,1,10) AS AUTH_ID,
AGENT_ID, APPL_STATUS,SUBSTR(STMT_TEXT,1,20) AS SQL_TEXT
FROM SYSIBMADM.LONG_RUNNING_SQL
WHERE ELAPSED_TIME_MIN > 0
ORDER BY ELAPSED_TIME_MIN DESC
Credit: SQL to View Long Running Queries