How to use Splunk CIM Data Model - eval

I am trying to understand how to write an eval statement in Splunk, and the documentation isn't helpful. Specifically, I am looking at the Malware CIM data model, where there is a field called "Malware_Attacks" with prescribed values of critical, high, medium, and low. How do I create an eval statement using this CIM field and the prescribed values, and apply it to the regex capture groups I created separately?
Thanks
Jack
I have tried the Splunk CIM data model documentation, but it doesn't go into detail on prescribed values or on how to incorporate the CIM field in question into an eval statement.

The severity field already exists in the datamodel (DM), so all you have to do is reference it. Do that by specifying the DM name and the field separated by a dot. For example, | table "Malware_Attacks.severity". The quotes keep Splunk from treating this as two concatenated fields (since . is also the concatenation operator). You can make life a little easier by using rename to remove the DM name so fields can be referenced directly.
| rename "Malware_Attacks.*" as *
| table severity
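Putting that together with an eval over the prescribed severity values, a sketch might look like the following (the datamodel search and the is_high field name are illustrative, not part of the original answer):

```spl
| datamodel Malware Malware_Attacks search
| rename "Malware_Attacks.*" as *
| eval is_high=if(severity="critical" OR severity="high", "yes", "no")
| table severity is_high
```

Once the rename strips the dataset prefix, severity behaves like any other field inside eval, so the prescribed values can be compared with ordinary string equality.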

Related

Fairness Configuration schema for request payload

While setting the fairness configuration, we are only being asked about the label column in the training data and not in the payload data. So I wanted to understand: when the UI calls the /v1/fairness_monitoring POST API, what value is being set as class_label in the request payload? Is it hard-coded to prediction, or is something else being done?
WML has a concept of an output schema, which defines the schema of the output of the WML scoring end point and the roles of the different columns. The roles are used to identify which column contains the prediction value, which column contains the prediction probability, which contains the class label, and so on. The output schema is automatically set for models created using the model builder. It can also be set using the WML python client. WML also sets the output schema when data is sent for scoring.
When the output schema identifies the column which contains the prediction, the UI does not ask for this information. The name of the column is hence not hardcoded but can be anything that is defined in the output schema. If output schema does not identify the prediction column, then the UI will show a screen allowing the user to select the prediction column.
The documentation for the WML python client is available at: http://wml-api-pyclient-dev.mybluemix.net/#repository. Search for “OUTPUT_DATA_SCHEMA” to understand the output schema; the API to use is the store_model API, which accepts the OUTPUT_DATA_SCHEMA as a parameter.
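As a rough sketch of what such a schema might look like, the dictionary below marks which columns play which roles via the modeling_role metadata. The column names ("prediction", "probability") and the store_model call are hypothetical; check the linked client documentation for the exact metadata names your client version expects.

```python
# A sketch of an OUTPUT_DATA_SCHEMA: each field's metadata declares its role,
# which is how WML identifies the prediction and probability columns.
output_data_schema = {
    "id": "output_schema",
    "fields": [
        {"name": "prediction", "type": "string",
         "metadata": {"modeling_role": "prediction"}},
        {"name": "probability", "type": "array",
         "metadata": {"modeling_role": "probability"}},
    ],
}

# With the WML python client this would be passed when storing the model,
# along the lines of (names assumed, see the docs linked above):
# client.repository.store_model(
#     model,
#     meta_props={client.repository.ModelMetaNames.OUTPUT_DATA_SCHEMA:
#                 output_data_schema})
```

Because the role, not the column name, identifies the prediction column, the UI can find it no matter what the column is called.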

How to apply same data manipulation codes to a group of SSIS components' inputs?

I am new to SSIS.
I have a number of MS Access tables to transform to SQL. Some of these tables have datetime fields that need to go through some rules before landing in their respective SQL tables. I want to use a Script Component that deals with these kinds of fields, converting them to the desired values.
Since all of these fields need the same modification rules, I want to apply the same code base to all of them and thus avoid code duplication. What would be the best option for this scenario?
I know I can't use the same Script Component and direct all of those datasets' outputs to it, because unfortunately it doesn't support multiple inputs. So the question is: is it possible to apply a set of generic data manipulation rules on a group of different datasets' fields without repeating the rules? I could use a Script Component for each OLE DB input and apply the same rule in each, but that would not be an efficient way of doing it.
Any help would be highly appreciated.
SQL Server Integration Services has a specific task to suit this need, called a Data Conversion Transformation. This can be accomplished on the data source or via the task, as noted here.
You can also use the Derived Column transformation to convert data. This transformation is also simple: select an input column and then choose whether to replace this column or create a new output column. Then you apply an expression to produce the output column.
So why use one over the other?
The Data Conversion transformation will take an input, convert the type, and provide a new output column. If you use the Derived Column transformation, you get to apply an expression to the data, which allows you to do more complex manipulations.
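As a concrete illustration of the Derived Column route, an SSIS expression along these lines (the OrderDate column name is made up for the example) could normalize Access dates that fall outside SQL Server's datetime range:

```
ISNULL(OrderDate) || OrderDate < (DT_DBTIMESTAMP)"1753-01-01"
    ? (DT_DBTIMESTAMP)"1900-01-01 00:00:00"
    : OrderDate
```

Since the same expression text can be pasted into a Derived Column transformation in each data flow, this gets close to "one rule, many inputs" without writing any Script Component code.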

Extract the real value of the relationship lookup field in soql

I'm a newbie to SOQL relationship queries. I've been trying to get this working but I'm not sure if it's even possible.
Select Id, Name, CustomObject__r.name From Account where customobject__c != null
This gives me a result that has a 'customobject__c' column, which I have to click on to see the real value. I want to see the real value in the table so that I can extract it to a CSV. CustomObject is a lookup field on Account.
Is this possible? Thanks.
You need a better SOQL editor. What are you using at the moment?
Data Loader (the officially supported tool) should give you these full relationship "path names" when you export data, but it's a bit too clunky to be used on a daily basis. You'd be designing your queries in one tool and pasting them into DL...
I'm a big fan of RealForceExplorer ;) See How to add comments in SOQL for some hints and pick your favorite editor.

Show HTML entities like &amp; in report parameter as &

Is there an effective and/or fast way to show HTML entities like &amp; or &euml; as & and ë in a report viewer's parameter drop-down list?
I have html encoded values in the database and now I need to show the original value in parameter drop down list
Or does MSSQL have an equivalent for this?
SQL Server doesn't have a built-in HTML decode function. Your options as I see it are
A series of REPLACEs as long as Earth to Mars
A CLR module as a SQL Function (I would use this)
Programming change to store another column with the un-encoded html string when inserting
To avoid losing any performance on the reporting side, the users must change these characters in the GUI so they no longer appear...
No other suitable solution was found for this situation. A mass update on all text columns was another option as well, but not really necessary.
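As a sketch of the REPLACE approach from the list above (the table and column names are made up, and only three entities are shown; a real decode chain would cover many more):

```sql
-- Decode &amp; last, so that a stored '&amp;euml;' does not get
-- double-decoded into 'ë' instead of '&euml;'.
SELECT REPLACE(REPLACE(REPLACE(description,
           '&lt;',   '<'),
           '&euml;', N'ë'),
           '&amp;',  '&') AS decoded_description
FROM dbo.Products;
```

This is exactly why the answer calls the chain "as long as Earth to Mars": every entity you want decoded adds another nested REPLACE.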

Fastest way to map a list of names in an excel doc to their IDs in a lookup table?

For one of my projects I display a list of counties in a drop down list (this list comes from a lookup table containing all counties). The client just requested that I limit it to a subset of their choice. The subset was given to me in an excel spreadsheet containing only names (seen below):
I'm trying to figure out the quickest way possible for me to map each of these to their corresponding ID in the original lookup table. The client cannot give me these IDs. The names here match the names in my table (except for letter case).
This will most likely be a one time only thing.
Can anyone suggest a fast way to get these values into a query so I don't have to manually do it?
When I say fast I'm not talking about processing speed, just the fastest start to finish time that results in me getting the corresponding IDs using any tool available.
Note: I'm aware that I could have probably done this manually in the time it will take to get an answer, but I'd like to know for future reference.
You could do an External Data Query into another Excel sheet with
SELECT countryname, countryid FROM countries
then use a VLOOKUP to get the IDs into the client-provided sheet:
=VLOOKUP(A1,Sheet2!$A$1:$B$200,2,FALSE)
See http://excelusergroup.org/blogs/nickhodge/archive/2008/11/04/excel-2007-getting-external-data.aspx for creating External Data Table in Excel 2007. Skip down to the "Other Sources" part.
Put the list in a text file. Write a powershell script which will get the contents of that file and then query your database to output the keys. Here is a rough, rough example.
Get-Content c:\list.txt | ForEach-Object { Invoke-Sqlcmd -Query "select blah blah where county = '$_'" } | Format-Table
If you have access to SSIS, you could probably do a join between the excel source and your table.
You could load the excel sheet in to a temp table to take advantage of all your SQL query knowledge.
I believe (and yes, it is true) that SQL Server can create a linked server out of a spreadsheet. Then you get to joining and you're done.
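Expanding on the temp-table suggestion above, a sketch in T-SQL (the table and column names are hypothetical, and the INSERT rows stand in for the pasted spreadsheet values):

```sql
-- Load the Excel names into a temp table (copy/paste or the Import Wizard),
-- then join back to the lookup table on name.
CREATE TABLE #client_counties (county_name varchar(100));
INSERT INTO #client_counties VALUES ('Adams'), ('Benton');  -- one row per spreadsheet cell

SELECT c.county_id, c.county_name
FROM dbo.Counties AS c
JOIN #client_counties AS cc
    ON c.county_name = cc.county_name;  -- case-insensitive under the default collation
```

Since the names only differ by case, a default case-insensitive collation makes the join match without any extra cleanup.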
