I have a 2D array that I'd like to intersect with a 1D array in order to extract specific columns of data. I want to search for each Supplier Barcode in the Barcode column of the 2D array and output the remaining data of the matched rows.
Arr1 columns: Barcode, Asin, Price1, Rank, Competitors, Price2
Arr2: Supplier Barcodes
I'd like to match all of Arr1's Barcode data against Arr2 and output the remaining data for each match, essentially using the barcode as a comparison key and identifier for the other columns.
Desired Output:
Asins: B01GEGER5C, B07Q2MQHYD
Price1: £69.95, £81.99
Rank: 228412, 69072
Competitors: 3, 6
Price2: £70.02, £82.33
I've tried various methods, such as nested loops and assigning objects, but I cannot figure out how to return the correct output. The code only works for 1D arrays.
Arr1 = [
[ 5013314726035, 'B01GEGER5C', '£ 69.95', 228412, 3, '£ 70.02' ],
[ 5013314736485, 'B07Q2MQHYD', '£ 81.99', 69072, 6, '£ 82.33' ],
[ 5013314736423, 'B08NCKS53P', '£ 23.77', 739827, 1, '£ 23.77' ]
]
Arr2 = [ 5013314726035, 5013314736485, 5555555555555 ]
function getArraysIntersection(arr1, arr2) {
  // Returns the values of arr2 that also appear in arr1.
  // This only works when both arguments are flat (1D) arrays.
  return arr2.filter(function (n) { return arr1.indexOf(n) !== -1; });
}
console.log(getArraysIntersection(Arr1, Arr2));
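For reference, a minimal sketch of one way to get the desired output (the column labels are taken from the question above):

// Keep the rows of Arr1 whose barcode (column 0) appears in Arr2,
// then pull each remaining column out of the matched rows.
const matches = Arr1.filter(row => Arr2.includes(row[0]));

console.log('Asins: '       + matches.map(row => row[1]).join(', '));
console.log('Price1: '      + matches.map(row => row[2]).join(', '));
console.log('Rank: '        + matches.map(row => row[3]).join(', '));
console.log('Competitors: ' + matches.map(row => row[4]).join(', '));
console.log('Price2: '      + matches.map(row => row[5]).join(', '));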
I know that a hash doesn't store duplicate keys, but I want to know whether that default behaviour can be changed to suit a requirement. Is that possible?
Here is the sample code:
keys_array = [ 'key1', 'key2' ]
values_array = [
{"A": { "id": "1" }},
{"B": { "id": "2" }}
]
results = keys_array.zip(values_array).to_h
Here the output is exactly what I want:
{"key1"=>{:A=>{:id=>"1"}}, "key2"=>{:B=>{:id=>"2"}}}
But if the keys are repeated, for example
keys_array = [ 'key1', 'key1' ], in which the 'key1' key is repeated,
the result will be {"key1"=>{:B=>{:id=>"2"}}}
But I want
{"key1"=>{:A=>{:id=>"1"}}, "key1"=>{:B=>{:id=>"2"}}}
I know about grouping and the like, but it changes the result format, so I don't want to use it. I want the result above, where key1 is a repeated key.
Also, please note that these array values are not fixed; they are dynamic.
Greatly appreciate your help!
You can alter the behavior of a hash with compare_by_identity.
h = Hash.new.compare_by_identity
h["key1"] = 1
h["key1"] = 2
p h #=> {"key1"=>1, "key1"=>2}
I've never seen a useful application of this, though.
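One caveat worth knowing (a sketch, assuming the frozen_string_literal pragma is off; with it on, identical literals are deduplicated into one object and the trick stops working): because lookup is also by object identity, a fresh "key1" literal finds nothing:

h = Hash.new.compare_by_identity
h["key1"] = 1
h["key1"] = 2               # a different String object, so no overwrite
p h["key1"]                 #=> nil (this third "key1" literal matches neither key)
p h.keys.map { |k| h[k] }   #=> [1, 2] (only the original key objects retrieve values)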
The data structure you are looking for is called a multimap. It is, of course, possible to implement one in Ruby. You just have to do it. (Or find a library that does it for you.)
However, neither the core nor the standard library contain a multimap implementation.
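A minimal sketch of one, backed by a Hash whose values are Arrays (note that a lookup then returns a list of values, so the shape differs from the plain Hash the question asks for):

class Multimap
  def initialize
    # each key maps to the array of all values ever stored under it
    @store = Hash.new { |hash, key| hash[key] = [] }
  end

  def [](key)
    @store[key]
  end

  def []=(key, value)
    @store[key] << value
  end
end

mm = Multimap.new
mm['key1'] = { A: { id: '1' } }
mm['key1'] = { B: { id: '2' } }
p mm['key1'] #=> [{:A=>{:id=>"1"}}, {:B=>{:id=>"2"}}]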
There is no way to do this with a hash, but you could use an array of hashes instead:
[{"key1"=>{:A=>{:id=>"1"}}}, {"key1"=>{:B=>{:id=>"2"}}}]
I need to parse the following hash containing a 2D array, where the first sub-array has the keys and the remaining sub-arrays have the values.
input = {
"result": [
[
"id",
"name",
"address"
],
[
"1",
"Vishnu",
"abc"
],
[
"2",
"Arun",
"def"
],
[
"3",
"Arjun",
"ghi"
]
]
}
This is the solution I came up with:
input[:result].drop(1).collect{|arr| Hash[input[:result].first.zip arr]}
Here I'm iterating through the result array, ignoring its first sub-array (the one that contains the keys), then zipping the key array with each value array to make a hash; afterwards I collect the hashes into another array.
The above solution gives me what I want, which is an array of hashes:
[{"id"=>"1", "name"=>"Vishnu", "address"=>"abc"}, {"id"=>"2", "name"=>"Arun", "address"=>"def"}, {"id"=>"3", "name"=>"Arjun", "address"=>"ghi"}]
Is there a better way to achieve the same result?
zip is the correct tool here, so your code is fine.
I'd use Ruby's array decomposition feature to extract keys and values, and to_h instead of Hash[]:
keys, *values = input[:result]
values.map { |v| keys.zip(v).to_h }
Or, if you prefer a "one-liner" (harder to understand, IMO):
input[:result].yield_self { |k, *vs| vs.map { |v| k.zip(v).to_h } }
I have a keyword array field (say f) and I want to filter documents with an exact array match (e.g. filter docs where f = [1, 3, 6] exactly: same order and same number of terms).
What is the best way of doing this?
One way to achieve this is to add a script to the query that also checks the number of elements in the array.
The script filter would be something like:
"filters": [
{
"script": {
"script": "doc['f'].values.length == 3"
}
},
{
"terms": {
"f": [
1,
3,
6
],
"execution": "and"
}
}
]
Hope you get the idea.
I think an even better idea would be to store the array as a single string (if the structure of the data doesn't change much) and match the string directly. This would be much faster, too.
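For example (just a sketch; f_str is a hypothetical keyword field holding the joined terms), you would index the array as one string and match it with a plain term query:

{
  "query": {
    "term": {
      "f_str": "1,3,6"
    }
  }
}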
I'm trying to push data to a REST API using PowerShell.
http://influxdb.com/docs/v0.8/api/reading_and_writing_data.html
The server expects data like so:
[
{
"name" : "hd_used",
"columns" : ["value", "host", "mount"],
"points" : [
[23.2, "serverA", "mnt"]
]
}
]
However, I"m only able to make a json object that looks like this (notice the extra quotes):
[
{
"name" : "hd_used",
"columns" : ["value", "host", "mount"],
"points" : [
"[23.2, "serverA", "mnt"]"
]
}
]
How can I construct the data into an array of arrays without wrapping the nested array in quotes?
This works, but it isn't a nested array:
$influxdata = [ordered]@{}
$influxdata.name = $hd_used
$influxdata.columns = @("value", "host", "mount")
$influxdata.points = @()
$influxdata.points += @("23.2", "serverA", "mnt")
$influxdatajson = $influxdata | ConvertTo-Json -Depth 2
This works, but the inner array is actually a string:
$influxdata = [ordered]@{}
$influxdata.name = $hd_used
$influxdata.columns = @("value", "host", "mount")
$influxdata.points = @()
$influxdata.points += @('["23.2", "serverA", "mnt"]')
$influxdatajson = $influxdata | ConvertTo-Json -Depth 2
With $PSVersionTable.PSVersion equal to 3.0 and your exact input, I get the following when I print the $influxdatajson variable:
{
"name": "hd_used",
"columns": [
"value",
"host",
"mount"
],
"points": [
"23.2",
"serverA",
"mnt"
]
}
That clearly isn't what you want, but it isn't what you said you got either.
The reason we get that output is that your attempt to add the array to the existing array didn't work the way you expected, because of PowerShell's annoying tendency to unroll arrays (I think).
If you work around that oddity by using this syntax instead:
$influxdata.points += ,@("23.2", "serverA", "mnt")
(the leading , forces an array context so that the outer array gets unrolled instead of the array you are trying to add)
then I get the following output from $influxdatajson:
{
"name": "hd_used",
"columns": [
"value",
"host",
"mount"
],
"points": [
[
"23.2",
"serverA",
"mnt"
]
]
}
To complement Etan Reisner's helpful answer (whose use of the unary , to create a nested array solves the problem):
PowerShell's hashtable literals are quite flexible with respect to incorporating variable references, expressions, and commands, which makes for a more readable solution:
,, [ordered] @{
  name = 'hd_used'
  columns = 'value', 'host', 'mount'
  points = , (23.2, 'serverA', 'mnt')
} | ConvertTo-Json -Depth 3
This yields:
[
{
"name": "hd_used",
"columns": [
"value",
"host",
"mount"
],
"points": [
[
23.2,
"serverA",
"mnt"
]
]
}
]
Note how the array-construction operator (,) is applied twice at the beginning of the pipeline (before [ordered]):
first to turn the ordered hashtable into a (single-item) array,
and a second time to wrap that array in an outer array.
Sending the result through the pipeline makes PowerShell unwrap any collection, i.e., enumerate the collection items and send them one by one, which in this case strips away the outer array, leaving ConvertTo-Json to process the inner array, as desired.
Note that passing an array adds a level to the hierarchy, which is why the -Depth value was increased to 3 above.
Caveat: Any property whose hierarchy level is deeper than -Depth is stringified (evaluated as if placed inside "$(...)"), which in the case of an array would simply join the array elements with a space; e.g., array 23.2, 'serverA', 'mnt' would turn to a single string with literal contents 23.2 serverA mnt.
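For example (a sketch reusing the object from above), lowering -Depth to 2 pushes the inner array past the limit:

,, [ordered] @{
  name = 'hd_used'
  columns = 'value', 'host', 'mount'
  points = , (23.2, 'serverA', 'mnt')
} | ConvertTo-Json -Depth 2
# "points" now contains the single string "23.2 serverA mnt"
# instead of a nested array.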
Note how the arrays above do not use the @(...) syntax, because it is generally not necessary for constructing arrays and is actually less efficient: simply ,-enumerate the elements and, if necessary, enclose them in (...) for precedence (although @() is syntactically convenient for creating an empty array).
+ with an array as the LHS doesn't so much unwrap (unroll) its RHS as concatenate arrays; to put it differently, it allows you to append multiple individual items; e.g.:
$a = 1, 2
$a += 3, 4 # add elements 3 and 4 to array 1, 2 to form array 1, 2, 3, 4
Note that the use of += actually creates a new array behind the scenes, given that arrays aren't resizable.
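A quick sketch showing that behavior:

$a = 1, 2
$b = $a       # $b references the same array object as $a
$a += 3, 4    # allocates a NEW 4-element array and rebinds $a to it
$a.Count      # 4
$b.Count      # 2 (the original array was never modified)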
Just to add my two pence worth: it is worth noting that every time you touch an array that you know is a single-item array, you must use the appropriate comma syntax. This is not well highlighted in the articles I found on the subject.
Take this example I wrote for a Pester test case. The lesson about single-item arrays is that every time you touch one, you must use the comma operator; it is not a case of set it and forget it:
Mock get-AnswerAdvancedFn -ModuleName Elizium.Loopz {
  $pairs = @(, @('Author', 'Douglas Madcap Adams'));
  $first = $pairs[0];
  Write-Host "=== SINGLE-ITEM: pairs.count: $($pairs.Count), first.count: $($first.Count)"
  ([PSCustomObject]@{ Pairs = $pairs })
}
The above won't work because of the faulty assignment of $pairs to Pairs, even though we used the correct syntax when setting $pairs to @(, @('Author', 'Douglas Madcap Adams')).
The following fixes the issue. Every time you touch the single-item array you must use the comma syntax, otherwise you give PowerShell another chance to flatten your array:
Mock get-AnswerAdvancedFn -ModuleName Elizium.Loopz {
  $pairs = @(, @('Author', 'Douglas Madcap Adams'));
  $first = $pairs[0];
  Write-Host "=== SINGLE-ITEM: pairs.count: $($pairs.Count), first.count: $($first.Count)"
  ([PSCustomObject]@{ Pairs = , $pairs })
}
My test code ended up being this:
Mock get-AnswerAdvancedFn -ModuleName Elizium.Loopz {
  ([PSCustomObject]@{ Pairs = , @(, @('Author', 'Douglas Madcap Adams')) })
}
Note that we had to use the comma operator twice, and both are necessary.