New to CoreML, trying to understand some basic concepts.
I'm working on a model with an output of:
float32 [1,896,16]
When I use the model and get the output as an MLMultiArray, I see the following:
let output = prediction.regressors // MLMultiArray
print(output.debugDescription) // Float32 1 x 896 x 16 array
print(output.count) // 14336, which is 896x16
And I can access each element simply using output[0], output[1], and so on.
Is that true for any data type stored in the MLMultiArray? Is it a convenience that Swift supplies?
If it is a flattened array, will it preserve the ordering of the matrix?
Is MLMultiArray a flattened array of the model's output matrix?
No, it's not; logically it's a multidimensional array with the given dimensions, backed by a contiguous buffer (which is why flat indexing with output[0], output[1], ... also works).
You can convert it to a flat array as below, and it will be in the same order as the matrix.
let length = output.count
// The output is Float32, so bind the buffer to Float32 (binding to Double here would misread the data).
let floatPtr = output.dataPointer.bindMemory(to: Float32.self, capacity: length)
let floatBuffer = UnsafeBufferPointer(start: floatPtr, count: length)
let outputArray = Array(floatBuffer)
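Note that MLMultiArray also exposes its strides and a multidimensional subscript, so you don't have to flatten at all. A minimal sketch, with hypothetical indices i and j into the 1 x 896 x 16 output:
let i = 5, j = 2 // hypothetical indices
// Direct multidimensional access via the [NSNumber] subscript:
let v1 = output[[0, i, j] as [NSNumber]].floatValue
// Equivalent flat access, computing the offset from the strides:
let s = output.strides.map { $0.intValue }
let v2 = output[0 * s[0] + i * s[1] + j * s[2]].floatValue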
As for data types, Apple's documentation (MLMultiArrayDataType) lists only three:
case int32
Represents the integer type for multidimensional arrays and is
commonly used for text encoding.
case float32
Represents the float type in multidimensional arrays.
case double
Represents the double type for multidimensional arrays.
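Given those three cases, it is worth checking dataType before binding the raw pointer. A hedged sketch, reusing output from the code above:
switch output.dataType {
case .float32:
    let p = output.dataPointer.bindMemory(to: Float32.self, capacity: output.count)
    print(Array(UnsafeBufferPointer(start: p, count: output.count)).prefix(3))
case .double:
    let p = output.dataPointer.bindMemory(to: Double.self, capacity: output.count)
    print(Array(UnsafeBufferPointer(start: p, count: output.count)).prefix(3))
case .int32:
    let p = output.dataPointer.bindMemory(to: Int32.self, capacity: output.count)
    print(Array(UnsafeBufferPointer(start: p, count: output.count)).prefix(3))
@unknown default:
    print("Unhandled MLMultiArray data type")
}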
Related
I am new to Swift and am struggling to work out how to determine the size of a multidimensional array.
I can use count for one-dimensional arrays; however, when I create a matrix/multidimensional array, count just gives a single value.
var a = [[1,2,3],[3,4,5]]
var c: Int
c = a.count
print(c)
2
The matrix 'a' above clearly has 2 rows and 3 columns; is there any way to output its correct size?
In MATLAB this is a simple task:
a = [1,2,3;3,4,5]
size(a)
ans =
2 3
Is there a simple equivalent in Swift?
I have looked high and low for a solution and can't seem to find exactly what I am after.
Thanks
- HB
Because 2D arrays in Swift can have subarrays with different lengths, there is no "matrix" type.
let arr = [
[1,2,3,4,5],
[1,2,3],
[2,3,4,5],
]
So the concept of "rows" and "columns" does not exist. There's only count.
If you want to count all the elements in the subarrays (12 in the case above), you can flatMap it and then count:
arr.flatMap { $0 }.count
If you are sure that your array is a matrix, you can do this:
let rows = arr.count
let columns = arr[0].count // 0 is an arbitrary value
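If you want something closer to MATLAB's size, here is a small sketch of a helper (size2D is a made-up name, and it assumes a rectangular array):
func size2D<T>(_ matrix: [[T]]) -> (rows: Int, columns: Int) {
    // Assumes every row has the same length; columns is 0 for an empty array.
    (rows: matrix.count, columns: matrix.first?.count ?? 0)
}

let a = [[1, 2, 3], [3, 4, 5]]
print(size2D(a)) // (rows: 2, columns: 3)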
You must ask for the size of a specific row of your array to get the column count:
print("\(a.count) \(a[0].count)")
If you are trying to find the length of a 2D array, meaning the number of rows (i.e. the number of subarrays such as [1,2,3]), you can use this trick: divide the total number of elements, found with
a.flatMap { $0 }.count // a is the array name
by the number of elements in one row, found with
a[0].count // so the element count must be equal in each subarray
So the code to get the length of a 2D array with an equal number of elements in each subarray, stored in the constant arrayLength, is:
let arrayLength = (a.flatMap { $0 }.count) / a[0].count // a is the array name
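Both tricks assume every subarray has the same length; a quick sketch of a check for that assumption before relying on a[0].count:
let isRectangular = a.allSatisfy { $0.count == a.first?.count } // true only if all rows match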
Recent versions of MATLAB have strings, which are N-dimensional arrays of character vectors. I have a cell array of such 1D strings that I would like to combine into a single 2D string, but I am having a lot of trouble doing so. The join, strjoin and strcat functions work on the character arrays inside the string, and cell2mat doesn't work:
>> cell2mat({strings(1, 4); strings(1, 4)})
Error using cell2mat (line 52)
CELL2MAT does not support cell arrays containing cell arrays or objects.
Is there any good way to do this? I expect the output in the above case to be a 2x1 string object.
string objects behave just like any other datatype (double, char, etc.) when it comes to concatenation with the same type. As long as you want the result to also be a string object, use normal concatenation.
result = [strings(1, 4); strings(1, 4)];
Or you can use cat or vertcat to be more explicit
result = cat(1, strings(1, 4), strings(1, 4));
result = vertcat(strings(1, 4), strings(1, 4));
Alternately you could use indexing to sample the same element twice
result = strings([1 1], 4);
If your data is already in a cell array, then you can use {:} indexing to generate a comma-separated list that you can pass to cat
C = {string('one'), string('two')};
result = cat(1, C{:})
As a side-note, there is no such thing as a 1-dimensional array in MATLAB. All arrays are at least two dimensions (one of which can be 1).
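For instance, a worked sketch with hypothetical data, using the same (1, 4) indexing as the question:
strings = string({'alpha', 'beta', 'gamma', 'delta'}); % hypothetical 1x4 string array
result = [strings(1, 4); strings(1, 4)]                % 2x1 string array: "delta"; "delta"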
I get kind of confused between the two. I know a dictionary is initialized like:
var dictionary = [Int: Int]()
and looks like [1: 100, 2: 150] or, with String values, [1: "cat", 2: "dog"].
But I'm kind of confused about 2D arrays. I think one is initialized like:
var arr = [[Int]]()
and looks like [[1,4],[5,3]], with an x and a y coordinate.
I was wondering whether the two are interchangeable and which would work best for making this grid. And if I wanted to store a value at coordinate [2,1], say 6, which should I use?
[0,1,0,0]
[0,1,1,0]
[1,0,0,1]
Don't get confused by the fact that both arrays and dictionaries use [ ] symbols for their literals. The two are completely different. A dictionary is used when you have a set of keys and each key has an associated value. A 2D array is essentially a matrix of values.
For your little matrix you want a 2d array. You can create and initialize a 2d array like this:
var matrix : [[Int]] = Array(repeating: Array(repeating: 0, count: 10), count: 10)
This would create a 10x10 matrix filled with zeros. To set a value you can do:
matrix[x][y] = 6
as long as x is 0 to rows - 1 and y is 0 to columns - 1.
Or read a value like:
let value = matrix[x][y]
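Applied to the grid from the question, a minimal sketch:
var grid: [[Int]] = [
    [0, 1, 0, 0],
    [0, 1, 1, 0],
    [1, 0, 0, 1]
]
grid[2][1] = 6    // store 6 at coordinate [2,1] (row 2, column 1)
print(grid[2][1]) // 6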
So my main objective is to take a matrix of the form
matrix = [a, 1; b, 2; c, 3]
and a list of identifiers in matrix[:,1]
list = [a; c]
and generate a new matrix
new_matrix = [a, 1;c, 3]
My problem is that I need to import the data used in 'matrix' from a tab-delimited text file. To get this data into MATLAB I use the code:
matrix_open = fopen(fn_matrix, 'r');
matrix = textscan(matrix_open, '%c %d', 'Delimiter', '\t');
which outputs a cell array of two 3x1 arrays. I want to get this into one 3x2 matrix where the first column is a character, and the second column an integer (these data formats will be different in my implementation).
So far I've tried the code:
matrix_1 = cell2mat(matrix(1,1));
matrix_2 = cell2mat(matrix(1,2));
matrix = horzcat(matrix_1, matrix_2)
but this is returning a 3x2 matrix where the second column is empty.
If I just use
cell2mat(matrix)
it says it can't do it because of the different data formats.
Thanks!
This is the MATLAB help for the cell2mat function:
cell2mat Convert the contents of a cell array into a single matrix.
M = cell2mat(C) converts a multidimensional cell array with contents of
the same data type into a single matrix. The contents of C must be able
to concatenate into a hyperrectangle. Moreover, for each pair of
neighboring cells, the dimensions of the cell's contents must match,
excluding the dimension in which the cells are neighbors. This constraint
must hold true for neighboring cells along all of the cell array's
dimensions.
From what I understand, the contents you want to put in a matrix should be of the same type; otherwise, why do you want a matrix? You could simply create a new cell array.
It's not possible to have a normal matrix with characters and numbers. That's why cell2mat won't work here. But you can store different datatypes in a cell-array. Use cellstr for the strings/characters and num2cell for the integers to convert the contents of matrix. If you have other datatypes, use an appropriate function for this step. Then assign them to the columns of an empty cell-array.
Here is the code:
fn_matrix = 'data.txt';
matrix_open = fopen(fn_matrix, 'r');
matrix = textscan(matrix_open, '%c %d', 'Delimiter', '\t');
X = cell(size(matrix{1},1),2);
X(:,1) = cellstr(matrix{1});
X(:,2) = num2cell(matrix{2});
The result:
X =
'a' [1]
'b' [2]
'c' [3]
Now for the second part of the question: extracting the entries whose letter matches one in the list. You can use ismember and logical indexing like this:
list = ['a'; 'c'];
sel = ismember(X(:,1),list);
Y(:,1) = X(sel,1);
Y(:,2) = X(sel,2);
The result here:
Y =
'a' [1]
'c' [3]
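As a side note, the two column assignments can be collapsed into a single logical-indexing step over all columns:
Y = X(sel, :);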
How can I get hold of a column or row of a 2D array in F# (ideally as a 1D array, but a seq would be nice as well)? Obviously I could write it myself, but you would think it must already be provided...
E.g. I am after built-in equivalent for:
let row i array = seq { for j in 0 .. (Array2D.length2 array)-1 do yield array.[i,j]}
I don't think there is a built-in function for this.
You could slice the array and flatten the slice using Seq.cast:
let row i (arr: 'T[,]) = arr.[i..i, *] |> Seq.cast<'T>
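A counterpart sketch for columns using the same slicing approach, plus Seq.toArray if you want a true 1D array (column is a made-up name here):
let column j (arr: 'T[,]) = arr.[*, j..j] |> Seq.cast<'T>

// Usage:
let m = array2D [ [1; 2; 3]; [4; 5; 6] ]
let firstRow  = row 0 m    |> Seq.toArray // [|1; 2; 3|]
let secondCol = column 1 m |> Seq.toArray // [|2; 5|]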