How can I scan a Data Matrix barcode? - iOS 6

I want to scan Data Matrix codes. I tried the ZBar SDK, but it does not scan Data Matrix. Is there any way to scan Data Matrix codes using ZBar, or is there any other free library?

You can use the Apple API:
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVMetadataMachineReadableCodeObject_Class/Reference/Reference.html
A more complete answer is in this post: Scanning barcode with AVCaptureMetadataOutput and AVFoundation
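Note that barcode scanning through AVCaptureMetadataOutput requires iOS 7, and the Data Matrix type (AVMetadataObjectTypeDataMatrixCode) only arrived in iOS 8, so on iOS 6 itself you would still need a third-party library. For later systems, a minimal Objective-C sketch (no preview layer, minimal error handling; assumes a class that adopts AVCaptureMetadataOutputObjectsDelegate):

#import <AVFoundation/AVFoundation.h>

// Keep the session in a strong property in real code so ARC does not
// deallocate it when this method returns.
- (void)startScanning
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) { NSLog(@"%@", error); return; }
    [session addInput:input];

    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [session addOutput:output];  // add the output before setting metadataObjectTypes
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    output.metadataObjectTypes = @[AVMetadataObjectTypeDataMatrixCode];

    [session startRunning];
}

// AVCaptureMetadataOutputObjectsDelegate callback with the decoded codes.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataMachineReadableCodeObject *code in metadataObjects) {
        NSLog(@"Scanned Data Matrix: %@", code.stringValue);
    }
}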

Related

Working with CSV image data for a CNN in Julia - format problem

I am trying to build a convolutional neural network on the MNIST sign-language dataset. It is provided in CSV format where each row is one picture and there are 784 columns, each referring to a single pixel (the pictures are 28x28).
My problem is that in order to run the network I need to convert my data to a different format, the same as the format of the built-in Fashion-MNIST dataset, which is:
Array{Array{ColorTypes.Gray{FixedPointNumbers.Normed{UInt8,8}},2},1}
I would like to end up with the following format, where my data is joined with the encoded labels:
Array{Tuple{Array{Float32,4},Flux.OneHotMatrix{Array{Flux.OneHotVector,1}}},1}
I was trying to use the reshape function to convert it to a 4-dimensional array, but all I get is:
7172×28×28×1 Array{Float64,4}
My labels are in the following (correct) format:
25×7172 Flux.OneHotMatrix{Array{Flux.OneHotVector,1}}
I understand that the proper format is somehow an array of arrays (of tuples), while my data is a single 4-dimensional array, but I can't figure out how to change that.
I am new to Julia, and some of the code I am using was written by someone else.
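One possible approach (a hedged Julia sketch; the variable names and stand-in data below are assumptions, not from the thread): permute the 7172x784 matrix so each column is one image, reshape into the width x height x channels x batch order that Flux's conv layers expect, then pair minibatches with the matching one-hot label columns:

using Flux

# Stand-ins for the parsed CSV data; names and contents are assumptions.
X = rand(Float32, 7172, 784)   # one image per row, 784 pixel columns
labels = rand(0:24, 7172)      # one class label per image

# permutedims makes each column one image; reshape gives 28x28x1x7172.
imgs = reshape(permutedims(X), 28, 28, 1, :)
y = Flux.onehotbatch(labels, 0:24)   # 25x7172 OneHotMatrix

# Pair image minibatches with the matching label columns, yielding an
# array of (Array{Float32,4}, OneHotMatrix) tuples as in the target format.
batchsize = 128
parts = Iterators.partition(1:size(imgs, 4), batchsize)
train = [(imgs[:, :, :, p], y[:, p]) for p in parts]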

How can I speed up searching a file of about 2 GB in C under Linux?

I'm developing a C program under Linux that searches a large file (~2 GB).
The file consists of text rows terminated by '\n'; each row consists of five fields separated by '|', like a|b|c|d|e|.
I then need to parse every row to perform the search.
The file is sorted by field a, but the search is done mainly using fields b and c as search keys.
I tried using a memory-mapped file to speed up the search through the file, but I did not get satisfactory results, mainly - I think - for the reasons explained above.
Now I am thinking of using an array into which I insert the data already parsed into structs, then sorting the array by fields b and c and applying a binary search only when the search keys are b and c; otherwise I fall back to a sequential search.
Is it useful to use mapped memory to fill an array from a sequential file?
Is this a good way to improve the search?
Any suggestions are appreciated.
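A minimal C sketch of the approach described in the question (the field widths, file name, and search keys are assumptions, and error handling is mostly omitted): parse each row into a struct, qsort the array by (b, c), then bsearch:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef struct {
    char a[32], b[32], c[32], d[32], e[32];  /* field widths are assumptions */
} Row;

/* Order rows by field b, then by field c. */
static int cmp_bc(const void *p, const void *q)
{
    const Row *x = p, *y = q;
    int r = strcmp(x->b, y->b);
    return r ? r : strcmp(x->c, y->c);
}

int main(void)
{
    FILE *f = fopen("data.txt", "r");
    if (!f) { perror("fopen"); return 1; }

    size_t n = 0, cap = 1024;
    Row *rows = malloc(cap * sizeof *rows);
    char line[512];
    while (fgets(line, sizeof line, f)) {
        if (n == cap) rows = realloc(rows, (cap *= 2) * sizeof *rows);
        /* parse "a|b|c|d|e|"; assumes no empty fields */
        if (sscanf(line, "%31[^|]|%31[^|]|%31[^|]|%31[^|]|%31[^|]|",
                   rows[n].a, rows[n].b, rows[n].c, rows[n].d, rows[n].e) == 5)
            n++;
    }
    fclose(f);

    qsort(rows, n, sizeof *rows, cmp_bc);  /* sort once, search many times */

    Row key = { .b = "someb", .c = "somec" };  /* hypothetical search keys */
    Row *hit = bsearch(&key, rows, n, sizeof *rows, cmp_bc);
    if (hit)
        printf("found: %s|%s|%s|%s|%s\n", hit->a, hit->b, hit->c, hit->d, hit->e);
    else
        puts("not found");
    free(rows);
    return 0;
}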

Reading a large Excel array into MATLAB

So I have a huge Excel file with 210 columns (from A all the way to CK). Each of these columns has 80,000-300,000 values. I want to read this into a MATLAB array. I have two problems:
1. Is there any way I can loop through the column letters iteratively (from A to CK)?
2. When I try to read the file as a whole, it says there is no storage - but I am able to create a matrix of ones of size 300000x210, so I'm a bit puzzled and don't know what to do.
Thanks!
Save the file in .csv format from Excel, then use load -ascii in MATLAB.
You can loop through the Excel columns with the xlRange option of xlsread (doc).
Note that both num = xlsread(...) and [num,txt,raw] = xlsread(...) duplicate the information being read across several variables (type edit xlsread in the command window to see why).
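A hedged sketch of reading the sheet one block of columns at a time with xlsread's range argument (the file name, row count, and block size are assumptions, and the helper xlcol is hypothetical, not a built-in; arbitrary ranges may require the Excel-backed xlsread on Windows, and on MATLAB releases older than R2016b, put xlcol in its own xlcol.m file):

nrows = 300000; ncols = 210; blk = 30;   % sizes are assumptions
data = zeros(nrows, ncols);              % preallocate the full array
for c = 1:blk:ncols
    last  = min(c + blk - 1, ncols);
    range = sprintf('%s1:%s%d', xlcol(c), xlcol(last), nrows);
    num   = xlsread('huge.xlsx', 1, range);
    data(1:size(num,1), c:last) = num;
end

function s = xlcol(n)
% Column number -> Excel letters: 1 -> 'A', 27 -> 'AA', 89 -> 'CK'.
s = '';
while n > 0
    r = mod(n - 1, 26);
    s = [char('A' + r), s];
    n = floor((n - 1) / 26);
end
end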

Appending data to the same dataset in HDF5 in MATLAB

I have to put a huge amount of data together into a single HDF5 dataset. Now, the thing is, if you try:
>> hdf5write('hd', '/dataset1', [1;2;3])
>> hdf5write('hd', '/dataset1', [4;5;6], 'WriteMode', 'append')
??? Error using ==> hdf5writec
writeH5Dset: Dataset names must be unique when appending data.
As you can see, hdf5write complains when you try to append data to the same dataset. I've looked around, and one possible workaround is to read your data back from the dataset first, then concatenate the data right in the MATLAB environment. This is not a problem for small data, of course. In this case, though, we are talking about gigabytes of data, and MATLAB starts throwing out-of-memory errors.
Because of this, what are my available options?
Note: we do not have the h5write function in our MATLAB version.
I believe the 'append' mode is for adding datasets to an existing file.
hdf5write does not appear to support appending to existing datasets. Without the newer h5write function, your best bet would be to write a small utility with the low-level HDF5 library functions that are exposed through the H5* package functions.
To get you started, the doc page has an example of how to append to a dataset.
You cannot do it with hdf5write; however, if your version of MATLAB is not too old, you can do it with h5create and h5write. This example is drawn from the h5write documentation:
Append data to an unlimited dataset:
h5create('myfile.h5','/DS3',[20 Inf],'ChunkSize',[5 5]);
for j = 1:10
    data = j*ones(20,1);
    start = [1 j];
    count = [20 1];
    h5write('myfile.h5','/DS3',data,start,count);
end
h5disp('myfile.h5');
For older versions of MATLAB, it should be possible to do it using MATLAB's low-level HDF5 API, as sketched below.
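For example, a hedged sketch that appends five more columns to the /DS3 dataset created above (a rough, untested outline; note the low-level calls take dimensions in HDF5's C-style order, the reverse of the MATLAB size, so the index arithmetic may need adjusting for your layout):

fid  = H5F.open('myfile.h5', 'H5F_ACC_RDWR', 'H5P_DEFAULT');
dset = H5D.open(fid, '/DS3');

space = H5D.get_space(dset);
[~, dims] = H5S.get_simple_extent_dims(space);   % C order: [ncols nrows]
H5S.close(space);

newcols = 5;                                      % appending a 20x5 block
H5D.set_extent(dset, [dims(1)+newcols, dims(2)]); % grow the unlimited dim

filespace = H5D.get_space(dset);
H5S.select_hyperslab(filespace, 'H5S_SELECT_SET', [dims(1) 0], [], [newcols dims(2)], []);
memspace = H5S.create_simple(2, [newcols dims(2)], []);
H5D.write(dset, 'H5ML_DEFAULT', memspace, filespace, 'H5P_DEFAULT', ones(20, newcols));

H5S.close(memspace); H5S.close(filespace); H5D.close(dset); H5F.close(fid);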

Sparse matrix conversion in C

I'm trying to write a C program to convert a sparse matrix file into a dense matrix. From what I've read, the best approach would be to use linked lists, but I have no experience with them and haven't found a good online resource explaining the subject. I'm not looking for a quick solution but rather a website or text source that explains how the process works, so I can apply it to this project. The resources I have seen suggest using three arrays to handle the values in the matrix (the row, column, and individual value) and two arrays for the vector (one for the row, the other for the column). Thanks!
The file format you've specified is for a dense matrix. A 10x10 matrix with 100 elements is dense. A sparse matrix has fewer than n*m elements and all "missing" elements are assumed to be 0. The point of doing it this way is so that matrices that are almost all zero (which happens in a lot of applications) will use less space. But using a sparse matrix format to store a dense matrix will use far more space than just a plain array.
One common sparse matrix file format is called MatrixMarket, and it looks very similar to what you described. The first line has three values: the number of rows, the number of columns, and the number of nonzero elements (called nnz). Then you have nnz lines containing the actual elements as triplets: (row #) (column #) (value)
If your sparse matrix file is in a similar format, then you don't need any sparse matrix in memory. Just scan the values and fill in your dense array directly, as in the sketch below.
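For example, a minimal C sketch that reads a triplet file with a "rows cols nnz" header straight into a zero-initialized dense array (the file name and 1-based indices are assumptions):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *f = fopen("matrix.txt", "r");   /* file name is an assumption */
    if (!f) { perror("fopen"); return 1; }

    int rows, cols, nnz;
    if (fscanf(f, "%d %d %d", &rows, &cols, &nnz) != 3) return 1;

    /* calloc zero-fills, so every "missing" sparse entry is already 0 */
    double *dense = calloc((size_t)rows * cols, sizeof *dense);

    int r, c;
    double v;
    for (int k = 0; k < nnz; k++) {
        if (fscanf(f, "%d %d %lf", &r, &c, &v) != 3) break;
        dense[(size_t)(r - 1) * cols + (c - 1)] = v;  /* 1-based -> 0-based */
    }
    fclose(f);

    for (r = 0; r < rows; r++) {          /* print the dense matrix */
        for (c = 0; c < cols; c++)
            printf("%g ", dense[(size_t)r * cols + c]);
        putchar('\n');
    }
    free(dense);
    return 0;
}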
If you do want to have a sparse matrix in memory, then there are several options for how to store it. Triplets are the easiest; that's just an in-memory version of the MatrixMarket file: three arrays, or one array of structs.
The most common structures for linear algebra operations are Compressed Sparse Column (CSC) and Compressed Sparse Row (CSR). I'll let you look those up, but if you want a C implementation to play with, you should look at Tim Davis's CSparse. This is also how MATLAB stores sparse matrices; Tim was one of the people who wrote that part of MATLAB.
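For concreteness, here is what the CSR layout looks like for a small example matrix (a sketch; the values are invented for illustration):

/* The 3x3 matrix [10 0 0; 3 9 0; 0 7 8] in CSR form: */
int    row_ptr[] = {0, 1, 3, 5};     /* row i's entries sit in [row_ptr[i], row_ptr[i+1]) */
int    col_idx[] = {0, 0, 1, 1, 2};  /* column index of each stored value */
double val[]     = {10, 3, 9, 7, 8}; /* nonzero values, row by row */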
It sounds like a linked list may not be what you're looking for, but this site offers a pretty comprehensive tutorial on the subject. It may help shed some light on whether or not it would be appropriate for your problem. Good luck!
