Is it possible to first search inside a file for a specific byte, find its position, and then read in only the bytes from the start of the file up to that byte?
At the moment I can only read some bytes (or the whole file) and search for that specific byte afterwards,
like this:
local function read_file(path)
    local file = io.open(path, "r") -- r read mode and b binary mode
    if not file then return nil end
    local content = file:read(64) -- reading 64 bytes
    file:close()
    return content
end
local fileContent = read_file("../test/l_0.dat");
print(fileContent)
function parse(line)
    if line then
        len = 1
        a = line:find("V", len +1) --find V in content
        return a
    else
        return false
    end
end
a = parse(fileContent) --position of V in content
print(a)
print(string.sub(fileContent, a)) -- content until first found V
In this example I find the first V at position 21. So it would be nice to read in only 21 bytes instead of 64 bytes or the whole file. But then I would need to find the position before reading anything in. Is this possible? (The 21 bytes are variable; it could be 20 or 50 and so on.)
You can specify a file position using file:seek and read a certain number of characters (bytes) by providing an integer to file:read
local file = io.open(somePath)
if file then
    -- set cursor to -5 bytes from the file's end
    file:seek("end", -5)
    -- read 3 bytes
    print(file:read(3))
    file:close()
end
You cannot search in a file without reading it. If you don't want to read the entire file you can read it in chunks either by reading it linewise (if there are lines in your file) or by reading a specific number of bytes each time until you find something.
Of course you can also read it byte-wise.
You can argue whether it makes more sense to read a 64-byte file as a whole or in chunks; in most scenarios you won't notice any difference.
So you could file:read(1) in a loop that terminates once you find a V or reach the end of the file.
local file = io.open(somePath)
if file then
    local data = ""
    for i = 1, 64 do
        local b = file:read(1)
        if not b then print("no V in file") data = nil break end
        data = data .. b
        if b == "V" then print(data) break end
    end
    file:close()
end
vs
local file = io.open("d:/test.txt", "r")
if file then
local data = file:read("a")
local pos = data:find("V")
if pos then
print(data:sub(1, pos))
end
file:close()
end
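For a larger file, the chunked reading mentioned above is a middle ground between these two; a minimal sketch (the 16-byte chunk size is an arbitrary choice):
local file = io.open(somePath)
if file then
    local data = ""
    while true do
        local chunk = file:read(16)    -- read the next chunk of at most 16 bytes
        if not chunk then print("no V in file") break end
        data = data .. chunk
        local pos = data:find("V")     -- search everything read so far
        if pos then print(data:sub(1, pos)) break end
    end
    file:close()
end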
Or, correct your code to...
local function read_file(path)
    local file = io.open(path, "r") -- r read mode and b binary mode
    if not file then return nil end
    local content = file:read(64) -- reading 64 bytes
    file:close()
    return content
end
local fileContent = read_file("test/l_0.dat") -- '../' causing error
print(fileContent)
local function parse(line)
    if line then
        local len = 1
        local a = line:find("V", len +1) --find V in content
        return a
    else
        return false
    end
end
print(fileContent:sub(1, parse(fileContent))) -- content until first found V
That puts out...
0123456789VabcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ
0123456789V
If you want V to be a (single) delimiter, you probably don't want to output it.
Meet the strength of string.sub(text, start, stop)...
print(fileContent:sub(1, parse(fileContent) - 1)) -- before V
-- 0123456789
print(fileContent:sub(parse(fileContent) + 1, -1)) -- after V
-- abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ
A simple question. I have a file test.txt at userPath().."/log/test.txt" with 15 lines.
I want to read the first line and then remove it, so that test.txt ends up with 14 lines.
local iFile = 'the\\path\\test.txt'
local contentRead = {}
local i = 1
file = io.open(iFile, 'r')
for lines in file:lines() do
    if i ~= 1 then
        table.insert(contentRead, lines)
    else
        i = i + 1 -- this will prevent us from collecting the first line
        print(lines) -- just in case you want to display the first line before deleting it
    end
end
io.close(file)
local file = io.open(iFile, 'w')
for _,v in ipairs(contentRead) do
    file:write(v.."\n")
end
io.close(file)
There must be other ways to simplify this, but basically what I did in the code was:
Opened the file in read mode and stored all lines of text except the first line in the table contentRead.
Opened the file again, but this time in write mode, causing the entire contents of the file to be erased, and then rewrote all the contents stored in the table contentRead into the file.
Thus, the first line of the file was "deleted" and only the other 14 lines remained.
I have been given a question: write a function "that returns a count of the number of characters in the file whose name is given as a parameter."
So if a file called "data.txt" contains "Hi there!" and I print the result of my code below, it will return a value of 10 (which is correct).
"""Attemping Question 7.
Author: Ark
Date: 28/04/2015
"""
def file_size(filename):
"""extracts word from a line"""
filename = open(filename, 'r')
for line in filename:
result = len(line) #count number of characters in a line.
return result
However, let's say I have made another file called "data2.txt" and it contains:
EEEEE
DDDD
CCC
BB
A
If I print this out it gives the value of 6. So my challenge starts here: what can I do with my code to read all the lines and add up their lengths?
print(file_size("data2.txt"))
expected 16 characters (?)
You must sum the lengths of the lines, right now you return the length of the very first line.
Also, you must strip a trailing newline if it's there. This should work:
def character_count(filename):
    with open(filename) as f:
        return sum(len(line.rstrip("\n")) for line in f)
OK, so in my program I'm supposed to take in the name of the data file from the user, then open it and read the contents. But when I open and read it, all the characters just end up being ****** and all the integers end up being 0. I don't know if it's how I'm reading in the file or the format.
The file will contain something like this: (where the number of cities is the first number)
4
SanDiego
0
350
900
1100
Phoenix
350
0
560
604
Denver
900
560
0
389
Dallas
1100
604
389
0
So far my code is this: first I read in the first number, then on every pass where the loop index equals number*J + J the value is supposed to go into the character array city. The rest of the numbers I am storing in an integer array, but I really want them in an integer matrix called d_table; I couldn't think of a way to do that directly during the read.
PROGRAM p4
IMPLICIT NONE
INTEGER :: number, status, I, J, K, permutation = 0, distance = 0, best_distance = 999999
CHARACTER(50) :: filename ! Filenames longer than 50 are truncated
CHARACTER(20), DIMENSION(10) :: city
INTEGER, DIMENSION(100) :: temp
INTEGER, DIMENSION(10,10) :: d_table
INTEGER, DIMENSION(10) :: path, best_path
WRITE (*, '(1x,A)', ADVANCE="NO") "Enter filename: "
READ *, filename
! Open the file we created and read the contents
OPEN(UNIT=15, FILE=filename, STATUS="OLD", ACTION="READ",&
IOSTAT=status)
IF(status /= 0) THEN
   PRINT *, "ERROR, could not open file for reading."
   STOP
END IF
READ (UNIT=15, FMT = 100, IOSTAT=status) number
J = 0
K = 0
DO I = 0, number*number
   IF(I == J*number+J) THEN
      READ (UNIT=15, FMT = 200, IOSTAT=status) city(J)
      J = J + 1
   ELSE
      READ (UNIT=15, FMT = 100, IOSTAT=status) temp(K)
      K = K + 1
   END IF
END DO
K = 0
DO I = 0, number
   DO J = 0, number
      d_table(I,J) = temp(K)
      K = K + 1
   END DO
END DO
100 FORMAT(I6)
200 FORMAT (A)
END PROGRAM p4
This line
DO I = 0, number*number
looks wonky to me; with number equal to 4 the loop is executed 17 times (I runs from 0 to 16). Surely you want to read number groups of 5 lines, each group being one city name followed by four integers? That would be a good case for a little loop nest, something like
do ix = 1, number
   read(15,*) city(ix)
   do jx = 1, 4
      read(15,*) d_table(ix,jx)
   end do
end do
Given such a simple input file format there's no need to bother with format statements, list-directed input will work just fine.
I can't see the point of all the index arithmetic the code is doing, perhaps I've missed something.
Suppose I've got a file filled with data separated by tabs (in this format):
1 2
3 4
5 6
and I'd like to read these data in pairs from a single line, so that the result would be as follows:
var1=1;
var2=2;
var1=3;
var2=4;
var1=5;
var2=6;
How am I supposed to do this? Right now my code works for a single value on a single line (see the following):
read_from_file: process(clk)
    variable input_data: natural;
    variable ILine: line;
begin
    for i in 0 to NSAMPLES-1 loop
        readline (input_vectors, ILine);
        read(ILine, input_data);
    end loop;
end process;
Thank you for helping!
You should be able to call the read function again.
For example:
read_from_file: process(clk)
    variable input_data_column1: natural;
    variable input_data_column2: natural;
    variable ILine: line;
begin
    for i in 0 to NSAMPLES-1 loop
        readline (input_vectors, ILine);
        read(ILine, input_data_column1);
        read(ILine, input_data_column2);
        -- Now you can do things like conv_std_logic_vector(input_data_column1, bitwidth) etc
    end loop;
end process;
Note: I am not positive if any whitespace will be accepted. Spaces have worked for me.
I have a CSV file. I want to read this file and do some pre-calculations on each row, to see for example whether that row is useful for me or not, and if it is, save it to a new CSV file.
Can someone give me an example?
In more detail, this is what my data looks like (string,float,float); the numbers are coordinates:
ABC,51.9358183333333,4.183255
ABC,51.9353866666667,4.1841
ABC,51.9351716666667,4.184565
ABC,51.9343083333333,4.186425
ABC,51.9343083333333,4.186425
ABC,51.9340916666667,4.18688333333333
Basically I want to save the rows that have distances of 50 or more to a new file. The string field should also be copied.
Thanks.
You could actually use xlsread to accomplish this. After first placing your sample data above in a file 'input_file.csv', here is an example of how you can get the numeric values, text values, and raw data in the file from the three outputs of xlsread:
>> [numData,textData,rawData] = xlsread('input_file.csv')
numData = % An array of the numeric values from the file
51.9358 4.1833
51.9354 4.1841
51.9352 4.1846
51.9343 4.1864
51.9343 4.1864
51.9341 4.1869
textData = % A cell array of strings for the text values from the file
'ABC'
'ABC'
'ABC'
'ABC'
'ABC'
'ABC'
rawData = % All the data from the file (numeric and text) in a cell array
'ABC' [51.9358] [4.1833]
'ABC' [51.9354] [4.1841]
'ABC' [51.9352] [4.1846]
'ABC' [51.9343] [4.1864]
'ABC' [51.9343] [4.1864]
'ABC' [51.9341] [4.1869]
You can then perform whatever processing you need to on the numeric data, then resave a subset of the rows of data to a new file using xlswrite. Here's an example:
index = sqrt(sum(numData.^2,2)) >= 50; % Find the rows where the point is
% at a distance of 50 or greater
% from the origin
xlswrite('output_file.csv',rawData(index,:)); % Write those rows to a new file
If you really want to process your file line by line, a solution might be to use fgetl (see the sketch after this list):
1. Open the data file with fopen
2. Read the next line into a character array using fgetl
3. Retrieve the data you need using sscanf on the character array you just read
4. Perform any relevant test
5. Output what you want to another file
6. Back to point 2 if you haven't reached the end of your file.
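A minimal sketch of that loop for the sample data above (the input_file.csv and output_file.csv names and the 50 threshold on the second field are only illustrative assumptions):
fid_in  = fopen('input_file.csv', 'r');   % step 1: open the data file
fid_out = fopen('output_file.csv', 'w');  % file that receives the useful rows
tline = fgetl(fid_in);                    % step 2: read the next line (fgetl strips the newline)
while ischar(tline)                       % step 6: fgetl returns -1 once the end of the file is reached
    vals = sscanf(tline, '%*[^,],%f,%f'); % step 3: skip the text field, read the two numbers
    if vals(1) >= 50                      % step 4: whatever test you actually need
        fprintf(fid_out, '%s\n', tline);  % step 5: write the row to the new file
    end
    tline = fgetl(fid_in);                % back to step 2
end
fclose(fid_in);
fclose(fid_out);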
Unlike the previous answer, this is not very much in the style of Matlab but it might be more efficient on very large files.
Hope this will help.
You cannot read text strings with csvread.
Here is another solution:
fid1 = fopen('test.csv','r'); %# open csv file for reading
fid2 = fopen('new.csv','w');  %# open new csv file
while ~feof(fid1)
    line = fgets(fid1);                %# read line by line
    A = sscanf(line,'%*[^,],%f,%f');   %# sscanf can read only numeric data :(
    if A(2)<4.185                      %# test the values
        fprintf(fid2,'%s',line);       %# write the line to the new file
    end
end
fclose(fid1);
fclose(fid2);
Just read it into MATLAB in one block:
fid = fopen('file.csv');
data=textscan(fid,'%s %f %f','delimiter',',');
fclose(fid);
You can then process it using logical indexing:
ind50 = data{2}>=50;
ind50 is then an index of the rows where column 2 is greater than or equal to 50. So
data{1}(ind50)
will list all the strings for the rows of interest.
Then just use fprintf to write out your data to the new file.
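A minimal sketch of that last step, reusing data and ind50 from above (the new.csv name and the %s,%f,%f format are just illustrative assumptions):
fout = fopen('new.csv', 'w');
rows = find(ind50);          % row numbers where column 2 is >= 50
for k = rows'                % iterate over the selected rows
    fprintf(fout, '%s,%f,%f\n', data{1}{k}, data{2}(k), data{3}(k));
end
fclose(fout);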
Here is the doc to read a CSV: http://www.mathworks.com/access/helpdesk/help/techdoc/ref/csvread.html
and to write: http://www.mathworks.com/access/helpdesk/help/techdoc/ref/csvwrite.html
EDIT
An example that works:
file.csv:
1,50,4.1
2,49,4.2
3,30,4.1
4,71,4.9
5,51,4.5
6,61,4.1
The code:
File = csvread('file.csv')
[m,n] = size(File)
index = 1
temp = 0
for i = 1:m
    if (File(i,2) >= 50)
        temp = temp + 1
    end
end
Matrix = zeros(temp, 3)
for j = 1:m
    if (File(j,2) >= 50)
        Matrix(index,1) = File(j,1)
        Matrix(index,2) = File(j,2)
        Matrix(index,3) = File(j,3)
        index = index + 1
    end
end
csvwrite('outputFile.csv',Matrix)
And the output file result:
1,50,4.1
4,71,4.9
5,51,4.5
6,61,4.1
This probably isn't the best solution, but it works! We read the CSV file, check the distance for each row, and save the row to a new file.
Hope it will help!