Limiting # of Lines in a Log/Buffer File - c

I've got a small program running on an OpenWRT router that logs to a remote MySQL database. In the event that the database becomes unavailable the program writes to a buffer file (/var/buffer) to protect against data loss. Thing is, since it's being stored on the router itself, there's a chance of running out of room fairly quickly if the database is down for too long.
I figure that if I keep the file to a max of 20,000 lines, discarding the oldest ones as new ones are written (once the max size has been reached), I can minimize my data losses and not have to worry about running out of storage space (a bit of loss isn't the end of the world, and I'd rather keep the newest stuff than the oldest).
From my research I understand that the first line of a file cannot be removed without rewriting the entire file (no good, too time-consuming), and every time I think I'm close to another solution it falls apart.
Is there a better way? Or is re-writing the 20k-line file every time I have a new line to add my only option?

You can keep a log_LastLineNo variable that stores the line number of the last line written to the log at any given moment (at the very start it will be 0).
Keep writing to the file until you have written 20,000 lines, updating log_LastLineNo as you go.
After that, start overwriting the file from the beginning and set a variable log_full = 1.
Then, when reading back:
Case 1: log_full = 0 and log_LastLineNo = [some value < 20,000]
In this case read from the start of the file up to line log_LastLineNo.
Case 2: log_full = 1 and log_LastLineNo = [some value < 20,000]
In this case start reading from line log_LastLineNo + 1 up to line 20,000, then wrap around and read from the start up to log_LastLineNo (a rough C sketch of this follows below).

Related

Is it possible to scan a file in reverse from the last line in C?

I am running a simulation and want to add an option to continue evolving from the last iteration of a previous run. In order to do so, I need to read the last 2 lines of data from a file. Is there any way to do this without using fscanf to scan from the beginning of the file?
Have the previous run record, in another file, the ftell() values of the last few lines along with other info noting the metadata of the data file (e.g. its date-time-modified).
A subsequent run can use this info to begin where the prior run left off.
If this side file is missing or does not agree with the current state of things, walk the files with fgetc(), fgets(), etc. to find where to begin again.
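A rough C sketch of that side-file idea (the file names, the two-line window and the function names are illustrative, and real code would also record and verify the date-time-modified metadata mentioned above):

#include <stdio.h>

/* Producing run: record the byte offsets of the last two lines of the
 * data file in a small side file. */
int save_offsets(const char *data_path, const char *side_path)
{
    FILE *data = fopen(data_path, "r");
    if (!data) return -1;

    long offs[2] = { 0, 0 };            /* offsets of the last two lines */
    char line[4096];
    for (;;) {
        long pos = ftell(data);
        if (!fgets(line, sizeof line, data))
            break;
        offs[0] = offs[1];              /* slide the two-line window */
        offs[1] = pos;
    }
    fclose(data);

    FILE *side = fopen(side_path, "w");
    if (!side) return -1;
    fprintf(side, "%ld %ld\n", offs[0], offs[1]);   /* plus any metadata */
    fclose(side);
    return 0;
}

/* Subsequent run: seek straight to the second-to-last line. Returns -1 if
 * the side file is missing or unusable, in which case the caller falls
 * back to walking the file with fgets() as described above. */
int resume_from_offsets(const char *data_path, const char *side_path)
{
    long offs[2];
    FILE *side = fopen(side_path, "r");
    if (!side) return -1;
    if (fscanf(side, "%ld %ld", &offs[0], &offs[1]) != 2) {
        fclose(side);
        return -1;
    }
    fclose(side);

    FILE *data = fopen(data_path, "r");
    if (!data) return -1;
    if (fseek(data, offs[0], SEEK_SET) != 0) {
        fclose(data);
        return -1;
    }

    char line[4096];
    while (fgets(line, sizeof line, data))
        printf("%s", line);             /* the last two lines of data */
    fclose(data);
    return 0;
}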

Reading data from a continuously updating log file

I have a log file (similar to a web server's access log) which I need to continuously read and use a regular expression to fetch a value in each line.
For example, if you could imagine reading from a web server's access log, I would be getting the IP address of each visitor as their visit is written to the log.
Of course this is easy to do using the linux command line (combination of tail and sed, or something like that), but I want to do it using C code.
I guess I could open the file, read X lines, save the number of the last line I read, and then on the next round open the file again, skip to line X, and read from there, and so on, but this seems very clumsy.
Is there a known way or best practice for reading data from a continuously updating log file?
Thanks
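One common way to do this in C, as a minimal sketch rather than a definitive answer: remember the byte offset where the last read stopped (ftell()) and seek back to it (fseek()) instead of counting lines, so nothing before that point is ever re-read. The path and the regex placeholder are illustrative:

#include <stdio.h>
#include <unistd.h>

int main(void)
{
    const char *path = "/var/log/access.log";   /* illustrative path */
    long offset = 0;                             /* how far we have read */
    char line[4096];

    for (;;) {
        FILE *fp = fopen(path, "r");
        if (!fp) {                               /* log may not exist yet */
            sleep(1);
            continue;
        }

        fseek(fp, offset, SEEK_SET);             /* resume where we left off */
        while (fgets(line, sizeof line, fp)) {
            /* apply the regular expression here, e.g. pull out the IP address */
            fputs(line, stdout);
        }
        offset = ftell(fp);                      /* remember the new position */
        fclose(fp);

        sleep(1);                                /* wait for new data */
    }
}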

Write to beginning of file in Scheme

I want to create a log file in Scheme, but every time I add a new entry, I want it to be at the beginning of the file, so when I read X number of logs from the file again, it reads the X newest entries from new to old.
Example:
22/02/14 13:50 Newest log entry
22/02/14 13:45 Older log entry
22/02/14 13:40 Oldest log entry
Does anyone know how to do this using the 'open-input-file' and 'open-output-file' procedures?
For the functionality you are requesting, you would need to rewrite the whole logfile every time you add a new entry, because each new entry overwrites what was previously the first entry. Programs usually don't keep the already-committed parts of a logfile around, so this introduces extra memory usage, and your program must also know when the log is being rotated so it can clear its buffer.
The standard way is to append a new entry, which leaves the previous log entries where the last log write put them.
As a compromise, you might look for a program that displays a log file in reverse order and perhaps tails it that way too. It's easy to implement, so I'd guess one already exists; writing such an app yourself would be trivial if it doesn't.

Write to and replace a file multiple times in Fortran

I'm trying to run a code that takes a particularly long time. In order for it to complete, I've separated the time step loops as such so that the data can be dumped and then re-read for the next loop:
do 10 n1 = 1, 10
   OPEN(unit=11,file='Temperature', status='replace')
   if (n1.eq.1) then
      (set initial conditions)
   elseif (n1.gt.1) then
      READ(11,*) (reads the T values from 11)
   endif
   do 20 n = 1, 10000
      (all the calculations for new T values)
      WRITE(11,*) (overwrites the T values in 11 - the file isn't empty to begin with)
20 continue
10 continue
My issue is that this only works for two n1 time steps - after it has replaced file 11 once, it no longer replaces it and just reiterates the values already in there.
Is there something wrong with the open statement? Is there a way to be able to replace file 11 more than once in the same code?
Your program will execute the open statement 10 times, each time with status = 'replace'. On the first go round presumably the file does not exist so the open statement causes the creation of a new, empty, file. On the second go round the file does exist so the open statement causes the file to be deleted and a new, empty, file of the same name to be created. Any attempt to read from that file is likely to cause issues.
I would lift the initial file opening out of the loop and restructure the code along these lines:
open(unit=11,file='Temperature', status='replace')
(set initial conditions)
(write first data set into file)
do n1 = 2, 10
   rewind(11)
   read(11,*) (reads the T values from 11)
   ! do stuff
   close(11) ! Not strictly necessary but aids comprehension of intent
   ! Now re-open the file and replace it
   open(unit=11,file='Temperature', status='replace')
   do n = 1, 10000
      (all the calculations for new T values)
      write(11,*) (writes the new T values to the freshly replaced file 11)
   end do
end do
but there are any number of other ways to restructure the code; choose one that suits you.
In passing: passing data from one iteration to the next by writing/reading a file is likely to be very slow, so I'd only use it for checkpointing to support restarting a failed execution.

C Remove the first line from a text file without rewriting file

I've got a service which runs all the time and also keeps a log file. It basically adds new lines to the log file every few seconds. I've written a small program which reads these lines and then parses them into various actions. The question I have is: how can I delete the lines which I have already parsed from the log file without disrupting the service's writing of the log file?
Usually when I need to delete a line in a file, I open the original file and a temporary one, then write all the lines to the temp file except the one I want to delete. Obviously this method will not work here.
So how do I go about deleting them?
In most commonly used file systems you can't delete a line from the beginning of a file without rewriting the entire file. I'd suggest instead of one large file, use lots of small files and rotate them for example once per day. The old files are deleted when you no longer need them.
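A small sketch of that rotation idea, assuming one file per day named after the date; the path pattern and retention policy are illustrative:

#include <stdio.h>
#include <time.h>

/* Append a line to today's log file, e.g. /var/log/myservice-2024-05-01.log.
 * Old files can later be deleted with remove() once they are no longer needed. */
int log_line(const char *line)
{
    char path[128];
    time_t now = time(NULL);
    struct tm *tm = localtime(&now);

    strftime(path, sizeof path, "/var/log/myservice-%Y-%m-%d.log", tm);

    FILE *fp = fopen(path, "a");
    if (!fp) return -1;
    fprintf(fp, "%s\n", line);
    fclose(fp);
    return 0;
}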
Can't be done, unfortunately, without rewriting the file, either in-place or as a separate file.
One thing you may want to look at is to maintain a pointer in another file, specifying the position of the first unprocessed line.
Then your process simply opens the file and seeks to that location, processes some lines, then updates the pointer.
You'll still need to roll over the files at some point lest they continue to grow forever.
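A sketch of that pointer-file approach in C, assuming the log is plain text and the "pointer" is simply the byte offset of the first unprocessed line kept in a small side file (all names here are illustrative):

#include <stdio.h>

#define LOG_PATH     "/var/log/service.log"
#define POINTER_PATH "/var/log/service.log.pos"

/* Process any lines added since the last run, then update the pointer. */
int process_new_lines(void)
{
    long offset = 0;
    FILE *pos = fopen(POINTER_PATH, "r");
    if (pos) {                                   /* resume from the last run */
        if (fscanf(pos, "%ld", &offset) != 1)
            offset = 0;
        fclose(pos);
    }

    FILE *log = fopen(LOG_PATH, "r");
    if (!log) return -1;
    fseek(log, offset, SEEK_SET);

    char line[4096];
    while (fgets(line, sizeof line, log)) {
        /* parse the line and trigger the appropriate action here */
    }
    offset = ftell(log);                         /* first unprocessed byte */
    fclose(log);

    pos = fopen(POINTER_PATH, "w");
    if (!pos) return -1;
    fprintf(pos, "%ld\n", offset);
    fclose(pos);
    return 0;
}

The service keeps appending to the log undisturbed; only the side file changes, and the log itself can still be rolled over periodically as noted above.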
I'm not sure, but I'm thinking of it this way:
A newline is just a character, so deleting a line means deleting that line's characters plus the newline character.
"Moving" all the remaining characters back (to overwrite the old line) amounts to copying each character to a new position and removing it from its old position.
So no, I don't think you can simply delete a line; you have to rewrite the whole file.
You can't, that just isn't how files work.
It sounds like you need some sort of message logging service / library that your program could connect to in order to log messages, which could then hide the underlying details of file opening / closing etc.
If each log line has a unique identifier (or even just a line number), you could simply have your parser store the identifier of the last line it parsed. That way you don't have to change anything in the log file.
If the log file then starts to get too big, you could switch to a new one each day (for example).
