I tried to write a script that creates a text file and writes/appends the image names to it, but the function FileID = CreateFileForWriting(filename) does not work; it reports that the file is in use by another process.
I don't understand this. Is the function call in the wrong format, or is something else wrong? Thanks.
Number totaln
totaln = CountDocumentWindowsOfType(5)
String filename, text
Number fileID
if (!SaveAsDialog("save text file as", GetApplicationDirectory(2,0) + "Imagename.txt", filename)) exit(0)
fileID = CreateFileForWriting(filename)
number i
for (i = 0; i < totaln; i++)
{
    image imgSRC
    imgSRC := GetFrontImage()
    string imgname = GetName(imgSRC)
    WriteFile(fileID, "imgname")
    Result("imgname")
}
Your code is nearly fine, but if you use the low-level API for file I/O you need to ensure that you close files you've opened or created.
Your script doesn't. Therefore, it runs fine exactly once but will fail on re-run (when the file is still considered open).
To fix it, you need to call CloseFile(fileID) at the end.
(BTW, if your script exits or throws after opening a file but before closing it, you have the same problem.)
However, I would strongly recommend not using the low-level API but the file streaming object instead. It also provides an automated file-closing mechanism so that you don't run into this issue.
Doing what you do in your script would be written as:
void writeCurrentImageNamesToText()
{
    number nDoc = CountImageDocuments()
    string filename
    if (!SaveAsDialog("save text file as", GetApplicationDirectory(2,0) + "Imagename.txt", filename)) return
    number fileID = CreateFileForWriting(filename)
    object fStream = NewStreamFromFileReference(fileID, 1)   // 1 = auto-close the file when the stream goes out of scope
    for (number i = 0; i < nDoc; i++)
    {
        string name = GetImageDocument(i).ImageDocumentGetName()
        fStream.StreamWriteAsText(0, name + "\n")            // 0 = use system encoding for text
    }
}
writeCurrentImageNamesToText()
I have tried doing this by encrypting individual files, but I have a lot of data (~20 GB), so it would take a lot of time. In my test it took 2.28 minutes to encrypt a single file of size 80 MB.
Is there a quicker way to password-protect that would apply to any file (text/binary/multimedia)?
If you are just trying to hide the file from others, you can try to encrypt the file path instead of encrypting the whole huge file.
For the path you mentioned, text/binary/multimedia, you can encrypt it with a method like this:
private static String getEncryptedPath(String filePath) {
String[] tokens = filePath.split("/");
List<String> tList = new ArrayList<>();
for (int i = 0; i < tokens.length; i++) {
tList.add(Hashing.md5().newHasher() // com.google.common.hash.Hashing;
.putString(tokens[i] + filePath, StandardCharsets.UTF_8).hash().toString()
.substring(2 * i, 2 * i + 5)); // to make the mapping harder to reverse, add your own custom secret here
}
return String.join("/", tList);
}
and then it becomes an encrypted path like:
72b12/9cbb3/4a5f3
Once you know the real path text/binary/multimedia, any time you want to access the file you can just use this method to map it to the encrypted path 72b12/9cbb3/4a5f3.
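For illustration, a minimal usage sketch: baseDir is a hypothetical root directory under which the data is actually stored, and the example output depends on the secret you add.

// Hypothetical usage of getEncryptedPath()
String realPath = "text/binary/multimedia";
String hiddenPath = getEncryptedPath(realPath);  // e.g. "72b12/9cbb3/4a5f3"
File file = new File(baseDir, hiddenPath);       // read/write the data here instead of under realPath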
I am new to using ImageJ and writing my own code. Anyway, I am trying to create a loop that creates all the ROIs at once, but I am having trouble doing that. So far this is the code I have:
input = "S:\\Research Projects\\BAC\\machine training set\\Results_1stRound\\2016Data_1stRound\\epoch_based_training_0.7_TPF=0.615_FP=2.110\\SID130871_9999.330357336093230241152104825447607218951\\";
output = input;
function action(input, output, filename) {
    open(input + filename);
    setThreshold(112, 255);
    run("Create Selection");
    roiManager("Add");
    roiManager("Select", 0);
    saveAs("selection", output + filename);
    close();
    roiManager("Deselect");
    roiManager("Delete");
}
list = getFileList(input);
for (i = 0; i < list.length; i++)
    action(input, output, list[i]);
What I want the loop to do is go through all the different SID folders I have, so I wouldn't need the specific SID part in the input, but I have no idea how to write a loop that looks through folders (the SID folders) and their subfolders to create the ROIs. Right now I have to put the specific SID folder in the input, so any help on how to create a loop that goes through the different SID folders at once and then creates the ROIs would be great.
You can find macro examples, and posts on the ImageJ mailing list archive, showing how to iterate over nested folders:
https://imagej.nih.gov/ij/macros/BatchProcessFolders.txt
http://imagej.1557.x6.nabble.com/batch-process-macro-td4469342.html
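As a minimal sketch along the lines of the linked BatchProcessFolders macro, a recursive helper can call the action() function from your code on every file it finds; the ".tif" filter and the prompt for the top-level folder are assumptions you would adapt to your data:

dir = getDirectory("Choose the top-level folder containing the SID folders");
processFolder(dir);

function processFolder(folder) {
    list = getFileList(folder);
    for (i = 0; i < list.length; i++) {
        if (endsWith(list[i], "/"))
            processFolder(folder + list[i]);   // descend into subfolder
        else if (endsWith(list[i], ".tif"))
            action(folder, folder, list[i]);   // reuse action() from above, writing output next to the input
    }
}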
I am trying to modify a text file. I am using PHP, but I could also use C#. The file I am working on is a text file consisting of strings, for example:
TM_len= --------------------------------------------
EMM_len --------------------------------------------
T_len=45 CTGCCTGAGCTCGTCCCCTGGATGTCCGGGTCTCCCCAGGCGG
NM_=2493 ----------------ATATAAAAAGATCTGTCTGGGGCCGAA
and I want to delete those lines from the file where the part after the name consists of only "-" and no other characters, and of course save the result back to the file.
Maybe something like this? I wrote it in an easy-to-understand, "not-shortened" way:
$newfiledata = "";
$signature = " ";
$handle = fopen("inputfile.txt", "r");                           // open file
if ($handle) {
    while (($line = fgets($handle)) !== false) {                 // read line by line
        $pos = strpos($line, $signature);                        // locate the space between the name and the sequence
        if ($pos !== false) {
            $lastpart = trim(substr($line, $pos));                // get second part of text
            $newstring = trim(str_replace('-', '', $lastpart));   // remove all dashes from that part
            if (strlen($newstring) > 0) $newfiledata .= rtrim($line) . "\r\n"; // keep the line only if characters remain
        }
    }
    fclose($handle);
}
// write new file
file_put_contents("newfile.txt", $newfiledata);
Thanks for your response, but nothing happened to the file. Please check the link to the file and another link to the desired output for the file. Download the file and the required output file.
My purpose is to parse text files and store information in respective tables.
I have to parse around 100 folders containing more than 8,000 files, with a total size of approximately 20 GB.
When I tried to store the whole file contents in a string, an out-of-memory exception was thrown.
That is
using (StreamReader objStream = new StreamReader(filename))
{
string fileDetails = objStream.ReadToEnd();
}
Hence I tried logic like this:
using (StreamReader objStream = new StreamReader(filename))
{
// Getting total number of lines in a file
int fileLineCount = File.ReadLines(filename).Count();
if (fileLineCount < 90000)
{
fileDetails = objStream.ReadToEnd();
fileDetails = fileDetails.Replace(Environment.NewLine, "\n");
string[] fileInfo = fileDetails.ToString().Split('\n');
//call respective method for parsing and insertion
}
else
{
while ((firstLine = objStream.ReadLine()) != null)
{
lineCount++;
fileDetails = (fileDetails != string.Empty) ? string.Concat(fileDetails, "\n", firstLine)
: string.Concat(firstLine);
if (lineCount == 90000)
{
fileDetails = fileDetails.Replace(Environment.NewLine, "\n");
string[] fileInfo = fileDetails.ToString().Split('\n');
lineCount = 0;
//call respective method for parsing and insertion
}
}
//when content is 90057, to parse 57
if (lineCount < 90000 )
{
string[] fileInfo = fileDetails.ToString().Split('\n');
lineCount = 0;
//call respective method for parsing and insertion
}
}
}
Here 90,000 is the bulk size that is safe to process without an out-of-memory exception in my case.
Still, the process takes more than 2 days to complete. I observed this is because of reading line by line.
Is there any better approach to handle this?
Thanks in Advance :)
You can use a profiler to detect what is eating your performance. In this case it's obvious: disk access and string concatenation.
Do not read a file more than once. Let's take a look at your code. First of all, the line int fileLineCount = File.ReadLines(filename).Count(); means you read the whole file and discard what you've read. That's bad. Throw away your if (fileLineCount < 90000) and keep only the else branch.
It almost doesn't matter if you read line-by-line in consecutive order or the whole file because reading is buffered in any case.
Avoid string concatenation, especially for long strings.
fileDetails = fileDetails.Replace(Environment.NewLine, "\n");
string[] fileInfo = fileDetails.ToString().Split('\n');
That's really bad. You read the file line by line, so why do this replace/split at all? File.ReadLines() gives you a collection of all lines; just pass it to your parsing routine.
If you do this properly, I expect a significant speedup. It can be optimized further by reading files in a separate thread while processing them in the main one, but that is another story.
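For illustration, a rough sketch of that approach; ParseAndInsert stands in for your "respective method for parsing and insertion", and the root folder path is a placeholder:

using System;
using System.Collections.Generic;
using System.IO;

class Parser
{
    static void Main()
    {
        // Walk every file under the root folder (path is a placeholder)
        foreach (string file in Directory.EnumerateFiles(@"C:\data", "*", SearchOption.AllDirectories))
        {
            // File.ReadLines streams the file lazily, one line at a time,
            // so the whole 20 GB never has to be held in memory at once.
            ParseAndInsert(File.ReadLines(file));
        }
    }

    // Stand-in for the question's parsing and insertion method
    static void ParseAndInsert(IEnumerable<string> lines)
    {
        foreach (string line in lines)
        {
            // parse the line and insert it into the corresponding table here
        }
    }
}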
I want to read data from a file and save it into an array, then insert some new data into this array and save the new data back into the same file, replacing what is already there. My code works perfectly, giving me the required data, when I have 'r+' in the fopen parameters; however, when I write to the file again it does not delete the data already in the file, it just appends it to the end, as expected. However, when I change the permission to 'w+' instead of 'r+', my code runs but no data is read from or written to the file! Does anyone know why this might be the case? My code is shown below.
N = 1021;
b = [0;0;0;0;0];
% Opens file specified by user.
fid = fopen('testing','w+');
% Read header data
Header = fread(fid, 140);
% Move to start of data
fseek(fid,140,'bof');
% Read from end of config header to end of file and save it in an array
% called data
Data = fread(fid,inf);
Data=reshape(Data,N,[]);
b=repmat(b,[1 size(Data,2)]);
r=[b ; Data];
r=r(:);
r = [Header;r];
% write new values into file
fwrite(fid,r);
fclose(fid);
% Opens file specified by user.
fid = fopen('test');
All = fread(fid,inf);
fclose(fid);
According to the documentation, the w+ option allows you to "Open or create new file for reading and writing. Discard existing contents, if any." The contents of the file are discarded, so Data and Header are empty.
You need to set the position indicator of the file handle before writing. With frewind(fid) you can set it to the beginning of the file; otherwise the file is written to / appended at the current position.
N = 1021;
b = [0;0;0;0;0];
% Opens file specified by user.
fid = fopen('testing','r+');
% Read header data
Header = fread(fid, 140);
% Move to start of data
fseek(fid,140,'bof');
% Read from end of config header to end of file and save it in an array
% called data
Data = fread(fid,inf);
Data=reshape(Data,N,[]);
b=repmat(b,[1 size(Data,2)]);
r=[b ; Data];
r=r(:);
r = [Header;r];
% write new values into file
frewind(fid);
fwrite(fid,r);
fclose(fid);
% Opens file specified by user.
fid = fopen('test');
All = fread(fid,inf);
fclose(fid);