PyCharm: File size exceeds configured limit (2.56 MB), code insight features not available

I'm working with some big txt files (some around 3 GB), and whenever I have to check one of them the message "File size exceeds configured limit (2.56 MB), code insight features not available" appears at the top of the file. I tried to change the file size limit by going to Help -> Edit custom properties and adding the following line to the file that opens:
idea.max.content.load.filesize=500000
The problem is that even after closing and re-opening PyCharm the same message appears. Do I need to do something else? Is just writing that line enough to change the limit, or does it need to be run like normal code? If so, how can I run it, since that option doesn't appear?

Instead of using the original line, I used
idea.max.intellisense.filesize=<new size in KB>
I also advise restarting the PC after adding that line in the window that opens from Help -> Edit custom properties.
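For reference, the relevant lines in the custom properties file look roughly like the sketch below. The values are in kilobytes, so a multi-GB file needs figures on this order; the exact keys and upper limits can vary between IDE versions, and the numbers here are only placeholders. The line is plain configuration, so it never needs to be "run"; saving the file and restarting the IDE is enough for it to be picked up.
idea.max.intellisense.filesize=3500000
idea.max.content.load.filesize=3500000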

Related

Identifying data file type

I have a huge 1.9 GB data file without an extension that I need to open and get some data from. The problem is that I don't know what extension it should have or what software I can open it with to view the data in a table.
It's only a 2-line file (a screenshot of its contents was attached to the original question). I already tried opening it as CSV in Excel, but that did not work. Any help?
I have never used it, but you could try this:
http://mark0.net/soft-tridnet-e.html
explained here:
http://www.labnol.org/software/unknown-file-extensions/20568/
The third "column" of that line looks 99% chance to be from php's print_r function (with newlines imploded to be able to stored on a single line).
There may not be a "format" or program to open it with if its just some app's custom debug/output log.
A quick Google search found a few programs that split large files into smaller units. That may make it easier to load into something (Notepad++ or otherwise) for reading.
It shouldn't be too hard to mash out a script to read the lines and reconstitute the session "array" into a more readable format (read: vertical, not inline), but it would have to be a one-off custom job, since no one other than the holder of your file would have a use for it.
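If you just want to get the file open somewhere, the splitting idea above is only a few lines of script. A minimal sketch in Python (the file names are placeholders; it carves the file into 50 MB pieces that a normal editor can handle):
# Sketch: split a huge file into 50 MB pieces for inspection.
# "datafile" and the output names are hypothetical.
CHUNK = 50 * 1024 * 1024
with open("datafile", "rb") as src:
    piece = 0
    while True:
        data = src.read(CHUNK)
        if not data:
            break
        with open("datafile.part%03d" % piece, "wb") as dst:
            dst.write(data)
        piece += 1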

What are possible reasons that cmd stops writing to a file with redirection?

This is on Windows 7.
I have a batch script that executes a C++ program and redirects all of its output to a file with ">".
The program takes input from servers and displays everything. We need all of this information, so we log all the output to a file. But after a short while the program stops writing to the file, even though it continues running.
The file size also shows as 0 bytes (does the OS not update it until the file is closed?). We can see the content of the file with Notepad++, but it no longer seems to update.
The file is about 250,000 lines long, and the data is simply cut off at the end. For example, where we should have a table row listing 123 567 436 975, we only see 123 567 43; the last line isn't even finished.
There is a lot of data to write and a lot of network traffic. Does the program simply give up writing output when there is too much data? Is there a way around this?
Try disabling buffering: setbuf(stdout, NULL);.
Anyway, in new versions of Windows, when a file is being created and data is being written (the classic >file scenario), the growth of the file is not always visible.
In this case, the dir command shows a 0-byte file, or stops showing increasing values.
Try reading the file with type file >nul and then dir file. This "should" refresh the file size information, but it isn't required; the file is growing, it just isn't shown.
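The fix for the C++ program itself is the setbuf call above (or an fflush(stdout) after each record). Just to illustrate the same idea in another language, here is a hypothetical logging loop in Python that flushes every line, so nothing is left sitting in the stdio buffer if the process stops unexpectedly:
import time

# Flush after every record so redirected output ("> log.txt") reaches the
# file immediately instead of waiting in a buffer.
for i in range(10):
    print("record %d value %d" % (i, i * i), flush=True)
    time.sleep(0.1)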

Writing to file error: "Too many files open"

I want to write out the array cloud, which is nothing but an array storing the coordinates of a circular cloud in two columns, latitude and longitude. I want these coordinates written to a text file in a format like this:
418.9517 43.9866
419.2260 44.1501
419.4826 44.3402
419.7190 44.5550
419.9327 44.7923
420.1217 45.0497
With this code I want to generate a number of such files, storing the coordinates of a single cloud in each file.
Here a is an array whose first two columns are latitude and longitude (the center of the circle) and whose third column is the radius of the circle, and z = size(a) (a is 2905x3). So that makes a total of 2905 files to be written.
for s =1:z(1)
r= a(s,3);
ang=0:0.1:2*pi;
xp=a(s,1) + r*cos(ang);
yp=a(s,2) + r*sin(ang);
xp=xp';
yp= yp';
cloud = [xp,yp]
filename = ['Shower_Cloud',s,'number.txt']
file_id = fopen (filename,'w');
fprintf(file_id,'%g\t',cloud[]);
fclose(file_id);
end
The error I get when I run the code is the main problem; I'm not able to diagnose it on my own, although I have a feeling it's a minor one.
>> xyz
D:\Users\Vikram\Documents\MATLAB\Manuela\Version_2\Weather\Shower\xyz.m:
Too many files open; check that FILES = 20 in
your CONFIG.SYS file.
Unexpected error status flag encountered. Resetting to proper state.
Please ask if I missed something important to mention.
This is just a guess, but one could expect strange behavior when concatenating numbers with strings.
You may want to use num2str(s) in creating the file name.
Maybe other parts of your program lose track of opened files. Use fopen('all') to list the file handles of open files. That may be a starting point for hunting down the bug.
Most likely, some bug at some point in your code caused many files to be opened without being closed. Note that even though the code you posted does indeed close every file correctly, if you are still running the same MATLAB session you may still have files open from earlier runs.
You can close all currently open files like so:
fclose all
So I suggest you type that into the MATLAB prompt first. If you are still getting the error, have a look at:
fopen all
which lists all currently open files; hopefully this will give you enough information to find the problem.

Managing log file size

I have a program which logs its activity.
I want to implement a log file mechanism to keep the log file under a certain size, let's say 10 MB.
The log file itself just holds commands the program executed; those commands are of variable length.
Right now the program runs in a Windows environment, but I'm likely to port it to UNIX soon.
I've come up with two methods for managing the log files:
1. Keep multiple smaller files, and if a new command would push the current file over its length limit, truncate the oldest file to zero size and start writing there.
2. Keep a header in the file which holds metadata about the first command in the file and the next place to write to. With this approach, I think each command would also need to hold metadata about its own length.
My questions are as follows:
In terms of efficiency, which of these methods would you use, and why?
Is there a UNIX command or function to do this easily?
Thanks a lot for your help,
Nihil.
On UNIX/Linux platforms there's a logrotate program that manages logfiles. Details can be found for example here:
http://linuxcommand.org/man_pages/logrotate8.html
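If the program can lean on a logging library, size-capped rotation (essentially method 1 above) is usually already implemented for you. As an illustration only, since the question doesn't say what language the program is written in, here is a minimal sketch using Python's standard library; the file name and limits are placeholders:
import logging
from logging.handlers import RotatingFileHandler

# Keep the active log under ~10 MB; when the limit is hit, roll over and
# keep at most 5 old files (commands.log.1 ... commands.log.5).
handler = RotatingFileHandler("commands.log",
                              maxBytes=10 * 1024 * 1024,
                              backupCount=5)
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))

logger = logging.getLogger("commands")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("example command entry")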

A simple log file format

I'm not sure if it was asked, but I couldn't find anything like this.
My program uses a simple .txt file for logging purposes; it just creates/opens a file and appends lines.
After some time I started to log quite a lot of activity, so the file became too large and hardly readable. I know this isn't the right way to do it, but I simply need to have a readable file.
So I thought maybe there's a simple file format for log files and some software to view it, or perhaps you have other suggestions on this question?
Thanks in advance for your help.
UPDATE:
It's an Access 97 application. I'm logging activities like form loading and SELECT/INSERT/UPDATE calls to MS SQL Server ...
The log file isn't really big; I just write the duration of operations, so I need a simple way to do this.
The log file is created on a user's machine. It's used for monitoring purposes, logging the durations of some activities.
Is there a way of viewing that kind of simple log file, with highlighting, using an existing tool?
Simply, I'd like to:
1) Write something like "'CurrentTime' 'ActivityName' 'Duration in milliseconds'" (maybe some additional information such as the query string) into a file.
2) Open it with a tool and view it with highlighting, or in some more readable form.
ANSWER: I've found a nice tool (LogExpert) that does everything I wanted; see my answer below.
The 3 W's: when, what, and where.
For viewing, something like multitail ("tail on steroids"): http://www.vanheusden.com/multitail/
or, for pure MS Windows, try mtail: http://ophilipp.free.fr/op_tail.htm
And to keep your files readable, you might want to start a new file when the size of the current log file exceeds a certain limit. Example:
activity0.log (1 mb)
activity1.log (1 mb)
activity2.log (1 mb)
activity3.log (1 mb)
activity4.log (205 bytes)
A fairly standard way to deal with logging from an application into a plain text file is to:
1. split the logs into different program functional areas, and
2. rotate the logs on a daily/weekly basis (i.e. split the log on a size or date basis),
so the current log is "mylog.log" or whatever, and yesterday's was "mylog.log.1" or "mylog.ddmmyyyy.log".
This keeps the size of the active log manageable. And then you can just have expiry rules so that old logs get thrown away on a regular basis.
In addition it would be a good idea to have different log levels for your application (info/warning/error/fatal) so that you're not logging more than is necessary.
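The question is about an Access 97 application, so the snippet below is only a sketch of the layout described above, written with Python's standard logging module: one fixed line format, log levels, and daily rotation with a retention limit (names and messages are placeholders):
import logging
from logging.handlers import TimedRotatingFileHandler

# Rotate at midnight, keep two weeks of old files, and use one consistent
# layout per line: timestamp, level, area, message.
handler = TimedRotatingFileHandler("activity.log", when="midnight",
                                   backupCount=14)
handler.setFormatter(logging.Formatter(
    "%(asctime)s %(levelname)-7s %(name)s %(message)s"))

logger = logging.getLogger("app.forms")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("FormLoad MainForm 135 ms")
logger.warning("SELECT on Orders took 2400 ms")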
First, check you're only logging things that are useful.
If it's all useful, make sure it is easily parsable by tools such as grep; that way you can find the information you want. Make sure the type of log entry and the date/time all conform to a consistent layout.
Build yourself a few scripts to extract the information for you.
Alternatively, use separate log files for different types of entries.
Basically, you're better off just splitting logs according to severity. You'll rarely need to read all the logs for the whole system. For example, Apache lets you configure an error log and an access log, and it's pretty obvious what information each one holds.
If you're on a Linux system, grep is your best tool for searching through logs for specific entries.
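For example, with a consistent layout the usual searches become one-liners (the file name and search terms below are only placeholders):
grep " ERROR " activity.log
grep "SELECT" activity.log | grep "2010-05-11"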
Look at popular logfiles like /var/log/syslog on Unix to get ideas:
MMM DD HH:MM:SS hostname process[pid]: message
Example out of my syslog:
May 11 12:58:39 raphaelm anacron[1086]: Normal exit (1 job run)
But to give you the perfect answer, we'd need more information about what you are logging, how much, and how you want to read the logs.
If only the size of the log file is the problem, I recommend using logrotate or something similar. logrotate watches log files and, depending on how you configure it, moves a log file to an archive directory and optionally compresses it after a given time or when it exceeds a given size. Then the original log file is truncated. For example, you could configure it to archive the log file every 24 hours or whenever the file's size exceeds 500 kB.
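A minimal logrotate configuration along those lines might look like this (the path and limits are only placeholders):
# rotate whenever the file exceeds 500 kB; keep 7 compressed archives,
# and skip silently if the file is missing or empty
/var/log/myapp.log {
    size 500k
    rotate 7
    compress
    missingok
    notifempty
}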
If this is a program, you might investigate the Apache logging libraries (http://logging.apache.org/). Out of the box they'll give you a decent logging format. They're also customizable, so you can simplify your parsing job.
If this is a script, see some of the other answers.
LogExpert
I found it here. The filter is better than in mtail. There's an option for highlighting just by adding a string, and the app is nice and readable. You can customize the columns as you like.

Resources