VS Code Server keeps downloading large files even after the file is closed

I am using Visual Studio Code Server to work on my server. Since I have a slow connection (approximately 50 Mbps, i.e. 4-5 MB/s), I struggle with opening large files. When you click to open a file in VS Code Server, it reads the entire file and sends its contents to the client side. Here is the problem: since the read speed is limited by my network speed, reading a large file takes minutes. Even if I click the close button on that file, VS Code keeps downloading the whole file, which makes VS Code unusable.
(Screenshots: idle, while opening a file, and after the file is closed.)
VS Code should stop downloading the file when the user closes it.

Related

Forcing an open file closed in (ANSI) C

I have a SCADA system (Siemens WinCC Classic 7.5) where there is data exchange between a storage facility and my system, based on text files.
Sometimes (very rarely, maybe 1 in 2000) it crashes the connection to the network drives where the files are exchanged.
The only way to fully recover is to restart the server (Windows 2019).
I suspect that the file is reopened by the SCADA program before it has actually been closed again, because the file is processed cyclically every second.
Closing of the file is implemented (with error handling as well) and works during normal operation.
However, if the file is opened and not closed by the same function, I lack a way to "forcefully" close it.
Does anyone know of the golden solution to finding and closing/terminating open files without restarting the entire server?
I use fopen() to open the file, and it's normally closed with fclose().
This works fine normally.
But if the file is opened with fopen() and not closed in the same function, the file remains open and cannot be renamed/deleted without restarting.
I hope the above makes sense, because it's a pretty complex system so it's difficult to summarize in such short terms.
I've searched far and wide and not been able to find a suitable solution.
This is made even more difficult by being restricted to only the Siemens-enabled C functions.
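For completeness, the usual defensive pattern in plain ANSI C is to make sure every exit path of the function that called fopen() also reaches fclose(), so a handle can never outlive the function. A minimal sketch of that pattern (the function name, path, and processing step are hypothetical, not from the question):

    #include <stdio.h>

    /* Process one exchange file, guaranteeing the handle is closed on
       every exit path so the next cycle can reopen, rename, or delete it.
       Returns 0 on success, -1 on failure. */
    static int process_exchange_file(const char *path)
    {
        FILE *fp;
        char line[256];
        int rc = 0;

        fp = fopen(path, "r");
        if (fp == NULL)
            return -1; /* nothing was opened, so nothing to close */

        while (fgets(line, sizeof line, fp) != NULL) {
            /* ... handle one line of exchange data ... */
        }
        if (ferror(fp))
            rc = -1; /* read error; still fall through to fclose */

        if (fclose(fp) != 0) /* fclose itself can fail, e.g. on a dropped share */
            rc = -1;
        return rc;
    }

    int main(void)
    {
        /* Hypothetical path on the exchanged network drive. */
        return process_exchange_file("Z:\\exchange\\order.txt") == 0 ? 0 : 1;
    }

If the FILE* has genuinely been lost, ANSI C itself offers no way to find or force-close it: the handle belongs to the process, so the options are restarting the owning process (Windows releases all of a process's handles when it exits) or forcibly closing the handle with an external tool such as Sysinternals Handle, which carries its own risks. Restarting just the process, if operationally acceptable, is still far cheaper than rebooting the server.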

stdlib rename API opens a new handle when writing over the network to a mounted location

I have a folder on Windows which I am mounting from the Mac side using SMB.
I'm writing a new file into this mounted location over the network.
After writing the data into the file, the handle is closed.
Then I need to rename the file over the network from the Mac side.
For that I use the stdlib rename(); the rename itself works fine, but renaming the file opens a new handle which I cannot capture.
When I then try to delete the same file over the network using the DeleteFile API, the file gets marked for deletion but actually fails to delete, and the file's permissions change because the file is still open (which I learned from Computer Management's open files view).
So the stdlib rename API is opening a new handle while renaming, and that handle prevents the file from being deleted. I'm stuck with this issue and need input on how to fix it.
Using Xcode version 7.3.1 for development.
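For reference, here is roughly the sequence the question describes, reconstructed as a minimal C sketch from the Mac side (the file names and the /Volumes/share mount point are hypothetical):

    #include <stdio.h>

    int main(void)
    {
        const char *tmp = "/Volumes/share/data.tmp";  /* hypothetical SMB mount */
        const char *dst = "/Volumes/share/data.txt";
        const char payload[] = "example data\n";
        FILE *fp;

        /* 1. Write the file and close the handle explicitly. */
        fp = fopen(tmp, "wb");
        if (fp == NULL) { perror("fopen"); return 1; }
        fwrite(payload, 1, sizeof payload - 1, fp);
        if (fclose(fp) != 0) { perror("fclose"); return 1; }

        /* 2. Rename over the network. Per the question, this succeeds,
           but afterwards the Windows side still sees an open handle. */
        if (rename(tmp, dst) != 0) { perror("rename"); return 1; }

        /* 3. A later DeleteFile on the Windows side then only marks the
           file delete-pending instead of actually removing it. */
        return 0;
    }

One possible explanation (an assumption, not something the question confirms): over SMB a rename is not a single atomic call; the client opens the file, sets rename information on it, and closes it again, and the macOS SMB client may keep that handle cached for a while, so the server still reports the file as open until the cached handle is released or times out.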

Can create a file but not write

I have an application that writes to a log file and also creates other files as it runs. What I am seeing is that, in some rare cases, the application runs fine (creating and writing files and appending to the log file) and then suddenly it can still create and delete files, but it can no longer write to a file. The log file stops in the middle of a line, and other files that are created end up 0 bytes: they can be created, but we cannot write to them.
Rebooting the machine helps, and we can create and write files with no issue after the reboot. All affected machines are running either RHEL 6 or CentOS 6. This is not a drive space issue; there is tons of room left on the machines that display the issue.
Does anyone have any clue what might cause this behavior?
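When stdio is involved, a write can appear to succeed while the buffered flush is what actually fails, so a useful first step is to surface the exact errno. A minimal diagnostic sketch (the append_line helper and the log path are hypothetical):

    #include <errno.h>
    #include <stdio.h>
    #include <string.h>

    /* Append one line and report the exact errno if any stage fails.
       ENOSPC, EDQUOT, and EIO point to very different root causes:
       free blocks vs. disk quotas vs. hardware/filesystem errors. */
    static int append_line(const char *path, const char *line)
    {
        FILE *fp = fopen(path, "a");
        if (fp == NULL) {
            fprintf(stderr, "fopen(%s): %s\n", path, strerror(errno));
            return -1;
        }
        if (fputs(line, fp) == EOF || fflush(fp) == EOF) {
            fprintf(stderr, "write(%s): %s\n", path, strerror(errno));
            fclose(fp);
            return -1;
        }
        if (fclose(fp) != 0) {
            fprintf(stderr, "fclose(%s): %s\n", path, strerror(errno));
            return -1;
        }
        return 0;
    }

    int main(void)
    {
        /* Hypothetical log path; exits nonzero if the append failed. */
        return append_line("/var/log/myapp/app.log", "heartbeat\n") == 0 ? 0 : 1;
    }

One cause that could match these symptoms is an exceeded per-user block quota (EDQUOT): creating an empty file mostly consumes an inode, while writing data consumes quota-counted blocks, so creates keep succeeding while writes fail even though the disk itself has free space.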

Too many open files error when reading from a directory

I'm using readTextFile(/path/to/dir) to read batches of files, do some manipulation on the lines, and save them to Cassandra.
Everything seemed to work just fine until I reached more than 170 files in the directory (files are deleted after a successful run).
Now I'm receiving "IOException: Too many open files", and a quick look at lsof shows thousands of file descriptors open once I run my code.
Almost all of the file descriptors are "Sockets".
Testing on a smaller scale with only 10 files resulted in more than 4000 file descriptors being opened; once the script finished, all the file descriptors were closed and back to normal.
Is this normal behavior for Flink? Should I increase the ulimit?
Some notes:
The environment is Tomcat 7 with Java 8 and Flink 1.1.2 using the DataSet API.
The Flink job is scheduled with Quartz.
All those 170+ files together sum to about 5 MB.
Problem solved.
After narrowing down the code, I found that calling "Unirest.setTimeouts" inside a highly parallel "map()" step caused too many thread allocations, which in turn consumed all my file descriptors.

VB.Net Transferring A Windows Form App Onto A USB Stick

I am just wondering how you transfer a Windows Forms application onto a flash drive, because every time I have tried, after it's transferred I click to open it and an error message displays saying:
"Cannot find the file 'F:\Vending Machine.vb'."
I need to be able to transfer it so that when it opens I am able to edit the code and the appearance of the application, just as I would if I were to open a new Windows Forms application now.
Thank you
Marcus.
You will need to transfer the entire project/solution to the USB stick.
Transferring the already compiled .exe should open the latest build of your application. Transferring only a portion of the solution folder and then trying to open the .sln will cause errors for the files you have left out.
1. Clear your flash drive.
2. Copy the entire folder (not just its contents) to your flash drive (C:\Users\Marcus's MPC\Documents\Visual Studio 2010\Projects\Vending Machine).
3. Attempt to open the .sln.
Based on your additional comments, it seems you are not moving all of the files the solution needs. If you have files stored in other locations (like My Documents), those will need to be reconfigured to work with the solution, or you will have to mimic the original environment.
I would recommend reconfiguring the project to consolidate any stray files you have.
