While running a standalone Java application from a .bat file, sometimes when the input file has a large number of records, dump files get created and fill up the space on the drive.
I tried increasing the heap memory in Eclipse, but after converting the application into a runnable jar file it does not take VM arguments such as -Xmx2048m, -Xms1024m, etc., so the heap memory cannot be increased while running the jar file from a .bat file.
java -Xmx2048m -Xms2048m -version
set "curpath=%cd%"
start jdk file path\jdk\jdk\bin\javaw.exe -jar "Jarfile path /filename.jar"
java -version
pause
Above are the lines I wrote in the .bat file to run the jar file.
As I understand it, this should increase the heap memory, the application should not hang, and no dump file should be created. But I am still getting those dump files, and they are filling up my drive space.
If anyone knows how to resolve this please help me.
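For comparison, here is a minimal sketch of a .bat that passes the heap settings to the same JVM that actually launches the jar (the JDK and jar paths below are placeholders, not the real ones from the question):
@echo off
set "curpath=%cd%"
rem The -Xms/-Xmx flags must go on the javaw.exe call that runs the jar,
rem not on a separate "java -version" invocation.
start "" "C:\path\to\jdk\bin\javaw.exe" -Xms1024m -Xmx2048m -jar "C:\path\to\filename.jar"
pause
With this layout the heap limits apply to the process that actually executes the jar, which is what determines whether it runs out of memory.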
I want to create a batch file that will first send me to the download pages and then run the installers of multiple programs sequentially. I know I can use this kind of code to start setups with exactly known names:
start https://www.7-zip.org/download.html
timeout /t 5
start DownloadPage1
pause
start 7z2106-x64.exe
timeout /t 5
start program1setup.exe
(BAT will be in the download folder, where setups will go; setups will be downloaded manually)
But it will only work with exact file names. I want it to work with any version; for example, for 7-Zip, a file that starts with "7z" and ends with "x64", with unknown version characters in the middle:
start 7z?x64.exe
I'm open to any suggestion for improving this basic code, since I'm an even more basic "programmer".
Thanks.
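One way to launch whichever matching installer is present is a for loop over a wildcard pattern. This is only a sketch, assuming the .bat sits in the same folder as the downloaded setups and that the patterns below match your actual file names:
rem Run every installer whose name starts with "7z" and ends with "x64.exe",
rem waiting for each one to finish before starting the next.
for %%f in (7z*x64.exe) do start "" /wait "%%f"
for %%f in (program1*.exe) do start "" /wait "%%f"
The /wait switch makes start block until the installer exits, which replaces the manual pause between setups.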
When running a build on a newly installed Jenkins server on CentOS 7, Xvfb fails with the following:
Xvfb starting$ /bin/Xvfb -displayfd 2 -screen 0 1024x768x8 -fbdir /var/lib/jenkins/xvfb-272-..fbdir6345857630426455925
FATAL: Cannot run program "/bin/Xvfb": error=2, No such file or directory
java.io.IOException: error=2, No such file or directory
Compared to our original server, the name of the fbdir is different. I believe it should be in the format /var/lib/jenkins/xvfb-<build no.>-<unique number>.fbdir
How is the -fbdir name being generated and what creates the directory?
This is on Jenkins 2.176.1 with Xvfb plugin 1.1.3 on CentOS 7.6.1810.
Xvfb starting$ /usr/bin/Xvfb -displayfd 2 -screen 0 1920x1080x24 -fbdir /var/lib/jenkins/xvfb-214-..fbdir17701667040157463918
How is the -fbdir name being generated?
It's unique & random and the format is
/var/lib/jenkins/xvfb-<build no.>-..fbdir<random unique no.>
What creates the directory?
When the job is executed, Jenkins looks in the configured directory for the Xvfb executable (i.e. /usr/bin) and creates a directory for its own use under the Jenkins home (e.g. /var/lib/jenkins/xvfb-214-..fbdir17701667040157463918), in which the memory-mapped files containing the framebuffer memory are created so that all GUI operations are performed in virtual memory.
-fbdir framebuffer-directory
This option specifies the directory in which the memory mapped files containing the framebuffer memory should be created. This option only exists on machines that have the mmap and msync system calls.
Refer: https://docs.oracle.com/cd/E86824_01/html/E54763/xvfb-1.html
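As for the FATAL: Cannot run program "/bin/Xvfb" part of the error, it suggests the Xvfb installation's "directory in which to find Xvfb executable" is pointed at /bin while the binary lives elsewhere, or Xvfb is not installed on the node at all. A quick check on the CentOS 7 node (a sketch, not taken from the original post):
# Confirm where the Xvfb binary actually is, then point the Xvfb
# installation's configured directory at that path in Jenkins.
which Xvfb
# If it is missing, it can be installed from the standard repositories:
sudo yum install -y xorg-x11-server-Xvfb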
To run tests in Jenkins I use the following batch command:
"C:\Program Files (x86)\NUnit 2.6.4\bin\nunit-console.exe" /result:TestResult.xml "C:\Users\Denis\Documents\Visual Studio 2013\Projects\MyProject\App.nunit"
Here is how it looks in Jenkins:
The problem is: it doesn't generate the TestResult.xml file!
When I run the same command as a .bat file from my desktop, it creates the TestResult.xml file.
Any ideas what is wrong with Jenkins?
P.S. I searched for the created file in all possible folders and even via "search".
I came to the conclusion that the file is generated and then removed, or something like that.
The thing is, I don't actually see this file, but Jenkins generated a report based on it! So I think maybe the file was removed automatically after the report was generated.
I faced the same issue, and I found the test results XML file under my user directory on Windows 7.
Note: I think it's some problem with NUnit, in that it doesn't export the file to the location whose path we provide.
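A workaround, sketched here on the assumption that the relative /result path is being resolved against a different working directory when the command runs under Jenkins, is to pass an absolute path; Jenkins exposes the job workspace to batch steps as %WORKSPACE%:
rem Write the result file to an absolute path inside the Jenkins workspace
rem instead of relying on the process's current working directory.
"C:\Program Files (x86)\NUnit 2.6.4\bin\nunit-console.exe" /result:"%WORKSPACE%\TestResult.xml" "C:\Users\Denis\Documents\Visual Studio 2013\Projects\MyProject\App.nunit"
Afterwards TestResult.xml ends up in the job's workspace, where it is easy to find and archive.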
My Java Eclipse Hadoop MapReduce program displays an error that it is unable to locate the input file. I copied the files to the Hadoop directory via the terminal using Hadoop commands. I can see the files in the Eclipse DFS location, and also with the command hadoop dfs -ls in the terminal. When I created a normal folder (not HDFS) the problem was solved, but then the program accesses the file from the local file system.
I installed Hadoop 1.2.1 on a 32-bit Red Hat server and am using Eclipse Luna. I have already included the Hadoop plugins and the external jar files from the Hadoop library. The input and output paths are given through run-time arguments.
First of all, the Hadoop Eclipse plugins are not very reliable. I had the same problem when using the plugin with Eclipse Luna, but that compatibility issue was solved when I used Eclipse Juno. There is also no suitable plugin available for the Hadoop 2.x versions.
You can use Maven to manage all the Hadoop dependencies, just like the Hadoop Eclipse plugin does, except that you should run the job from the terminal.
Link on how to use Maven with Hadoop
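When the job is run from the terminal, the input and output run-time arguments can be given as HDFS paths so the program does not fall back to the local file system. A sketch with placeholder jar, class and path names (the NameNode address depends on your installation):
# Package the job (e.g. with mvn package), then run it against HDFS paths.
hadoop jar wordcount.jar WordCountDriver hdfs://localhost:9000/user/hadoop/input hdfs://localhost:9000/user/hadoop/output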
Accept my answer if it fits your case. :)
I have this error message:
Preparing to deploy: Created staging directory at: 'C:\Users\leet\AppData\Local\Temp\appcfg4768292050846213939.tmp'
Scanning for jsp files. Compiling jsp files. Scanning files on local disk.
java.io.IOException: Jar C:\Users\leet\AppData\Local\Temp\appcfg4768292050846213939.tmp\WEB-INF\lib\appengine-api-1.0-sdk-1.7.7.jar is too large. Consider using --enable_jar_splitting.
I issued the command like this, but it does not work even with --enable_jar_splitting.
"C:\Program Files\Java\jdk1.7.0_17\bin\java.exe" -Xmx1100m -cp
"%~dp0..\lib\appengine-tools-api.jar"
com.google.appengine.tools.admin.AppCfg --enable_jar_splitting -e
user#domain.com update "C:\myfolder\myproject\war"
Any comment?
The Java App Engine 1.7.7.1 SDK has been released to address this Windows-specific issue.
The Google Eclipse plugin has been updated, as well as the Google App Engine Maven artifacts and plugin (just use the 1.7.7.1 version).
To solve the library error message, you have to do this:
1) Open Windows Explorer and navigate to your Eclipse folder, e.g. ".\eclipse\plugins\com.google.appengine.eclipse.sdkbundle_1.7.7\appengine-java-sdk-1.7.7\lib\user".
2) You will then see a file called "appengine-api-1.0-sdk-1.7.7.jar"; rename it to "appengine-api-1.0-sdk-1.7.7.original" (don't delete it, as you may need it in future).
3) Copy the two split files, "appengine-api-1.0-sdk-1.7.7-1.jar" and "appengine-api-1.0-sdk-1.7.7-2.jar" (created in the steps below), and paste them into this folder.
4) Switch to the Eclipse IDE, clean the project and rebuild it. Then the error message will go away.
I solved the issue by splitting the "appengine-api-1.0-sdk-1.7.7.jar" file on my own.
In case anyone else wants to know how to do that, follow these steps (a condensed command sketch follows the list):
1) Unzip the "appengine-api-1.0-sdk-1.7.7.jar" file with 7-Zip.
2) Distribute the extracted contents between 2 folders (each about 15 MB); it doesn't matter how you split them.
3) Name the first folder "appengine-api-1.0-sdk-1.7.7-1" and the second folder "appengine-api-1.0-sdk-1.7.7-2".
4) Make sure you have a JDK installed, e.g. "C:\Program Files\Java\jdk1.7.0_17\bin", and add it to the PATH environment variable so you can run jar from that bin folder.
5) IMPORTANT: you must go into the first "appengine-api-1.0-sdk-1.7.7-1" folder, not the parent folder of those folders.
6) Launch cmd.exe and type "jar cf appengine-api-1.0-sdk-1.7.7-1.jar *" for the first archive.
7) Do the same for the second archive (repeat steps 5 and 6).
8) Go to the \war\WEB-INF\lib folder and delete the existing appengine-api-1.0-sdk-1.7.7.jar.
9) Copy the appengine-api-1.0-sdk-1.7.7-1.jar and appengine-api-1.0-sdk-1.7.7-2.jar files into the \war\WEB-INF\lib folder.
10) Now deploy it. It should work like a charm!
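Condensed into cmd commands, assuming the two folders from steps 2 and 3 already hold the split contents (the C:\split\... paths are placeholders):
rem Re-pack each half into its own jar; the trailing * packs everything
rem in the current folder, so run the jar command from inside each split folder.
cd /d C:\split\appengine-api-1.0-sdk-1.7.7-1
jar cf appengine-api-1.0-sdk-1.7.7-1.jar *
cd /d C:\split\appengine-api-1.0-sdk-1.7.7-2
jar cf appengine-api-1.0-sdk-1.7.7-2.jar *
rem Then replace the original jar in war\WEB-INF\lib with the two new jars.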
Using this instruction:
To clarify, we're going to release a minor update for 1.7.7. For the meantime, you can re-jar the file as follows:
cd to the working directory
$ jar xf somewhere\appengine-java-sdk-1.7.7\lib\user\appengine-api-1.0-sdk-1.7.7.jar
$ jar cfm somewhere\appengine-api-1.0-sdk-1.7.7.jar META-INF/MANIFEST.MF *
and replace the old jar with the newly created one.
from http://www.mail-archive.com/google-appengine@googlegroups.com/msg67954.html
and the messages from the solutions here, I was able to make it work like this:
Open a command line and go into the bin directory of your Java installation, where the jar.exe file is:
cd "C:\Program Files\Java\jdk1.7.0_17\bin\"
Then you need to find the file "appengine-api-1.0-sdk-1.7.7.jar" somewhere on your computer. It's in 2 places (not counting the temp directories): in the \war\WEB-INF\lib folder of your Eclipse project, and also in the "plugins" folder of your Eclipse installation, precisely there: \plugins\com.google.appengine.eclipse.sdkbundle_1.7.7\appengine-java-sdk-1.7.7\lib\
You just need one of those 2 paths.
Now in the command line, just type:
jar xf "C:\whatever-folder-your-eclipse-is-in\plugins\com.google.appengine.eclipse.sdkbundle_1.7.7\appengine-java-sdk-1.7.7\lib\user\appengine-api-1.0-sdk-1.7.7.jar"
and then
jar cfm "C:\whatever-folder-your-eclipse-is-in\plugins\com.google.appengine.eclipse.sdkbundle_1.7.7\appengine-java-sdk-1.7.7\lib\user\appengine-api-1.0-sdk-1.7.7.jar" META-INF/MANIFEST.MF
Now, if you go to that folder and check the .jar file, it should be 11 MB instead of 30-something. You then need to copy this one and replace the same jar in your webapp folder in \war\WEB-INF\lib\, so that both jars named "appengine-api-1.0-sdk-1.7.7.jar" are 11 MB in size.
Now the error should be gone and you don't have to split anything.