I am using Solr 5.2.1 on an Ubuntu server. I have created some cores in it and added data to them.
After some time, due to an error, I restarted the Solr service using the following command:
service solr restart
After this command my data directory suddenly changed from
"/opt/solr/server/solr"
to
"/var/solr/data"
On the dashboard it is shown as -Dsolr.solr.home=/var/solr/data.
Now, how can I change this home data path? What is the process or command to change it?
The command to change the home path:
solr start -s /opt/solr/server/solr
Or, if you are not in the bin directory, specify its full path:
bin/solr start -s /opt/solr/server/solr
Related
How can I configure the Solr home path?
It works for me if I use the -s option in the start command:
bin/solr start -s /opt/solr/server/xy
But I want Solr to use the home folder "/opt/solr/server/xy" even when I start Solr WITHOUT the -s option, like this:
sudo service solr start
I configured SOLR_HOME=/opt/xy/apps/solr/server/xy in bin/solr.in.sh, but when I restart (stop and start) Solr, the admin panel shows solr.solr.home=/var/solr/data, which is wrong.
I also tried to change the SOLR_HOME entry in the /etc/default/solr.in.sh file, but that didn't work either.
In the default file there were two entries that overrode my setting. It is the SOLR_HOME entry in the /etc/default/solr.in.sh file that has to be overwritten: the service start script reads that file, so its value takes precedence over anything set in bin/solr.in.sh.
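For example, after editing, the relevant line in /etc/default/solr.in.sh would look like this (the path is taken from the question above; adjust it to your setup):
SOLR_HOME=/opt/solr/server/xy
Then restart the service (sudo service solr restart) and verify solr.solr.home on the dashboard.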
I am using the cloned DSpace 6-x branch and installed it via Docker. Can someone help me with backing up my local database (communities, collections, items) to a remote database?
According to the documentation we need to use the command:
dspace packager -s -t AIP -e eperson -p parent-handle file-path
But it returns an error: dspace is not a command
Anyone could help me transfer my local database to my remote repo?
Thanks!
Moving publications to a new repository will be a more substantial undertaking!
But your immediate problem seems to be that you are either not on the right container or not in the right directory when executing the dspace command, which is why it is "not found". Make sure to execute dspace on the DSpace container and specify the right/complete path. The dspace command is located in
/path/to/your/dspace-deployment-directory/bin.
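A minimal sketch of those steps, assuming your container is named dspace (check the real name with docker ps) and reusing the placeholders from the question:
docker exec -it dspace /bin/bash
cd /path/to/your/dspace-deployment-directory/bin
./dspace packager -s -t AIP -e eperson -p parent-handle file-path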
I can successfully run a gsutil command with a Windows domain account from the command line in Windows (after setting up the service account key, etc.). When I try to run the same command from a SQL Agent Job using a CmdExec task, the job hangs and doesn't complete. I can't see any logging, so I have no clue what it's waiting for. I've set up the job to run with the same proxy user that I use to run the gsutil command manually.
Any ideas how I can get this to work or how to see more logging?
Are you using standalone gsutil? Or did you get it as part of installing the Cloud SDK (gcloud)?
If the job hangs for a long time, it could be stuck retrying multiple times. To test whether this is the case, you can set the num_retries option to a very small value above 0 (e.g. 1), either in your .boto file or in the command arguments via this option:
gsutil -o 'Boto:num_retries=1' <rest of command here...>
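Or, equivalently, set it in your .boto config file (the [Boto] section may already exist there):
[Boto]
num_retries = 1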
A second thing to note (at least for the version of gsutil that doesn't come with gcloud) is that gsutil looks for your boto config file (which specifies the credentials it should use) in your home directory by default. If you're running gsutil as a different user (maybe your SQL Agent Job runs as its own dedicated user?), it will look for a .boto file in that user's home directory. The same should apply for the gcloud version -- gcloud uses credentials based on the user executing it. You can avoid this by copying your .boto file to somewhere that the job has permission to read from, along with setting the BOTO_CONFIG environment variable to that path before running gsutil. From the cmd shell, this would look something like:
set BOTO_CONFIG=C:\some\path\.boto && gsutil <rest of command here...>
Note: If you're not sure which boto config file you're normally using, you can find out by running gsutil version -l and looking at the line that displays your config path(s).
I have Solr installed on Debian, and every time a delta import takes place, a file gets created in my root directory.
The files look like this:
dataimport?command=delta-import.1
dataimport?command=delta-import.2
...
dataimport?command=delta-import.30
Every time there is a delta import, a new file gets created. I opened one of the files in vi, and it's an XML file.
Why are these files getting created, and how do I stop Solr from creating them?
To start Solr I use this command:
java -jar start.jar &
As far as I can tell, this command should not create any log files.
Please advise; I am new to Solr.
These files are generated by the command invoking the delta-import. From the file names it looks like you have a cron job that runs wget against the URL, which makes wget try to create a file named dataimport?command=delta-import in the directory where it runs. When that file already exists, wget appends .1, .2, and so on.
Check which command you run from cron (crontab -e), and if you're using wget, append -O /dev/null to the command to make wget discard the response from the server.
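For example, the fixed crontab entry could look something like this (the schedule, host, port, and core name are placeholders; substitute your own):
*/15 * * * * wget -O /dev/null "http://localhost:8983/solr/yourcore/dataimport?command=delta-import"
Quoting the URL also prevents the shell from interpreting the ? and any & characters.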
I have tried every possible solution to change the document root on Ubuntu to my new site, but nothing works. I changed the settings in the default file under sites-available (both the DocumentRoot and the Directory entries) to the new directory housing the website, and I also tried creating a new sites-available file pointing to my web directory and switching sites with the a2ensite and a2dissite commands. But it still serves the same default page.
I already fixed that:
Install Webmin.
From Webmin, go to Servers -> Apache Webserver.
Edit the default "virtual host" (a sketch of the result follows below):
handle all connections
add your doc-root
use the httpd.conf file ..
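For reference, the edited default virtual host should end up looking roughly like this (the document root path is an assumption; substitute your own):
<VirtualHost *:80>
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/mynewsite
    <Directory /var/www/mynewsite>
        Options Indexes FollowSymLinks
        AllowOverride All
        # On Apache 2.2, use "Order allow,deny" plus "Allow from all" instead of the next line
        Require all granted
    </Directory>
</VirtualHost>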
Once done, restart the server from Webmin or from the command line:
sudo /etc/init.d/apache2 restart