Logging in admin UI does not show entries after changing settings - Solr

I am trying to debug a dataimport and changed the log level of dataimport from ERROR to ALL in the Solr 7.7.1 admin UI. This does not have any effect; furthermore, the setting goes back to its original value after restarting and reindexing.
How can I enable INFO logging for dataimport?

Any logging changes made in the web UI are temporary and can't be persisted:
You can control the amount of logging output in Solr by using the Admin Web interface. Select the LOGGING link. Note that this page only lets you change settings in the running system and is not saved for the next run.
The easiest way to change the logging level might be to change it when starting Solr, either through the SOLR_LOG_LEVEL environment variable, or through the -v parameter to bin/solr:
bin/solr start -f -v
This will start Solr with the DEBUG log level by default.
More detailed, permanent logging configuration is done through the standard Log4j2 syntax, which can be configured in server/resources/log4j2.xml.
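For example, to keep dataimport at INFO permanently, you can add a logger entry for the DataImportHandler package inside the existing <Loggers> section of that file. A minimal sketch (org.apache.solr.handler.dataimport is the package DIH logs under in Solr 7.x):

<!-- server/resources/log4j2.xml, inside <Loggers> -->
<Logger name="org.apache.solr.handler.dataimport" level="INFO"/>

Restart Solr afterwards for the change to take effect.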

Related

How to make config changes take effect in Solr 7.3

We are using solr.SynonymFilterFactory with synonyms.txt in Solr during querying. I realized that there was an error in synonyms.txt, corrected it, and uploaded the new file. I can see the modified synonyms.txt from the Admin UI, but it looks like the queries are still using the old synonyms.txt. I am executing test queries from the Admin UI with debugQuery=true and can see the old synonyms being applied. How can this be fixed? It is a production environment with 3 nodes using ZooKeeper for management.
You'll need to reload your core for the changes to take effect.
In a single-node Solr you can do that from the Admin page: go to Core Admin, select your core, and hit Reload. This will slow down some queries but it shouldn't drop queries or connections.
You can also reload the core via the API:
curl 'http://localhost:8983/solr/admin/cores?action=RELOAD&core=your-core'
I am not sure how this works on an environment with 3 nodes, though.
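Since the question mentions ZooKeeper, the 3-node environment is presumably SolrCloud; there the Collections API offers an equivalent RELOAD action, which reloads the collection on all nodes (the collection name below is a placeholder):
curl 'http://localhost:8983/solr/admin/collections?action=RELOAD&name=your-collection'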

Create Solr core in console with authentication

I have a Solr core that requires authentication. Let's say I have a user with the password password.
When I now try to create a Solr core in the console with
bin\solr create -c test
I get an HTTP Error 401.
Of course I need to specify my user. How can I add my user and password in the console?
Something like bin\solr create -c test -authentication user:password
Using Command Line scripts with BasicAuth
Add the following lines to the solr.in.sh or solr.in.cmd file (in /opt/solr/bin). This example tells the bin/solr command line to use "basic" as the type of authentication, and to pass credentials with the user-name "solr" and password "SolrRocks":
SOLR_AUTH_TYPE="basic"
SOLR_AUTHENTICATION_OPTS="-Dbasicauth=solr:SolrRocks"
See:
https://lucene.apache.org/solr/guide/6_6/basic-authentication-plugin.html#BasicAuthenticationPlugin-UsingBasicAuthwithSolrJ
If you use Basic Authentication, then first set in the CMD console:
set SOLR_AUTH_TYPE=basic
set SOLR_AUTHENTICATION_OPTS="-Dbasicauth=USER:PASS"
Customize USER:PASS for your needs.
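Putting it together, a hypothetical Windows session that creates the core from an authenticated console would look like:

set SOLR_AUTH_TYPE=basic
set SOLR_AUTHENTICATION_OPTS="-Dbasicauth=user:password"
bin\solr create -c test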
It has been a while, so I am not sure if you still have this issue, but this can help others who may come across it like I did.
I am new to Solr, so there could be a better way of doing this than what I have here.
Are you using ZooKeeper for authentication? Assuming you are:
First of all, I checked and I don't think there is an -authentication parameter for the solr create command.
So, this is how I hacked around the issue:
1. Stopped Solr.
2. Started Solr without ZooKeeper.
3. Created the test core the way you tried.
4. Uploaded the configuration of the test core to ZooKeeper. I followed this document - https://systemsarchitect.net/2013/04/06/painless-guide-to-solr-cloud-configuration/
5. Restarted the ZooKeeper server.
6. Started Solr with -p 8983 -z zookeeper_hostname:zookeeper_port
You should be able to access your test core with authentication enabled.
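In shell form, the steps above might look like the following sketch (host names, ports, and paths are placeholders; bin/solr zk upconfig is a newer alternative to the zkcli.sh upload described in the linked document):

bin/solr stop -all
bin/solr start
bin/solr create -c test
bin/solr zk upconfig -n test -d server/solr/test/conf -z zookeeper_hostname:zookeeper_port
bin/solr stop -all
bin/solr start -p 8983 -z zookeeper_hostname:zookeeper_port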

PostgreSQL settings

I want to turn the standard_conforming_strings off in postgresql.
I enter the command "set standard_conforming_strings=off;" and then check it with "show standard_conforming_strings;", and everything looks OK. But when I quit and open the client again, the setting has switched back to on.
How can I turn it off permanently?
Settings in PostgreSQL can be made at three levels:
As you have already discovered, you can change individual settings on a session basis using SET.
You can set parameters persistently on a per-user basis using the ALTER ROLE command. This can be done for a single database or for all databases in the server.
To change settings on a global basis, persistently for all users, you can edit postgresql.conf. Later versions of PostgreSQL also allow changing the global config via the ALTER SYSTEM command. In addition, global configuration overrides can be made by adding command-line options when starting the server.
Not all parameters can be changed per session or per user.
Changes to postgresql.conf will not take effect until you reload the server (pg_ctl reload, or via the appropriate system init command). Some parameters require a complete restart to take effect.
This manual page describes the different methods in more detail.
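For example, the persistent per-user and global variants look like this (myuser and mydb are hypothetical names):

ALTER ROLE myuser SET standard_conforming_strings = off;                    -- all databases
ALTER ROLE myuser IN DATABASE mydb SET standard_conforming_strings = off;  -- one database only
ALTER SYSTEM SET standard_conforming_strings = off;                        -- global, PostgreSQL 9.4+
SELECT pg_reload_conf();                                                   -- reload without a full restart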
In postgresql.conf you set
standard_conforming_strings=off
and load the new conf with
pg_ctl reload
as in docs: http://www.postgresql.org/docs/9.2/static/config-setting.html

Read log files on JBoss AS 7

I have an application running on JBoss AS 7 and creating log files in /standalone/log.
For security reasons I am not allowed to browse the JBoss directories.
Is there any built-in application to read these log files from a browser?
NB: I cannot use the admin console either.
No, nothing built in. You can have the admins configure the logging service to put the logs where you can get to them, or you can configure the logger to capture log records and post them to a database or some other store.
Not yet, but there are some requests for it (one by me, BTW ;-) and it might appear in WildFly 8. Hopefully. (Vote on them if you like.)
WFLY-1048 Allow hooking into logging subsystem through Management API
WFLY-1144 Provide the ability to view server logs through the web interface
WFLY-280 Provide an operation to retrieve the last 10 errors from the log
Until then, I suggest asking the admins to allow access to that one particular log file.
If that doesn't go through, you may declare a dependency of your deployment on the logging service's modules (Dependencies: ... in MANIFEST.MF) and the log manager in the JVM, unless there's some additional obstacle like a security manager.
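For illustration, a deployment manifest declaring such a dependency might look like this (the module name is an assumption; org.jboss.logmanager is the log manager module that ships with JBoss AS 7):

Manifest-Version: 1.0
Dependencies: org.jboss.logmanager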

Download Logs from AppEngine Including Python Log Statements

I know you can download the raw access logs with appcfg.py, but I'm really interested in all the information around a specific request, like Python logging statements, exceptions, and API statistics (just like the online log viewer). Does anyone know if there is a way to get that information other than having to build it yourself?
In case anyone is wondering, we want to do some continuous statistical analysis of problems and display them on a large screen on a wall in the office.
Sure - just pass the --severity flag to appcfg.py:
$ appcfg.py help request_logs
Usage: appcfg.py [options] request_logs <directory> <output_file>
Write request logs in Apache common log format.
The 'request_logs' command exports the request logs from your application
to a file. It will write Apache common log format records ordered
chronologically. If output file is '-' stdout will be written.
Options:
  -h, --help            Show the help message and exit.
  -q, --quiet           Print errors only.
  -v, --verbose         Print info level logs.
  --noisy               Print all logs.
  -s SERVER, --server=SERVER
                        The server to connect to.
  --insecure            Use HTTP when communicating with the server.
  -e EMAIL, --email=EMAIL
                        The username to use. Will prompt if omitted.
  -H HOST, --host=HOST  Overrides the Host header sent with all RPCs.
  --no_cookies          Do not save authentication cookies to local disk.
  --passin              Read the login password from stdin.
  -A APP_ID, --application=APP_ID
                        Override application from app.yaml file.
  -V VERSION, --version=VERSION
                        Override (major) version from app.yaml file.
  -n NUM_DAYS, --num_days=NUM_DAYS
                        Number of days worth of log data to get. The cut-off
                        point is midnight UTC. Use 0 to get all available
                        logs. Default is 1, unless --append is also given;
                        then the default is 0.
  -a, --append          Append to existing file.
  --severity=SEVERITY   Severity of app-level log messages to get. The range
                        is 0 (DEBUG) through 4 (CRITICAL). If omitted, only
                        request logs are returned.
  --vhost=VHOST         The virtual host of log messages to get. If omitted,
                        all log messages are returned.
  --include_vhost       Include virtual host in log messages.
  --end_date=END_DATE   End date (as YYYY-MM-DD) of period for log data.
                        Defaults to today.
This is what works for us really well:
appcfg.py --append --num_days=0 --include_all request_logs /path/to/your/app/ /var/log/gae/yourapp.log
Anyway, the line above will get all your log records and append them to a log file if you've run this before; if not, it will create a new log file. It actually looks at your existing log (if it's there) and will not fetch any duplicates. You can run this without --append if you want, but use it if you are automating log downloads.
The key here is the --include_all flag, which seems to be undocumented. This flag will get all the data that you see if you use GAE's web log viewer. So you will get fields such as: ms=71 cpu_ms=32 api_cpu_ms=12 cpm_usd=0.000921... etc.
OK, I hope that helps someone.
BTW, we wrote up a blog post on this, check it out here.
I seem to be running into the 100M limit with appcfg. I ended up using the logservice API to get the logs.
Here's the code - https://github.com/manasg/gae-log-fetcher
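For reference, a minimal sketch of what fetching logs through the logservice API looks like from inside a (Python 2) App Engine app; the log level and printed fields are arbitrary choices here:

from google.appengine.api.logservice import logservice

# Iterate over recent request logs, including app-level log lines.
for req in logservice.fetch(include_app_logs=True,
                            minimum_log_level=logservice.LOG_LEVEL_INFO):
    print req.combined            # the Apache-style request log line
    for line in req.app_logs:     # the app's own logging output
        print line.time, line.level, line.message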
Here is a way to access the raw logs so you can do further processing without custom parsing (also, for me request_logs was not downloading all the data for the specified time frame).
Here is an app which runs in the appengine itself:
https://gaelogapp.appspot.com/
You can easily add this functionality to your app by updating app.yaml and copying logs.py:
https://github.com/okigan/gaelogapp
