How to restore a PostgreSQL database for Odoo to Google Cloud?

$ curl -F 'master_pwd=superadmin_passwd' -F backup_file=https://storage.cloud.google.com/smart02-bucket-1/Smart.zip -F 'copy=true' -F 'name=Smart1' http://34.91.12.120/web/database/restore
After executing this command, this is the result:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>400 Bad Request</title>
<h1>Bad Request</h1>
<p>Session expired (invalid CSRF token)</p>
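A common cause of this failure is that curl's -F does not fetch URLs: backup_file=https://... sends the URL as a literal string instead of uploading the archive, while Odoo's /web/database/restore endpoint expects the backup as a multipart file upload. A minimal sketch, assuming gsutil access to the bucket and that superadmin_passwd is the actual master password:
# Fetch the backup locally first; the endpoint cannot pull from a URL
gsutil cp gs://smart02-bucket-1/Smart.zip /tmp/Smart.zip
# '@' makes curl attach the local file as a multipart upload
curl -F 'master_pwd=superadmin_passwd' \
     -F backup_file=@/tmp/Smart.zip \
     -F 'copy=true' \
     -F 'name=Smart1' \
     http://34.91.12.120/web/database/restore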

Related

Solarium returns Solr HTTP error: OK (404)

I use Solarium to access Solr with Symfony. It works without problems on my computer and the dev computer, but not on the prod server.
On the prod server, Solr is running with the same configuration, same port, same logins.
Do you have any idea what the problem could be?
Here is the error:
Solr HTTP error: OK (404)
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
<HTML><HEAD><TITLE>Not Found</TITLE>
<META HTTP-EQUIV="Content-Type" Content="text/html; charset=us-ascii"></HEAD>
<BODY><h2>Not Found</h2>
<hr><p>HTTP Error 404. The requested resource is not found.</p>
</BODY></HTML>
Problem solved: there was a wrong proxy configured on the Windows server.
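As a quick check for this kind of issue, querying Solr directly from the prod server while bypassing any configured proxy can confirm whether the proxy is the culprit (host, port, and core name below are placeholders):
curl --noproxy '*' "http://localhost:8983/solr/mycore/select?q=*:*&wt=json"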

Google App Engine 500 Internal Server Error on POST requests

I created a Google App Engine instance with code based on Vert.x and am trying to handle POST requests.
If I launch my application locally, everything works fine, but if I send a POST request to the deployed App Engine host, all my requests end with a 500 Internal Server Error.
Why? Maybe I forgot something?
For example:
curl -w "%{http_code}" -d "url=test" -X POST "http://localhost:80/hashposttest"
Use url: http://nifty-memory-w307.appspot.com/hashget/200okkk
curl -w "%{http_code}" -d "url=test" -X POST "https://nifty-memory-268407.appspot.com/hashposttest"
<html><head>
<meta http-equiv="content-type" content="text/html;charset=utf-8">
<title>500 Server Error</title>
</head>
<body text=#000000 bgcolor=#ffffff>
<h1>Error: Server Error</h1>
<h2>The server encountered an error and could not complete your request.<p>Please try again in 30 seconds.</h2>
<h2></h2>
</body></html>
500
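The generic 500 page hides the actual exception; the underlying Vert.x stack trace normally shows up in the App Engine logs. Assuming the gcloud CLI is authenticated against the project, something like:
gcloud app logs tail -s default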

What Solr configuration is required to fetch an HTML page and parse it?

I've been consulting one tutorial after another and have spent oodles of time searching.
I installed Solr from scratch and started it up.
bin/solr start
I successfully navigate to the Solr admin. Then I create a new core.
bin/solr create -c core_wiki -d basic_configs
I look at the help for the bin/post command.
bin/post -h
...
* Web crawl: bin/post -c gettingstarted http://lucene.apache.org/solr -recursive 1 -delay 1
...
So I try to make a similar call... but I keep getting a FileNotFound error.
bin/post -c core_wiki http://localhost:8983/solr/ -recursive 1 -delay 10
/usr/lib/jvm/java-7-openjdk-amd64/jre//bin/java -classpath /home/ubuntu/src/solr-5.4.0/dist/solr-core-5.4.0.jar -Dauto=yes -Drecursive=1 -Ddelay=10 -Dc=core_wiki -Ddata=web org.apache.solr.util.SimplePostTool http://localhost:8983/solr/
SimplePostTool version 5.0.0
Posting web pages to Solr url http://localhost:8983/solr/core_wiki/update/extract
Entering auto mode. Indexing pages with content-types corresponding to file endings xml,json,csv,pdf,doc,docx,ppt,pptx,xls,xlsx,odt,odp,ods,ott,otp,ots,rtf,htm,html,txt,log
Entering recursive mode, depth=1, delay=10s
Entering crawl at level 0 (1 links total, 1 new)
SimplePostTool: WARNING: Solr returned an error #404 (Not Found) for url: http://localhost:8983/solr/core_wiki/update/extract?literal.id=http%3A%2F%2Flocalhost%3A8983%2Fsolr&literal.url=http%3A%2F%2Flocalhost%3A8983%2Fsolr
SimplePostTool: WARNING: Response: <html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 404 Not Found</title>
</head>
<body><h2>HTTP ERROR 404</h2>
<p>Problem accessing /solr/core_wiki/update/extract. Reason:
<pre> Not Found</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>
</body>
</html>
SimplePostTool: WARNING: IOException while reading response: java.io.FileNotFoundException: http://localhost:8983/solr/core_wiki/update/extract?literal.id=http%3A%2F%2Flocalhost%3A8983%2Fsolr&literal.url=http%3A%2F%2Flocalhost%3A8983%2Fsolr
SimplePostTool: WARNING: An error occurred while posting http://localhost:8983/solr
0 web pages indexed.
COMMITting Solr index changes to http://localhost:8983/solr/core_wiki/update/extract...
SimplePostTool: WARNING: Solr returned an error #404 (Not Found) for url: http://localhost:8983/solr/core_wiki/update/extract?commit=true
SimplePostTool: WARNING: Response: <html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 404 Not Found</title>
</head>
<body><h2>HTTP ERROR 404</h2>
<p>Problem accessing /solr/core_wiki/update/extract. Reason:
<pre> Not Found</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>
</body>
</html>
Time spent: 0:00:00.041
I'm still fairly new to Solr indexing. Any hints that could point me in the right direction would be appreciated.
It seems that the request handler named /update/extract is missing from your configuration.
The ExtractingRequestHandler is not incorporated into the solr war
file; it is provided as a SolrPlugin, and you have to load it (and
its dependencies) explicitly. (Apache Solr Wiki)
It should be defined in solrconfig.xml, like:
<requestHandler name="/update/extract" class="org.apache.solr.handler.extraction.ExtractingRequestHandler" />
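Note that the handler class also has to be on the classpath: the basic_configs configset used to create the core above does not wire in Solr Cell, while the stock sample_techproducts_configs does, via lib directives near the top of solrconfig.xml (paths are relative to the core's instance dir and may need adjusting for your layout):
<lib dir="${solr.install.dir:../../../..}/contrib/extraction/lib" regex=".*\.jar" />
<lib dir="${solr.install.dir:../../../..}/dist/" regex="solr-cell-\d.*\.jar" />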

Indexing .tar.gz files in Solr 5.3.1: HTTP Error 405 POST not supported

Under a Solr 5.3.1 installation with /update working as expected, I tried to index a .tar.gz file with the update/extract request handler:
curl "http://localhost:8983/solr/#/myfirstcore/update/extract?literal.id=adocument&commit=true" -H 'Content-type:application/octet-stream' --data-binary "#encapsulate.tar.gz"
But I receive the following:
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 405 HTTP method POST is not supported by this URL</title>
</head>
<body><h2>HTTP ERROR 405</h2>
<p>Problem accessing /solr/admin.html. Reason:
<pre> HTTP method POST is not supported by this URL</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>
</body>
</html>
Under the admin panel, the update/extract specification is
/update/extract
class:org.apache.solr.handler.extraction.ExtractingRequestHandler
version:5.3.1
description:Add/Update Rich document
src:null
And Solr was generally installed according to these directions: Digital Ocean: Installing Solr 5.2.1 on Ubuntu 14.04.
Given the above error message, how can I configure Solr to index zipped files (including .tar.gz)? The use case is to associate content with taxonomy metadata stored in JSON format by zipping them together. This way Solr will index both documents and the associated taxonomy metadata together, and follow-on partial update commands are not needed.
Solution: change
curl "http://localhost:8983/solr/#/myfirstcore/update/extract?literal.id=adocument&commit=true" -H 'Content-type:application/octet-stream' --data-binary "#encapsulate.tar.gz"
to
curl "http://localhost:8983/solr/myfirstcore/update/extract?literal.id=adocument&commit=true" -H 'Content-type:application/octet-stream' --data-binary "#encapsulate.tar.gz"
And a query for id=adocument returns 1 hit. That it didn't pick up the fields is a separate issue.
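A likely reason the fields weren't picked up: with curl, --data-binary only reads a file when its argument starts with @, so "#encapsulate.tar.gz" is sent as a literal string and that string is what gets indexed under literal.id=adocument. A sketch of the corrected upload plus a verification query:
curl "http://localhost:8983/solr/myfirstcore/update/extract?literal.id=adocument&commit=true" -H 'Content-type:application/octet-stream' --data-binary @encapsulate.tar.gz
curl "http://localhost:8983/solr/myfirstcore/select?q=id:adocument"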

SolrCloud error while creating collection: no config file found

I am using an external ZooKeeper for testing on my local system. The steps I followed are as below.
Step 1: created 3 ZooKeeper servers, each with a data directory containing a myid file with a unique number (1, 2, 3 respectively).
Step 2: started all three ZooKeeper servers using the command
./zkServer.sh start
Step 3: checked the status of each server; 1 shows Mode: leader and the remaining 2 show Mode: follower.
Step 4: tried to run the SolrCloud example as
/opt/solr$ bin/solr start -e cloud -z localhost:2181,localhost:2182,localhost:2183
It asks me for the number of shards, replicas, etc. The system asks me for a collection name; I entered test, but it throws an exception like:
basic_configs, data_driven_schema_configs, or sample_techproducts_configs [data_driven_schema_configs]
Exception in thread "main" org.apache.solr.client.solrj.SolrServerException: Error loading config name for collection phrases
at org.apache.solr.util.SolrCLI.getJson(SolrCLI.java:537)
at org.apache.solr.util.SolrCLI.getJson(SolrCLI.java:471)
at org.apache.solr.util.SolrCLI$StatusTool.getCloudStatus(SolrCLI.java:721)
at org.apache.solr.util.SolrCLI$StatusTool.reportStatus(SolrCLI.java:704)
at org.apache.solr.util.SolrCLI.getZkHost(SolrCLI.java:1160)
at org.apache.solr.util.SolrCLI$CreateCollectionTool.runTool(SolrCLI.java:1210)
at org.apache.solr.util.SolrCLI.main(SolrCLI.java:215)
Enabling auto soft-commits with maxTime 3 secs using the Config API
POSTing request to Config API: http://localhost:8990/solr/sai/config
{"set-property":{"updateHandler.autoSoftCommit.maxTime":"3000"}}
Exception in thread "main" org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://localhost:8990/solr: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/>
<title>Error 404 Not Found</title>
</head>
<body><h2>HTTP ERROR 404</h2>
<p>Problem accessing /solr/sai/config. Reason:
<pre> Not Found</pre></p><hr><i><small>Powered by Jetty://</small></i><hr/>
</body>
</html>
at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:529)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:235)
at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:227)
at org.apache.solr.client.solrj.SolrClient.request(SolrClient.java:1220)
at org.apache.solr.util.SolrCLI.postJsonToSolr(SolrCLI.java:1882)
at org.apache.solr.util.SolrCLI$ConfigTool.runTool(SolrCLI.java:1856)
at org.apache.solr.util.SolrCLI.main(SolrCLI.java:215)
SolrCloud example running, please visit http://localhost:8990/solr
It shows the error Exception in thread "main" org.apache.solr.client.solrj.SolrServerException: Error loading config name for collection phrases, even though I am trying to create the collection test.
If you are trying to create a collection on SolrCloud with your custom configuration, you have to upload it to ZooKeeper first. Then you can create a collection using that config. You can also check which configurations currently exist on SolrCloud through the Solr admin UI (http://localhost:8983/solr/#/~cloud?view=tree).
Uploading the configuration to ZooKeeper:
Using the ZooKeeper client shipped with Solr (solr-6.5.1\server\scripts\cloud-scripts):
zkcli -zkhost <zookeeper host> -cmd upconfig -confname <configname> -solrhome <solr home directory> -confdir <config directory path>
ex: zkcli -zkhost localhost:2181 -cmd upconfig -confname sampleconfig -solrhome ../solr -confdir ../../solr/configsets/sampleconfig/conf
Now that your configuration is uploaded, you can create a collection on SolrCloud.
Start SolrCloud with the external ZooKeeper:
solr start -c -z localhost:2181
Create the collection:
http://localhost:8983/solr/admin/collections?action=CREATE&name=<collectionname>&numShards=1&replicationFactor=1&collection.configName=<configname>
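With the placeholder names from the upconfig step above filled in, a concrete call would look like:
curl "http://localhost:8983/solr/admin/collections?action=CREATE&name=test&numShards=1&replicationFactor=1&collection.configName=sampleconfig"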
-e cloud is an example provided by Solr, and it runs with an embedded ZooKeeper by default.
For an external ZooKeeper setup, refer to either of the links below:
http://amn-solr.blogspot.in/
SolrCloud 5 and Zookeeper config upload
