GeoServer 2.15.1: PostGIS datastore not visible even though I created a PostGIS datastore previously

I am setting up a web app using GeoServer and PostgreSQL. I created a PostGIS datastore and configured all of my layers and layer groups. I didn't even shut down my computer, but when I started working with GeoServer again I couldn't reach the layer preview. When I checked, I saw that the option for adding a PostGIS datastore was no longer visible. Before that I had installed the CSS and Backup & Restore extensions. I don't know whether it is relevant, but my computer also shut down suddenly because of a power outage, although I was still able to reach the datastore afterwards. Additionally, I had renamed the datastore I created.
I tried reinstalling GeoServer and PostGIS, but that did not fix it.
Here is the error:
Caused by: java.io.IOException: Failed to find the datastore factory for kadikoygis_itrf, did you forget to install the store extension jar?
at org.geoserver.catalog.ResourcePool.getDataStore(ResourcePool.java:535)
at org.geoserver.catalog.ResourcePool.getCacheableFeatureType(ResourcePool.java:916)
at org.geoserver.catalog.ResourcePool.tryGetFeatureType(ResourcePool.java:901)
at org.geoserver.catalog.ResourcePool.getFeatureType(ResourcePool.java:893)
at org.geoserver.catalog.ResourcePool.getFeatureType(ResourcePool.java:878)
at org.geoserver.catalog.impl.FeatureTypeInfoImpl.getFeatureType(FeatureTypeInfoImpl.java:123)
at jdk.internal.reflect.GeneratedMethodAccessor275.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at org.geoserver.catalog.impl.ModificationProxy.invoke(ModificationProxy.java:127)
at com.sun.proxy.$Proxy36.getFeatureType(Unknown Source)
at org.geoserver.wms.map.GetMapKvpRequestReader.checkStyle(GetMapKvpRequestReader.java:1215)
... 102 more

After a complete removal and reinstallation, the problem was solved.
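The "Failed to find the datastore factory" message generally means the PostGIS datastore jars are missing from GeoServer's classpath, which is consistent with reinstallation fixing it. A minimal shell sketch of the check; the install path and jar name used here are assumptions (a scratch directory stands in for a real deployment), so adjust them to your own setup:

```shell
# Stand-in for a real GeoServer install (path and jar name are hypothetical)
GEOSERVER_LIB=$(mktemp -d)/webapps/geoserver/WEB-INF/lib
mkdir -p "$GEOSERVER_LIB"
touch "$GEOSERVER_LIB/gt-jdbc-postgis-21.1.jar"

# The PostGIS datastore factory ships in the gt-jdbc-postgis jar; if this
# grep matches nothing, GeoServer cannot load any PostGIS store and raises
# the IOException shown above.
if ls "$GEOSERVER_LIB" | grep -qi postgis; then
  echo "postgis datastore jar present"
else
  echo "postgis datastore jar missing - reinstall or re-add the extension"
fi
```

On a real server you would point `GEOSERVER_LIB` at the actual `webapps/geoserver/WEB-INF/lib` directory instead of creating a scratch one.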

Related

IIS Shared Configuration Webfarm - Error when dynamically updating bindings

First time posting a question so apologies for anything I'm doing wrong.
I have a web farm of 4 IIS servers running Windows Server 2016, which uses an Azure file storage account for its web files. It also saves its shared configuration files to the same Azure file storage account. The web farm sits behind an Azure load balancer.
Everything works fine until part of the website code adds an IIS binding. This then causes all the servers to display the error below:
Could not load file or assembly 'EntityFramework,...' The parameter is incorrect. (Image attached for full error.)
The only way to resolve this error is to clear the ASP.NET temporary files from the C: drive of all the servers and run IISRESET on each box.
Any ideas?
So this was a mystery, but the following changes resolved the issue. I'm not sure which combination fixed it, but this might help someone with a similar problem.
Recreated the website in IIS using a new app pool.
Removed the individual IIS bindings and replaced them with a wildcard (we had a really old-school system before with hundreds of bindings; maybe one of them was corrupted).
Thanks for your help!

Solr documents lost on server restart

Background:
I have a Bitnami Solr image installed on Google Compute Engine
I have a custom core with a customized schema
I had updated the core with approximately 100 documents
Everything was running fine for about 3 weeks. I then decided to restart the server as a part of routine maintenance.
When I restarted, all documents in the core had disappeared. The core is empty. The core configuration is there, the schema configuration is there but the documents are gone.
I also checked the file storage area under solr/mycore/data/index and there isn't much there.
I am a Solr newbie and my usage of it is fairly simple but I am concerned that I may be doing something wrong.
Can someone please advise what could be the error?
Update:
I observed that reloading a core causes all documents in the core to be lost, so I think I may be doing something incorrect in terms of persisting documents.
Update 2:
After further reading, I figured out that the autoCommit parameter in my solrconfig.xml may not be set right, so I tried fiddling with it. I set maxTime to 1000 milliseconds and changed openSearcher to true.
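For context: in Solr, hard commits (autoCommit) control durability (flushing the index so documents survive a restart), while soft commits control visibility to searches, and opening a searcher on a very frequent hard commit is expensive. A sketch of the relevant solrconfig.xml section, assuming a Solr version that supports autoSoftCommit (4.x and later); the values are illustrative, not a recommendation for every workload:

```xml
<updateHandler class="solr.DirectUpdateHandler2">
  <!-- hard commit: flushes to stable storage so documents survive a restart -->
  <autoCommit>
    <maxTime>15000</maxTime>
    <openSearcher>false</openSearcher>
  </autoCommit>
  <!-- soft commit: makes newly added documents visible to searches quickly -->
  <autoSoftCommit>
    <maxTime>1000</maxTime>
  </autoSoftCommit>
</updateHandler>
```

Note that no commit setting will help if Solr cannot write the index files at all, which is what the "Permission denied" error below indicates.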
After doing the above, I tried adding a bunch of documents via the admin console and got the error below. I'm stumped now!
auto commit error...:java.io.FileNotFoundException: /opt/bitnami/apache-solr/solr/mycore/data/index/_0.fnm (Permission denied)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(Unknown Source)
at java.io.FileOutputStream.<init>(Unknown Source)
at org.apache.lucene.store.FSDirectory$FSIndexOutput.<init>(FSDirectory.java:389)
at org.apache.lucene.store.FSDirectory.createOutput(FSDirectory.java:282)
at org.apache.lucene.store.NRTCachingDirectory.unCache(NRTCachingDirectory.java:247)
at org.apache.lucene.store.NRTCachingDirectory.sync(NRTCachingDirectory.java:182)
at org.apache.lucene.index.IndexWriter.startCommit(IndexWriter.java:4528)
at org.apache.lucene.index.IndexWriter.prepareCommitInternal(IndexWriter.java:3001)
at org.apache.lucene.index.IndexWriter.commitInternal(IndexWriter.java:3104)
at org.apache.lucene.index.IndexWriter.commit(IndexWriter.java:3071)
at org.apache.solr.update.DirectUpdateHandler2.commit(DirectUpdateHandler2.java:582)
at org.apache.solr.update.CommitTracker.run(CommitTracker.java:216)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(Unknown Source)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
I just had a similar issue. I'm using SolrCloud; make sure zookeeper/conf/zoo.cfg has dataDir set to something outside of temp/ (which is used in many of the examples). Temp directories are deleted on restart by many Linux distributions.
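For reference, a zoo.cfg along these lines keeps ZooKeeper's state out of any temp directory; the dataDir value here is just an assumed example path:

```
# zookeeper/conf/zoo.cfg (sketch)
tickTime=2000
clientPort=2181
# keep this outside any temp directory the OS clears on reboot
dataDir=/var/lib/zookeeper
```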
Well, it seems you don't have write permissions on the disk. You should check whether the OS user running your Solr instance is allowed to write to it. Note that I don't know anything about GCE; just check whether the administration console provided by Google has options for managing permissions on the file system.
Another option would be to move your indexes somewhere else on the file system where you do have write permissions.
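A minimal shell sketch of the permission check and fix. It runs against a scratch directory so it works unprivileged; on the real box the path would be the /opt/bitnami/apache-solr/solr/mycore/data/index directory from the error, and you would chown to whatever user runs the Solr process (the `solr:solr` owner below is an assumption):

```shell
# Scratch stand-in for the real index directory shown in the stacktrace
INDEX_DIR=$(mktemp -d)/mycore/data/index
mkdir -p "$INDEX_DIR"

# Give the process user read/write on the index tree. On the real server this
# would be something like:
#   sudo chown -R solr:solr /opt/bitnami/apache-solr/solr/mycore/data
chmod -R u+rwX "$INDEX_DIR"

# Verify the process can create segment files where Lucene needs them;
# failing here is exactly the "Permission denied" from the stacktrace above.
touch "$INDEX_DIR/_0.fnm" && echo "index directory writable"
```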
Make sure you don't have two vhosts in Catalina using the same Solr home. I've found that this wipes the index on service stop.

IllegalAccessException on protected class member while parsing Excel 2007 file using Apache POI library on AppEngine

I am trying to parse an Excel 2007 (.xlsx) file using the Apache POI library on Google App Engine, but while doing so I get the exception below.
java.lang.IllegalAccessException: Class com.google.appengine.tools.development.agent.runtime.Runtime$21 can not access a member of class org.apache.poi.xssf.usermodel.XSSFSheet with modifiers "protected"
So I checked with the Apache POI team, but they claim it's an App Engine issue. I am not sure what the right place for App Engine questions is, but I know a lot of App Engine developers monitor Stack Overflow, so I am posting the question here.
Bug filed with the Apache POI team: https://issues.apache.org/bugzilla/show_bug.cgi?id=55665
The bug has a sample Maven project and instructions to reproduce it.
I am not sure how to attach the zip file here.
If anyone knows how to fix this, or the right place to file a bug, please let me know.
The key part of the stacktrace is:
java.lang.IllegalAccessException: Class com.google.appengine.tools.development.agent.runtime.Runtime$21 can not access a member of class org.apache.poi.xssf.usermodel.XSSFSheet with modifiers "protected"
at sun.reflect.Reflection.ensureMemberAccess(Reflection.java:105)
at com.google.appengine.tools.development.agent.runtime.Runtime$22.run(Runtime.java:488)
at java.security.AccessController.doPrivileged(Native Method)
at com.google.appengine.tools.development.agent.runtime.Runtime.checkAccess(Runtime.java:485)
at com.google.appengine.tools.development.agent.runtime.Runtime.checkAccess(Runtime.java:479)
at com.google.appengine.tools.development.agent.runtime.Runtime.newInstance_(Runtime.java:123)
at com.google.appengine.tools.development.agent.runtime.Runtime.newInstance(Runtime.java:135)
at org.apache.poi.xssf.usermodel.XSSFFactory.createDocumentPart(XSSFFactory.java:60)
I've run into the same issue. I think this is only a problem with the development server. Admittedly, this doesn't fully answer your question, but the situation at least isn't as bad as you'd think. To get around the issue, I've been developing my POI code in a standard Java project (using dummy data) and then copying it into the App Engine project.
I've logged the issue with Google: https://code.google.com/p/googleappengine/issues/detail?id=11752
If you're interested, in the process of logging the issue I created a sample project, which is also available on App Engine (where it works, as it's running in the production environment).
Sample project: https://bitbucket.org/bronze/jakarta-poi-issue
App running on production environment: http://bronze-gae-poi-issue.appspot.com/

Embedded Solr on Amazon AWS

I have developed a web application that uses an embedded Solr server for indexing. I deployed it on Tomcat 6 on Windows XP and everything was OK. Next, I tried to deploy the web application on Amazon AWS; my platform there is Linux + MySQL. When I deployed, I got the exception below, related to the embedded Solr.
[ WARN] 19:50:55 SolrCore - [] Solr index directory 'solrhome/./data/index' doesn't exist. Creating new index...
[ERROR] 19:50:55 CoreContainer - java.lang.RuntimeException: java.io.IOException: Cannot create directory: /usr/share/tomcat6/solrhome/./data/index
at org.apache.solr.core.SolrCore.initIndex(SolrCore.java:403)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:552)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:480)
How can I fix this problem? I am a novice with Linux.
My guess is that the user you are running Solr under does not have permission to access that directory.
Also, which version of Solr are you using? It looks like 3+. The latest version is 4, so it may make sense to use that from the start. It's probably a bit more troubleshooting up front, but a much better payoff than starting with a legacy configuration.
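One more thing worth checking: the error shows a relative Solr home ('solrhome/./data/index'), which resolves against Tomcat's working directory and can land somewhere the service user cannot write. A sketch of pinning it to an absolute, writable path via the standard solr.solr.home system property; the directory itself is an assumed example:

```shell
# Append the Solr home property to Tomcat's JVM options (path is illustrative);
# in practice this line would go in a setenv.sh or Tomcat service config
SOLR_HOME_DIR=/var/solr-home
CATALINA_OPTS="$CATALINA_OPTS -Dsolr.solr.home=$SOLR_HOME_DIR"
export CATALINA_OPTS
echo "$CATALINA_OPTS"
```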
I found the solution. It was a permissions issue on Amazon Linux with the ec2-user, so I changed the permissions as follows.
sudo chmod -R ugo+rw /usr/share/tomcat6
http://wiki.apache.org/solr/SolrOnAmazonEC2
It should allow access to ports 22 and 8983 for the IP you're working from, with routing prefix /32 (e.g., 4.2.2.1/32). This will limit access to your current machine. If you want wider access to the instance, to collaborate with others, you can specify that, but make sure you only allow as much access as needed: a Solr instance should not be exposed to general Internet traffic. If you need help figuring out what your IP is, you can always use whatismyip.com. Please note that production security on AWS is a wide-ranging topic and is beyond the scope of this tutorial.

Installation Error in DotNetNuke

I am new to DotNetNuke. I am trying to install the DotNetNuke_Community_05.06.02_Source.zip file. First I extracted it to C:\DotNetNuke. A release.config file was created in C:\DotNetNuke\WebSite\; I renamed it to web.config. There is another web.config file in C:\DotNetNuke\DotNetNuke_Community_05.06.02_Source\Modules\RazorHost\, which I renamed to web1.config. I configured the site in IIS 6.0, created a database in SQL Server 2005 named DotNetNuke, and changed the connection string as directed by the installation guide. When I try to open the installation wizard through the browser, it shows this error:
Server Error in '/' Application.
Configuration Error
Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately.
Parser Error Message: It is an error to use a section registered as allowDefinition='MachineToApplication' beyond application level. This error can be caused by a virtual directory not being configured as an application in IIS.
Source Error:
Line 58: validationKey="F9D1A2D3E1D3E2F7B3D9F90FF3965ABDAC304902"
Line 60: decryptionKey="F9D1A2D3E1D3E2F7B3D9F90FF3965ABDAC304902F8D923AC"
In order to use the source package you must compile the code in release mode before attempting to access the website.
Since you're new to DNN, I recommend starting with the install package, which does not need to be compiled. Even easier, you can get DNN through the Microsoft Web Platform Installer; WPI will also take care of all the dependencies that may need to be configured on your computer.
I'd like to clarify the intention behind downloading the source package. Were you intending to start developing and changing the DotNetNuke framework to suit your needs, or did you want to set up a website on IIS that you could build modules against?
If you're intending to develop modules, I'd suggest downloading the install package and creating an IIS site under Default Web Site.
- There's no need to change connection strings if you're using the database file that's in the App_Data folder.
- All you need to do is set the Folder Permissions for Network Service or IIS_IUSRS based on what application pool you're running.
- The url to the site will be http://localhost/xxx.
There are two ways to build modules in DNN: the Website Project and the Web Application project. They produce source code and DLLs respectively. If the site is internal, either way is fine; if it's external-facing, you might want a little more speed, so go with the Web Application.
However, if you're intending to dig into the DNN core framework and adjust things, the source package is the way to go. There's no need to adjust the settings; just open the website project in Visual Studio and away you go.