Can I test Solr through a Dropbox shared folder?

I am new to using Solr and I am struggling with testing the search for the first time. I see in online tutorials that people have many different means of testing their cores. Can I test a core on a Dropbox shared folder? If so, how?
This is going to be a search engine for a website that will have articles, blog posts, and other references.

Dropbox is simple storage, not a place where you can execute anything.
You need an environment where you can run the Solr server, for example a Linux server.
You can put your Solr home and your Solr binary into a Dropbox folder and mount that folder on a server. Then you can run the Solr service on the machine where the Dropbox folder is mounted.

Related

How do I dynamically generate a sitemap with Google App Engine

My website changes every day: I run a news website with new stories daily. I want Google to index my site as often as possible, so I want/need to autogenerate the sitemap.
I use Google App Engine (with Node.js) to run my site. With GAE I do not have write access to the root directory, so to post the sitemap I would need to re-deploy my whole site after generating it. That is an unnecessarily complex step.
I have searched far and wide and cannot see how to save my sitemap. So I considered using a static one with a dynamically generated child stored in another location where I have write access, but Google says it wants all linked sitemaps in the same directory. So that appears to be a dead end.
Can I use "App Deploy" in such a way that only the sitemap is uploaded? Any other possibilities? Appreciate any and all suggestions. It seems unlikely that Google didn't provide some way to solve this.
For a site where new URLs are created regularly (a news site, blog, etc.), don't 'store' your sitemap. It should be generated on demand, i.e. your app should include code that generates the content whenever <your_website>/sitemap.xml is requested.
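For illustration, here is a minimal sketch of that idea in Node.js/TypeScript, assuming an Express app; getRecentArticles() is a hypothetical helper standing in for whatever query returns your story URLs:
import express from "express";
interface Article {
  url: string;   // absolute URL of the story
  updated: Date; // last modification time
}
// Hypothetical data access - replace with your own Datastore/DB query.
async function getRecentArticles(): Promise<Article[]> {
  return [{ url: "https://example.com/stories/1", updated: new Date() }];
}
const app = express();
// The sitemap is generated on demand instead of being stored on disk.
app.get("/sitemap.xml", async (_req, res) => {
  const articles = await getRecentArticles();
  const urls = articles
    .map(a => `<url><loc>${a.url}</loc><lastmod>${a.updated.toISOString()}</lastmod></url>`)
    .join("\n");
  res.type("application/xml").send(
    `<?xml version="1.0" encoding="UTF-8"?>\n<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>`
  );
});
app.listen(Number(process.env.PORT) || 8080);
This way the sitemap is always as fresh as your data, and nothing needs to be written to the root directory at deploy time.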
Separately, you should note that gcloud app deploy doesn't always deploy all your files. It usually deploys only files that have changed. You can easily confirm this by running the deploy command, changing a single file, and then running the deploy command again. You will see that the logs say something like Uploading 1 files to Google Cloud Storage and the deploy will be faster. You can change X files, deploy again, and the message will be updated to indicate it is only deploying those X files.
However, I'm not sure what it uses to compute the diff. Maybe it compares against the files currently in your staging bucket, and if the files in the staging bucket have been deleted (they have a default lifespan of 15 days) it will deploy all the files again (but as I said, I'm not sure of this).

CakePHP v2.x - More than one set of config files

I have "inherited" legacy CakePHP application which should be now deployed to multiply clients. I have search through the documentation for the version 2.x, but I was unable to find the following:
How can I easily store the different Config files for different clients? The reason is simple, each client will have different database details, different email setup ...
Should I have a default folder and for the other make a duplicate folder named Config_another_client_name folder and inside have a copy of those files? Does anyone have a better idea?

Looking for the best way to share files between two web applications running on Docker

We have two Symfony 4 applications (a web app and a REST API) running in two different Docker containers on the same "real" server (for now).
The files (pictures, documents, and the like) are stored in the web app's public directory for the moment, but we want to share these files with the REST API as well.
What is the best way to share them?
Store them on the Docker host? Build a new web server for these files and access them by SFTP?
Both apps need full access to the new file system: making directories, putting files, and deleting files and directories.
As I understand it, you're using one Docker container for the web app plus one container for the REST API. Storing files (images, documents, ...) in the web app's public directory is not a good solution.
If you decide to run more replicas (more containers) of the web app, you'll have trouble.
One solution is to have a dedicated service that stores all your application files and is used by all your app containers.
You can use MinIO Server for Docker to create this service and have your Symfony applications use that MinIO service from all your application instances. See this example: How to use AWS SDK for PHP with Minio Server
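The linked example is for PHP; purely to illustrate the pattern, here is a sketch of the same S3-compatible access from Node/TypeScript, where the endpoint, credentials, and bucket name are all assumptions for a local MinIO container:
import { readFile } from "node:fs/promises";
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
// Assumed MinIO service reachable by both containers; credentials are placeholders.
const s3 = new S3Client({
  endpoint: "http://minio:9000",
  region: "us-east-1",
  forcePathStyle: true, // MinIO needs path-style bucket addressing
  credentials: { accessKeyId: "minioadmin", secretAccessKey: "minioadmin" },
});
// Either container (web app or REST API) can write an object this way...
const body = await readFile("./report.pdf");
await s3.send(new PutObjectCommand({
  Bucket: "uploads", // assumed bucket, created beforehand
  Key: "documents/report.pdf",
  Body: body,
}));
...and read it back with GetObjectCommand, so neither app ever touches the other's filesystem.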
Consider using a distributed file system such as GlusterFS; that should fit your needs.
All containers can mount the shared volume and read from and write to it. It also works great on Swarm.

Creating a local environment from an existing GAE installation

I have a website that is currently running under GAE. Unfortunately, neither I nor anyone on the team has access to the local environment it was created from. Is it possible to create a local environment, or at least get a copy of the application files and database, from an existing GAE installation?
What you need is the application source code, not the "local environment".
Ideally this source code would be in a version control system (e.g. Git, SVN); Google Cloud Platform provides free Git repositories for your projects, so you might try looking there first. There's also a tool (appcfg's download_app action) for both Java and Python that allows you to download the source of a deployed version, provided you are authenticated as either the dev who uploaded it or a project owner. EDIT: as stated by Dan Cornilescu, this feature can be disabled.
As for the database info, there are plenty of tools available to "export" your GAE Datastore data; just consider that for your project it might be easier to run the queries manually than to actually set up those tools.
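For instance, if the data lives in Cloud Datastore, reading it out manually from Node/TypeScript takes only a few lines. A sketch, assuming credentials are already configured and that the project has entities of a hypothetical kind Article:
import { Datastore } from "@google-cloud/datastore";
// Assumes GOOGLE_APPLICATION_CREDENTIALS (or gcloud auth) is set up;
// the project ID and kind name are placeholders.
const datastore = new Datastore({ projectId: "your-project-id" });
const [articles] = await datastore.createQuery("Article").limit(100).run();
for (const article of articles) {
  console.log(article[datastore.KEY], article); // entity key plus its properties
}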
Thanks for the help, but unfortunately this code is not in Git. Furthermore,
being new to Google hosting, I wasn't clear on my setup: my web instance is actually running within Compute Engine, not App Engine. Be that as it may, with some additional searching I found out how to browse my filesystem via the VM Instances menu option under the Compute Engine section of the Google Cloud Platform interface. The VM Instances page shows your instance, and to the left side of the instance there is a connect option with a drop-down box that lets you open a browser window showing the instance's file system. In addition to this, I found this link https://www.youtube.com/watch?v=9ssfE6ODpak that shows how to configure the FileZilla FTP client to access your server instance - very helpful. From there, I was able to download all of my site files from the var/www directory. Now, on to extracting my data. Thanks again!

Solr Cloud Managed Resources

I am implementing Solr Cloud for the first time. I've worked with normal Solr and have that down pretty well, but I'm not finding a lot on what you can and can't do with Solr Cloud. So my question is about Managed Resources. I know you can CRUD stop words and synonyms using the new RESTful API in Solr. However, with the cloud, do I need to apply my changes to each individual Solr server in the cloud, or do I send them to a single URL that passes them through to each server? I'm new to the cloud setup and ZooKeeper. I have not found anything in the Solr wiki about working with managed resources in the cloud setup. Any advice would be helpful.
In SolrCloud, configuration and other files like stopwords are stored and maintained by ZooKeeper, which means you do not need to send updates to each server individually.
Once you have SolrCloud, before putting in any data, you will create a collection. Each collection has its own set of resources (its own config folder).
So, for example, if you have a collection called techproducts with two servers, localhost1 and localhost2, the commands below address the same resource from either server.
curl "http://localhost1:8983/solr/techproducts/schema/analysis/synonyms/english"
curl "http://localhost2:8983/solr/techproducts/schema/analysis/synonyms/english"
