Apache Solr index on a remote server

I want to be able to run a Solr instance on my local computer while keeping the index directory on a remote server. Is this possible?
I've been trying to find a solution for days. Please help.
Update: We've got a business legal requirement under which we are not allowed to store client data on our servers ... we can only read, insert, delete, and update it at a client's request via our website, and the data has to be stored on the client's servers. So each client will have their own index, and we cannot run Solr or any other web application on the client's servers. Some of the clients have a Dropbox Business account, so we thought that just putting the Solr index files on Dropbox might work.

Enable remote streaming in solrconfig.xml and configure the remote file location there.
It's working.
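
For reference, a minimal sketch of what that toggle looks like; remote streaming is configured in the requestDispatcher section of solrconfig.xml (the upload limits here are illustrative):

    <requestDispatcher>
      <!-- allow Solr to pull documents from remote URLs via the stream.url request parameter -->
      <requestParsers enableRemoteStreaming="true"
                      multipartUploadLimitInKB="2048"
                      formdataUploadLimitInKB="2048"/>
    </requestDispatcher>

With this enabled, an update request can point at a remote file through the stream.url parameter (the host and path in such a request are whatever your setup uses).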

Related

How to get database off of localhost and running permanently?

Not sure if this is a stupid question, but I'm running a Neo4j database server (using Apollo Server) from my React application. Currently I run it with node in a separate terminal (and I can navigate to it on localhost), then run npm start in a different terminal to get my application going. How can I get the database up and running permanently, so that if customers use the product they can always access the database? Or, if this isn't good practice, how can I establish the database connection while I run my client code?
Technologies being used: ReactJS, Neo4j Database, GraphQL + urql
I tried moving the Apollo Server code into the App.tsx file of my application to run it from there directly when my app is launched, but this was giving me errors. I'm not sure this is the proper way to do it anyway, as I think it should be abstracted out of the client code.
If you want to run your server in the cloud so that customers can access your React application, you need two things:
A server/service to run your database, e.g. Neo4j AuraDB (Free/Pro) or one of the cloud marketplaces: https://neo4j.com/docs/operations-manual/current/cloud-deployments/
A service to run your React application, e.g. Netlify, Vercel, or one of the cloud providers (GCP, AWS, Azure), which you then configure with the server URL + credentials of your Neo4j server
You can run neo4j-admin dump --to database.dump on your local instance to create a copy of your database content and upload it to the cloud service. For 5.x the syntax is different; I think it is neo4j-admin database dump --path folder (see the sketch below).
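
A hedged sketch of both dump commands (the database name and paths are placeholders; check neo4j-admin --help for your exact version):

    # Neo4j 4.x: dump a database to a single file
    neo4j-admin dump --database=neo4j --to=/backups/database.dump

    # Neo4j 5.x: the subcommand moved, and the dump file is written into the given folder
    neo4j-admin database dump neo4j --to-path=/backups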

How can I connect Google Sheets to SQL Server via SSL using Apps Script?

How can you securely transfer data from SQL Server (intranet) to Google Apps Script?
I have made the connection to the SQL Server instance, but I'm afraid of someone picking up the raw data while it travels to Google's servers.
Any recommendation on how to accomplish this?
Basically, the Apps Script is querying our internal database successfully; we just need some kind of encryption. We have tried uploading .csv files to Google Drive and importing the data into Sheets, but somehow the data always gets corrupted.
There is no direct guide from Apps Script for this. Try the How to Encrypt your Gmail Messages with Google Docs guide from Amit Agarwal. See if you can also use the SJCL encryption library in your situation.
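
A minimal Apps Script sketch of the SJCL idea, assuming you have copied the sjcl.js source into a file in your script project (the passphrase below is a placeholder and should live somewhere safer, e.g. Script Properties):

    // Encrypt the payload before it leaves your side; decrypt it in the sheet-side script.
    // Requires sjcl.js pasted into the project so the global sjcl object exists.
    function encryptPayload(plaintext) {
      var passphrase = 'shared-secret-passphrase'; // placeholder
      // sjcl.encrypt returns a JSON string containing the ciphertext, IV and salt
      return sjcl.encrypt(passphrase, plaintext);
    }

    function decryptPayload(ciphertextJson) {
      var passphrase = 'shared-secret-passphrase'; // placeholder
      return sjcl.decrypt(passphrase, ciphertextJson);
    }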

Create MySQL database on server

I'm following a tutorial for building a PHP- and MySQL-driven ecommerce website, and I'm uploading the files to my server at the moment, but I need some assistance determining how to proceed.
The tutorial's README contains the following instructions:
INSTALLATION INSTRUCTIONS
1.) Unzip plaincart.zip to the root folder under your HTTP directory (or under your preferred directory)
2.) Create a database and database user on your web server for Plaincart
3.) Use the sql dump in plaincart.sql to generate the tables and example data
4.) Modify the database connection settings in library/config.php
5.) If you want to accept PayPal, modify the settings in include/paypal/paypal.inc.php. More information about this PayPal stuff can be found at http://www.phpwebcommerce.com/shop-checkout-process/
OK, so I obviously am capable enough to complete #1! :)
So, on to number 2: how do I create a database on my server?
I understand number 3, which refers to using the SQL dump file to create the tables and sample data once the database exists.
I can't tell about #4 and #5 yet, but we'll see when we get there.
So, I guess I just need to know how to construct a MySQL database on my web server.
Easiest way: install phpMyAdmin on the remote server and create the database and user from that web interface.
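
If you have shell access instead, a minimal sketch using the mysql client (the database name, user name, and password are placeholders):

    -- run as a MySQL admin user, e.g. via: mysql -u root -p
    CREATE DATABASE plaincart;
    CREATE USER 'plaincart_user'@'localhost' IDENTIFIED BY 'choose-a-strong-password';
    GRANT ALL PRIVILEGES ON plaincart.* TO 'plaincart_user'@'localhost';
    FLUSH PRIVILEGES;

Step 3 is then just loading the dump: mysql -u plaincart_user -p plaincart < plaincart.sql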

How to transfer databases and site contents

I own a website with 20 GB of data on it, and I've now decided to change hosting companies.
I'm moving to a Russian VPS, so is there a way to transfer the contents of my website to the Russian VPS without uploading them again?
Is there a service that does this?
I've heard there is a way to do this using shell access (but what is shell access and how does it work?).
Thanks in advance, guys.
You can log in to your old host using an SSH connection, connect from there to your new host, again using SSH, and then upload all files from the first server to the second. For databases, do a data dump on your first server and, through the SSH connection, run the dump against a database on your new server.
Depending on the hosts, how you connect via SSH will differ, but there should be instructions available from the providers. If you can't find the directions, just e-mail the providers' support and ask.
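
A hedged sketch of what that can look like, run from the new server (host names, users, paths, and database names are all placeholders):

    # copy the site files directly from the old host to the new one
    rsync -avz olduser@old-host.example.com:/var/www/site/ /var/www/site/

    # dump the database on the old host and load it into the new one in a single pipe
    ssh olduser@old-host.example.com "mysqldump -u dbuser -p'dbpass' sitedb" | mysql -u dbuser -p'dbpass' sitedb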
If you have access to the server itself, you can FTP into your old site from the new server and pull all the data across directly, without having to download it to a personal computer first.
If your current provider supports FTP, you can issue FTP commands from your new VPS to the current FTP site. If your data is in a database, back it up and transfer the backup.
You can't avoid 40 GB of transfer (20 out of the old site and 20 into the new one).
This is one of the reasons that makes Amazon S3 a good thing.

Uploading data into remote database

What is the most secure and easiest way to send approximately 1000 different records from a Windows application into a database that is not directly accessible (a MySQL database on a web provider's server)?
The data will be stored in different tables.
Edited:
The application will be distributed to users who have no idea what a database or PuTTY is... They just install my application, open it, enter some data, and press Submit.
Currently I'm using PHP to upload the generated script to the web server and process it there. I think I should also include some signature with the file to guard against "DROP ..." attacks.
If you can export the data as an SQL script, you can just run it against the remote server using your application of choice. 1000 records won't create that big a script.
In the current project at my job we have the same situation: a remote (faraway) database.
My solution: serialize the SQL query into XML and send it via HTTP to a web daemon running on the remote server, instead of exposing the SQL server. The daemon checks credentials and executes the query.
As I can't execute any external programs on the external server, I created the following solution:
My program creates a script file and calculates its salted hash.
The program sends this file, together with the user credentials and the hash, to a PHP page on the server.
The PHP page checks the username and password, then checks the hash, and then executes the script. Only INSERT and UPDATE commands are allowed.
Is this approach secure enough?
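
For what it's worth, a minimal PHP sketch of the server-side check described above (the salt value, the valid_credentials() helper, and the $pdo connection are placeholders/assumptions, not part of the original setup):

    <?php
    // Hypothetical endpoint receiving the script, the credentials and the salted hash.
    $salt = 'pre-shared-salt';                        // placeholder; shared with the client app

    $user   = $_POST['user']   ?? '';
    $pass   = $_POST['pass']   ?? '';
    $script = $_POST['script'] ?? '';
    $hash   = $_POST['hash']   ?? '';

    if (!valid_credentials($user, $pass)) {           // assumed helper
        http_response_code(403);
        exit('bad credentials');
    }

    // constant-time comparison of the salted hash
    if (!hash_equals(hash('sha256', $salt . $script), $hash)) {
        http_response_code(400);
        exit('bad signature');
    }

    // execute statement by statement, allowing only INSERT and UPDATE
    foreach (array_filter(array_map('trim', explode(";\n", $script))) as $stmt) {
        if (!preg_match('/^(INSERT|UPDATE)\b/i', $stmt)) {
            http_response_code(400);
            exit('statement not allowed');
        }
        $pdo->exec($stmt);                            // $pdo: assumed, pre-configured PDO connection
    }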
