Solr SQL for Single Solr Instance

I was excited to hear that Solr 6 had an SQL interface, but soon found that it only works with SolrCloud and not a single Solr instance. We currently have two Solr servers. One is a master production server and it is replicated to a slave reporting server. I would love to be able to use SQL on the slave.
So a couple questions.
Is it actually possible to use SQL on a single Solr instance, and did I just miss something?
If I need to use SolrCloud for SQL, how can I set that up while keeping an architecture similar to what I have now? That is, I only have two hosts: all production traffic, including writes, goes to one host, and all background reports run against the other host.
I welcome any other suggestions you might have.

This blog explains how the SQL support is integrated, and it looks as if you cannot do it with a standalone (non-SolrCloud) Solr instance.
https://sematext.com/blog/2016/04/18/solr-6-solrcloud-sql-support/
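For reference, the SQL handler is only registered when Solr runs in SolrCloud mode, but that mode can be a single node started with bin/solr start -c (using the embedded ZooKeeper), which keeps a two-host layout practical. A minimal sketch of hitting the /sql endpoint over HTTP; the collection name "reports" and the field names are placeholders:

    # Sketch: query Solr's /sql handler with Python + requests.
    # Requires SolrCloud mode, even if it is only a single node.
    # The collection "reports" and the fields below are assumptions.
    import json
    import requests

    SOLR_SQL_URL = "http://localhost:8983/solr/reports/sql"

    resp = requests.post(
        SOLR_SQL_URL,
        data={"stmt": "SELECT id, title_s FROM reports WHERE popularity_i > 10 LIMIT 5"},
    )
    resp.raise_for_status()

    # The handler returns a JSON result set whose last tuple is an EOF marker.
    for doc in resp.json()["result-set"]["docs"]:
        if "EOF" in doc:
            break
        print(json.dumps(doc))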

Related

Is there a simple solution to replicate data from SQL Server Azure to PostgreSQL Azure?

I need to regularly (but incrementally) sync (one way) the contents of a set of SQL Server Azure tables to a PostgreSQL Azure instance.
Here are some of the avenues I've considered:
Linked server from SQL Server. No go. Apparently Azure doesn't support linked servers.
Foreign Data Wrapper from PostgreSQL. No go. PostgreSQL on Azure only supports the postgres_fdw, not the needed tds_fdw.
Azure Data Factory. No go. The data copy process doesn't work incrementally, and the sink pipeline component doesn't support PostgreSQL.
Commercial replication solutions. Too expensive for a startup and most aren't hosted.
SymmetricDS or ReplicaDB. These might work, but they aren't hosted, so after all the time and effort of configuration and debugging we may or may not save time over building a custom solution.
Am I missing an obvious solution?
Congratulations, you solved your problem. It would be better if you could share more detail about your simple replication system with us.
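If the custom route wins out, the core of a one-way incremental sync is fairly small: track a watermark column and upsert the changed rows. A rough sketch, assuming a table named customers with an updated_at column (table, columns, and connection strings are all placeholders), using pyodbc and psycopg2:

    # Sketch of a one-way incremental sync job: pull rows changed since the
    # last run from SQL Server Azure and upsert them into PostgreSQL Azure.
    import pyodbc
    import psycopg2
    from psycopg2.extras import execute_values

    MSSQL_DSN = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=src.database.windows.net;"
                 "DATABASE=app;UID=user;PWD=secret")
    PG_DSN = "host=dst.postgres.database.azure.com dbname=app user=user password=secret"

    def sync_customers(last_watermark):
        src = pyodbc.connect(MSSQL_DSN)
        rows = src.cursor().execute(
            "SELECT id, name, updated_at FROM dbo.customers WHERE updated_at > ?",
            last_watermark,
        ).fetchall()

        if rows:
            with psycopg2.connect(PG_DSN) as dst, dst.cursor() as cur:
                execute_values(
                    cur,
                    """INSERT INTO customers (id, name, updated_at)
                       VALUES %s
                       ON CONFLICT (id) DO UPDATE
                       SET name = EXCLUDED.name, updated_at = EXCLUDED.updated_at""",
                    [tuple(r) for r in rows],
                )
        # Persist the new watermark somewhere durable before the next run.
        return max((r.updated_at for r in rows), default=last_watermark)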

Creating a Database Software needed

Hi, can someone explain to me what the following database software packages are:
HeidiSQL, MariaDB, and XAMPP?
Also, do they depend on each other to work? Hope you can help.
Thank you
MariaDB:
An open-source database server; basically it's a fork of MySQL, and the MariaDB team provides support and services. It is arguably slightly better than MySQL, since they have made some improvements over it. MariaDB doesn't depend on anything else; it's a complete database product.
HeidiSQL:
A front end for MySQL-compatible servers; it's nothing but a UI for interacting with different database servers such as MariaDB, Percona, and many more. HeidiSQL is just a front end: if you want to interact with a database, you must connect to one, so yes, HeidiSQL depends on a database server. The latest versions also support MS SQL Server, which is great :)
XAMPP:
It's a tech stack similar to LAMP, WAMP, etc. XAMPP stands for the Apache HTTP web server, MariaDB, PHP, and Perl, with the X standing for cross-platform. XAMPP depends on many components because, as you can see, it's a whole stack.
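To make the dependency concrete: MariaDB is the server, and HeidiSQL (or any client library) just connects to it. A tiny sketch of the same kind of connection HeidiSQL makes from its GUI, using PyMySQL (host, credentials, and database are placeholders):

    # Sketch: connect to a running MariaDB server, exactly as a GUI client would.
    import pymysql

    conn = pymysql.connect(host="localhost", user="root", password="secret", database="test")
    with conn.cursor() as cur:
        cur.execute("SELECT VERSION()")
        print(cur.fetchone())  # e.g. ('10.6.x-MariaDB',)
    conn.close()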

Azure availability set: IIS + SQL Server

Can someone please explain what I need to do in order to make my websites available all the time using Microsoft Azure?
At the moment I have just one dedicated server with IIS (running 7 websites) and SQL Server, all on one machine. Besides this, I use Redis Labs as a cloud service for hosting a Redis cache.
I'm more or less happy with how this works, but if something happens to the server, or I need to restart it, my websites of course go down, which is not good.
So, in order to mitigate some of these risks, what exactly do I need to do?
Am I correct in the following thinking?
Option 1 - I need one more machine in an availability set with a load balancer. This solution is not great, as only one server will still have an instance of SQL Server running: if that server goes down, the websites on the second server will not work because the database is down.
Option 2 - I need 3 more servers: 2 for IIS in a load-balanced environment and 2 for SQL Server, which is a super expensive solution.
Option 3 - 2 more servers, where the existing server and the new one will run IIS (load balanced), plus a 3rd server with the database. The database server will be write-only. Both IIS servers will have an instance of MS SQL running in read-only mode, and content from the database server will be replicated to their databases. In this scenario, if SQL Server goes down, the websites will still work, as they will pull data from their own read-only databases.
Are there any other options?
Thanks
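For what it's worth, Option 3 mostly reduces to routing at the connection level: reads go to the local read-only copy on each IIS box, writes go to the central server. A minimal sketch of that routing (server names are placeholders, and it says nothing about how the read-only copies are kept in sync):

    # Sketch: route read-only work to the local replica and writes to the primary.
    import pyodbc

    WRITE_DSN = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=db-primary;"
                 "DATABASE=app;Trusted_Connection=yes")
    READ_DSN = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
                "DATABASE=app;Trusted_Connection=yes")

    def get_connection(readonly):
        return pyodbc.connect(READ_DSN if readonly else WRITE_DSN)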
Regarding other options, have you considered moving the databases to Azure SQL, which would give you redundancy out of the box? Similarly, if you can move the websites to Azure App Service, you can get the same for the sites.
Yes, you definitely need the availability set for your deployment. Please take a look at the Azure availability checklist written by Microsoft.
I would propose migrating your web apps to Azure Web Apps and setting up the SQL Server deployment according to the availability best practices. Migrating them to Web Apps as a service will eliminate some administrative tasks and the problem of placing all of your eggs in the same basket. You can place them in one Web Apps pricing plan and change that plan when needed, for example from more powerful resources to less powerful ones (or from a paid plan to the free one for all of your sites).
If SQL Azure is not a solution for you, and (from my point of view) the data source is more critical than the frontend/.../, it is highly recommended to deploy SQL Server according to the tutorials provided above.

Convert SQL Server queries to Postgres on the fly

I have a scenario where I get queries on a webservice that need to be executed on a database.
The source for these queries is a physical device, so I can't really change the input to my queries.
I get the queries from the device in MSSQL (T-SQL) form. Earlier the backend was SQL Server, so things were pretty straightforward: queries would come in and get executed as-is on the DB.
Now we have migrated to Postgres, and we don't have the option to modify the input data (the SQL queries).
What I want to know is: is there any library that will do this SQL Server/T-SQL translation for me, so I can run the SQL Server queries through it and execute the resulting Postgres query on the database? I searched a lot but couldn't find much that would do this. (There are libraries that convert schemas from one to another, but what I need is to be able to translate SQL Server queries to Postgres on the fly.)
I understand there are quite a few nuances that differ between SQL Server and Postgres, so a translator will be needed in between. I am open to libraries in any language (preferably one that runs on Linux :) ), and any other suggestions on how to go about this would also be welcome.
Thanks!
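To give a sense of scale, even a very limited "on the fly" layer ends up being rule-based rewriting of the specific constructs the device emits. The sketch below is illustrative only; the rules and the sample query are assumptions, and it is nowhere near a general T-SQL translator:

    # Sketch: naive rule-based rewriting of a few common T-SQL idioms to Postgres.
    import re

    REWRITE_RULES = [
        # SELECT TOP n ...  ->  SELECT ... LIMIT n (only safe for simple statements)
        (re.compile(r"^SELECT\s+TOP\s+(\d+)\s+(.*)$", re.I | re.S), r"SELECT \2 LIMIT \1"),
        (re.compile(r"\bGETDATE\(\)", re.I), "now()"),
        (re.compile(r"\bISNULL\(", re.I), "COALESCE("),
        (re.compile(r"\[(\w+)\]"), r'"\1"'),  # [identifier] -> "identifier"
    ]

    def tsql_to_postgres(sql):
        for pattern, replacement in REWRITE_RULES:
            sql = pattern.sub(replacement, sql)
        return sql

    print(tsql_to_postgres("SELECT TOP 5 [Name], GETDATE() FROM [Devices]"))
    # -> SELECT "Name", now() FROM "Devices" LIMIT 5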
If I were in your position, I would look at upgrading your SQL Server to 2019 ASAP (as of today, you can find on Twitter that the officially supported, production-ready version is available on request). Then look at the PolyBase feature they (re)introduced in this version. In short, it allows you to connect your MSSQL instance to other data sources (like Postgres) and query the data as if it were in a "normal" SQL Server DB (via T-SQL); in the background your queries are transformed into native pgsql and the data is consumed from your real source.
There are not many resources on this feature (as of the 2019 version) yet, but it seems to be one of the most powerful features coming with this release.
This is what BOL is saying about it (unfortunately, it mostly covers the old 2016 version).
There is an excellent, yet very short, presentation by Bob Ward (Principal Architect @ Microsoft) that he did during SQL Bits 2019 on this topic.
The only thing I can think of that might be worth trying is SQL::Translator. It's a set of Perl modules that have been around for ages but seem to be still maintained. Whether it does what you want will depend on how detailed those queries are.
The no-brainer solution is to keep a SQL Server Express in place and introduce Triggers that call out to the Postgres database.
If this is too heavy, you can look at creating a Tabular Data Stream gateway (TDS is SQL Server's network transport) with limited functionality and map each possible incoming query, with any parameters, to a static Postgres query. This limits any testing to a finite, small number of cases.
This way, there is no SQL Server, and you have more control than with the trigger option.
If your terminals demand only a limited dialect, then this may be practical. Attempting a general translation is very likely to cost more than the devices are worth to replace (unless you have zillions already deployed).
There is an open implementation, FreeTDS, that you could use if you are happy with C or Java.
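To make the "map each incoming query to a static Postgres query" idea concrete, here is a minimal sketch of just the mapping core (the pattern, table, and use of psycopg2 are assumptions; a real gateway would also have to speak TDS to the devices):

    # Sketch: every query the devices can send is known up front, so incoming
    # MSSQL text is matched to a canned Postgres statement rather than translated.
    import re
    import psycopg2

    # (regex over the incoming T-SQL, equivalent Postgres SQL, parameter extractor)
    QUERY_MAP = [
        (
            re.compile(r"SELECT TOP (\d+) \* FROM readings WHERE device_id = '(\w+)'", re.I),
            "SELECT * FROM readings WHERE device_id = %s LIMIT %s",
            lambda m: (m.group(2), int(m.group(1))),
        ),
    ]

    def execute_incoming(mssql_text, pg_conn):
        for pattern, pg_sql, extract in QUERY_MAP:
            m = pattern.match(mssql_text.strip())
            if m:
                with pg_conn.cursor() as cur:
                    cur.execute(pg_sql, extract(m))
                    return cur.fetchall()
        raise ValueError("Unrecognized query: %s" % mssql_text)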

Querying a remote MSIDXS via T-SQL

I think this question better fits here rather than ServerFault, but if I'm wrong, please correct me.
I have a system which has a database that also queries Windows Indexing Services. The queries are done via T-SQL using the OpenQuery(Linked_Server_Name, ...) function.
When the DB and the Indexing Services are on the same server, everything works perfectly.
Now I need to scale my system up, which means I'll have to separate the DB server from the Indexing Services server. The problem is, I couldn't find a way to remotely query the Indexing Services.
Did anyone succeed with a similar setup?
If no, what alternatives would you suggest?
I had a similar problem in my company, and from what I could find by googling, remote indexing is impossible. But we found a solution. Now we have one server with the DB and another server with IIS and the attachments that are indexed. The solution was to share the attachments folder (or maybe the whole disk) so that the DB server could see it. Unfortunately, I have since changed positions in the company and no longer have permissions to connect to the servers to check the configuration, so I can't write down what exactly must be done and where.
It really does seem impossible to query MSIDXS remotely. I ended up writing a web service that wraps MSIDXS and is called remotely.
Performance isn't as good, though.
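For anyone attempting the same wrapper approach, the gist is a tiny HTTP endpoint on the Indexing Service box that runs the catalog query locally (via the MSIDXS OLE DB provider) and returns JSON; the remote SQL Server side then calls that endpoint instead of using OpenQuery against a linked server. A rough sketch using Flask and pywin32; the catalog name, columns, and route are assumptions, and the search term should be sanitized properly in real code:

    # Sketch: HTTP wrapper around the local Indexing Service catalog via ADO.
    from flask import Flask, jsonify, request
    import win32com.client

    app = Flask(__name__)
    CATALOG = "System"  # name of the local Indexing Service catalog

    @app.route("/search")
    def search():
        term = request.args.get("q", "").replace("'", "")
        conn = win32com.client.Dispatch("ADODB.Connection")
        conn.Open("Provider=MSIDXS;Data Source=%s" % CATALOG)
        rs = win32com.client.Dispatch("ADODB.Recordset")
        rs.Open("SELECT FileName, Path, Rank FROM SCOPE() "
                "WHERE CONTAINS('\"%s\"')" % term, conn)
        hits = []
        while not rs.EOF:
            hits.append({"file": rs.Fields("FileName").Value,
                         "path": rs.Fields("Path").Value})
            rs.MoveNext()
        rs.Close()
        conn.Close()
        return jsonify(hits)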
