Join across databases with NHibernate

I am trying to join two tables that reside in two different databases. Every time I try to join, I get the following error:
An association from the table xxx refers to an unmapped class.
If the tables are in the same database, then everything works fine.

I am not sure about this exact error, but I have also had problems joining tables from different databases in a stored procedure. My workaround was to create a temp table in one of the databases, insert the data from the other database's table into it, and then join the temp table with the table in the same database.
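A rough sketch of that workaround in T-SQL (the database, table, and column names here are made up):

-- Pull the remote rows into a local temp table via a three-part name,
-- then do the join entirely within the local database.
SELECT id, customer_id
INTO   #remote_orders
FROM   OtherDb.dbo.Orders;

SELECT c.id, c.name, o.id AS order_id
FROM   dbo.Customers  AS c
JOIN   #remote_orders AS o
       ON o.customer_id = c.id;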

Related

SSIS - Looking Up Records from Different Databases

I have a source table in a Sybase database (ORDERS) and a source table in an MSSQL database (DOCUMENTS). I need to query the Sybase database and, for each row found in the ORDERS table, get the matching row(s) by order number from the DOCUMENTS table.
I originally wrote the SSIS package using a lookup transformation. Simple, except that the relationship can be one-to-many: one order number exists in the ORDERS table, but more than one document could exist in the DOCUMENTS table, and the SSIS lookup only matches the first row.
My second attempt will be to stage the rows from the ORDERS table into a staging table in MSSQL, then loop through the rows of this table using a FOR EACH LOOP container and get the matching rows from the DOCUMENTS table, inserting the DOCUMENTS rows into another staging table. After all the ORDERS rows have been processed, I will write a query to join the two staging tables to give me my result (a sketch of that final join is below). A concern with this method is that I will be opening and closing the DOCUMENTS database connection many times, which will not be very efficient (although there will probably be fewer than 200 records).
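The final join over the two staging tables would look roughly like this (staging table and column names are placeholders):

-- One ORDERS row can match many DOCUMENTS rows, so a plain
-- inner join returns every matching document per order.
SELECT o.order_number, d.document_id
FROM   dbo.stg_orders    AS o
JOIN   dbo.stg_documents AS d
       ON d.order_number = o.order_number;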
Or could you suggest any other way of doing this?

TSQL - Insert from Table1 in DB1 into Table1 in DB2

I have two DBs with identical structures and tables, and they both contain some identical and some different data.
My task is to copy all the records from DB1 into DB2, but I must not simply delete everything in DB2 first: I first have to check whether DB2 contains records that do not exist in DB1, and any such record must not be deleted. The problem is that the rows extracted from DB1 for the insert can have the same IDs as existing rows in DB2, so the INSERT command fails with a duplicate key error.
At the moment I am doing this by hand, and it is very time-consuming:
In Excel, I paste the result of a SELECT on a table from DB1 into one sheet, and the result of the same SELECT on DB2 into another sheet.
Then, using VLOOKUP, I find the rows that exist only in DB2. If I find any, I change their IDs (starting above the max ID of the DB1 table), then check the weak and relational tables connected to them and update their foreign keys to the strong table's new ID, renumbering the IDs of those weak and relational tables the same way. All of this with statements like "update table set id = 15000 where id = 101".
Finally, with the extra records moved out of the way, I delete the rest and execute the inserts taken from DB1, and everything works.
But doing this for every strong table, each of which has "n" weak and relational tables, is a massacre. For one or two tables it would be fine, but since this keeps happening to me I need something a little more automated.
Do you have any info to give me?
Thanks in advance
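Not a full recipe, but here is a sketch of how the core step could be automated in T-SQL, assuming both databases sit on the same server and a hypothetical strong table dbo.Customer(id, name). How you tell a genuinely different DB2 record from the same record with the same ID depends on a business key only you know (the name comparison below is just a stand-in), and updating IDs assumes id is not an IDENTITY column and the foreign keys either cascade on update or are updated in the same pass:

-- 1) Re-key DB2 rows whose id collides with DB1 but whose content differs
--    ('name' stands in for your real business key).
DECLARE @offset int = (SELECT MAX(id) FROM DB1.dbo.Customer);

UPDATE c
SET    c.id = c.id + @offset
FROM   DB2.dbo.Customer AS c
JOIN   DB1.dbo.Customer AS s ON s.id = c.id
WHERE  s.name <> c.name;
-- ...repeat the same +@offset update for each weak/relational table's foreign key...

-- 2) Delete the remaining DB2 rows that DB1 will re-supply.
DELETE d
FROM   DB2.dbo.Customer AS d
WHERE  EXISTS (SELECT 1 FROM DB1.dbo.Customer AS s WHERE s.id = d.id);

-- 3) Insert everything from DB1.
INSERT INTO DB2.dbo.Customer (id, name)
SELECT s.id, s.name
FROM   DB1.dbo.Customer AS s
WHERE  NOT EXISTS (SELECT 1 FROM DB2.dbo.Customer AS d WHERE d.id = s.id);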

Postgres: How can we preserve the data of a foreign table created using a foreign data wrapper

I am trying to migrate an Oracle database to Postgres by creating foreign tables through a foreign data wrapper.
But since a foreign table acts like a view onto Oracle, every query fetches the data on the fly from the original source, which increases the processing time.
For now, in order to keep physical data on the Postgres side, I am creating a table and inserting the data into it,
e.g.: create table employee_details as select * from emp_det;
where employee_details is a physical table and emp_det is a foreign table.
But this process feels redundant, and from time to time we have to maintain the table by hand (new insertions, updates, or deletions).
So could anyone share another way to keep a local copy of this data?
Regards,
See the identical Github issue.
oracle_fdw does not store the Oracle data on the PostgreSQL side.
Each access to a foreign table directly accesses the Oracle database.
If you want a copy of the data physically located in the PostgreSQL database, you can either do it like you described, or you could use a materialized view:
CREATE MATERIALIZED VIEW emp_det_mv AS SELECT * FROM emp_det;
That does the same thing, but more simply. To refresh the data, you can run
REFRESH MATERIALIZED VIEW emp_det_mv;
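If the materialized view must remain queryable while it refreshes, a unique index also lets you refresh it concurrently (empno here is an assumed unique column of emp_det):

-- CONCURRENTLY requires a unique index on the materialized view.
CREATE UNIQUE INDEX emp_det_mv_uq ON emp_det_mv (empno);
REFRESH MATERIALIZED VIEW CONCURRENTLY emp_det_mv;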

SQL - Normalizing an array of tables into multiple new tables

I have a database with 51 tables all with the same schema (one table per state). Each table has a couple million rows and about 50 columns.
I've normalized the columns into 6 other tables, and now I want to import all of the data from those 51 tables into the 6 new tables. The column names are all the same, and so I'm hoping I can automate the process of importing all the data.
I'm assuming what I'll need to do is:
Select the names of all the tables that use the raw schema:
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_SCHEMA = 'raw'
Iterate over all the results
Grab all rows from that table, and insert the appropriate columns into the appropriate new tables
Delete the rows from the raw table
Is there anything I'm missing? Also, is there any way to have this run on the SQL Server so I don't have to have my SQL Server Management Studio open the whole time?
Yes, you can certainly automate this with T-SQL, but I recommend using SSIS in this case. As you say, the structure of all the tables is the same, so you can build one ETL process and then just change the table name in the source. Consequently, you will have the following advantages:
Solve the issue with a couple of clicks
Low risk of errors
The ability to use any number of data transformations
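If you prefer the pure T-SQL route, here is a minimal sketch of the loop using a cursor plus dynamic SQL (the target table dbo.target1 and columns col1, col2 are placeholders for your normalized schema):

DECLARE @name sysname, @sql nvarchar(max);

DECLARE raw_tables CURSOR FOR
    SELECT TABLE_NAME
    FROM   INFORMATION_SCHEMA.TABLES
    WHERE  TABLE_SCHEMA = 'raw';

OPEN raw_tables;
FETCH NEXT FROM raw_tables INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Copy the columns into the normalized table, then empty the raw table.
    SET @sql = N'INSERT INTO dbo.target1 (col1, col2) '
             + N'SELECT col1, col2 FROM raw.' + QUOTENAME(@name) + N'; '
             + N'TRUNCATE TABLE raw.' + QUOTENAME(@name) + N';';
    EXEC sp_executesql @sql;
    FETCH NEXT FROM raw_tables INTO @name;
END;
CLOSE raw_tables;
DEALLOCATE raw_tables;

Wrapped in a stored procedure, this can be scheduled as a SQL Server Agent job, so Management Studio does not have to stay open while it runs.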

Renaming table results in different Query Plan

Probably more information is needed, but this is really odd. Using SQL Server 2005, I am executing an inner join on two tables. If I rename one of the tables (using ALTER TABLE), the resulting query plan is significantly longer. There are views on the table, but the inner join uses the base table, not any of the views. Does this make sense? Is this to be expected?
