Loading a SQL Server table dump into a PostgreSQL DB? - sql-server

Right now, I have been provided with a database dump of a SQL Server table, and I need to load it into a PostgreSQL database. Are there any good tools to convert the file and then load it? I have already tried pgloader; however, it returned some really bad encoding errors.
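If the dump can be re-exported from SQL Server as CSV (for example with bcp or the SSMS export wizard), PostgreSQL's COPY command can load it without any third-party converter. A minimal sketch, assuming a pre-created target table and a placeholder file path:

    -- Target table my_table must already exist with columns matching the CSV layout.
    -- Run as a superuser (or use psql's \copy for a client-side file).
    COPY my_table FROM '/tmp/mytable_dump.csv' WITH (FORMAT csv, HEADER true);

Encoding mismatches, a common cause of errors with SQL Server exports, can often be worked around by converting the file to UTF-8 first or by adding an ENCODING option to the COPY statement.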

Related

SQL Server (not MySQL, which uses INTO) - Save to .csv using a query, not the manual export-to function

I want to be able to run a query every day at the same time and have it automatically save to a location with a specific file name.
Is this possible? Without SSIS, cmdlets, or PowerShell; I want to do it all within the query I'm running.
I understand INTO OUTFILE is MySQL syntax; is there an alternative for SQL Server?
I'm assuming you're saying you don't want to use any external tools, including tools that ship with SQL Server itself. You want something you can run as an SQL query to create the file and without creating an SSIS package (even a package from the Import/Export Wizard).
No. There are no T-SQL statements supported by MS SQL Server which themselves cause the database engine to write the output directly to a file.
There is no native MS SQL Server equivalent to MySQL's INTO <outfile>, PostgreSQL's COPY (<query>) TO <outfile>, or Oracle's spool command. As far as MS SQL Server is concerned, writing data to an external file is entirely the responsibility of whatever database client you're using.
The closest you can get would be a CLR function or stored procedure similar to the apparently abandoned https://archive.codeplex.com/?p=sqlclrexport#
I suppose you could also use xp_cmdshell with bcp.exe, but that's kind of cheating around your requirements.
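For completeness, a sketch of that xp_cmdshell/bcp route; the server name, database, columns, and output path are placeholders, and xp_cmdshell is disabled by default, so a sysadmin has to enable it first:

    -- Enable xp_cmdshell (requires sysadmin; consider the security implications).
    EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
    EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;

    -- Export a query result to CSV via bcp: -c = character mode,
    -- -t, = comma field terminator, -T = trusted (Windows) connection.
    EXEC xp_cmdshell 'bcp "SELECT Col1, Col2 FROM MyDb.dbo.MyTable" queryout "C:\Exports\MyTable.csv" -c -t, -S MYSERVER -T';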

How to transfer a log file into a Microsoft SQL Server database table

I am trying to transfer a log file, specifically from Mirth, into a SQL Server database table (MirthTable_L). I am rather new to SQL Server. I have tried BULK INSERT and it doesn't seem to go through. I believe I am inserting it correctly; I was told I may need a file reader that loops and reads row by row. Any beginner examples of how to do this would help immensely! Thanks.
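Without the actual error message this can only be a hedged sketch, but a plain BULK INSERT along these lines normally works when each log line maps to one row; the file path and terminators are placeholders, and it assumes MirthTable_L matches the file layout:

    -- Adjust FIELDTERMINATOR/ROWTERMINATOR to match the actual log format;
    -- every line in the file must line up with the columns of MirthTable_L.
    BULK INSERT dbo.MirthTable_L
    FROM 'C:\Mirth\logs\mirth.log'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

If individual lines need custom parsing, that is usually the point where a row-by-row reader (SSIS or a small client program) becomes necessary.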

How to compare SQL Server data with MS Access data

Working on a data accuracy project. I have to find a way to compare data from a query against a SQL Server db with data from a query against an MS Access db. The data in both dbs should be identical, but sometimes there are errors. I have looked at data comparison tools, but these only seem to be able to compare data between databases from the same vendor.
Is there a process that someone has used in the past to do this or an idea on how I might best approach this?
You can look at both data sets in Access, SQL Server, or Excel:
If the data set is small enough, I recommend Excel.
If you know SQL, you can export your Access data to text files, then do a Bulk Insert and get everything into SQL Server.
If you want to look at both data sets in Access, try this:
Go to your ODBC Data Source Administrator (searching for 'ODBC' from your Start menu should be sufficient)
Create a new System DSN connecting to your SQL Server db
Open your Access db (I'm using 2010, your version may be different)
Go to External Data->ODBC Database->Machine Data Source
Link to your tables of choice from your SQL Server
Query away!
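Once the SQL Server tables are linked, a mismatch check is just an ordinary Access query. A sketch with hypothetical table and column names (linked SQL Server tables typically appear with a dbo_ prefix):

    SELECT a.ID, a.Amount AS AccessAmount, s.Amount AS SqlServerAmount
    FROM LocalAccessTable AS a
    INNER JOIN dbo_SqlServerTable AS s ON a.ID = s.ID
    WHERE a.Amount <> s.Amount;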

Move data from SQL Server to MS Access mdb

I need to transfer certain information out of our SQL Server database into an MS Access database. I've already got the Access table structure set up. I'm looking for a pure SQL solution; something I could run straight from SSMS without having to code anything in C# or VB.
I know this is possible if I set up an ODBC data source first. I'm wondering if it's possible to do this without the ODBC data source?
If you want a 'pure' SQL solution, my proposal would be to connect from your SQL Server to your Access database making use of OPENDATASOURCE.
You can then write your INSERT instructions using T-SQL. It will look like:
INSERT INTO OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0','Data Source=myDatabaseName.mdb')...[myTableName] (insert instructions here)
The complexity of your INSERTs will depend on the differences between the SQL Server and Access databases. If tables and fields have the same names, it will be very easy. If the models are different, you might have to build specific queries in order to 'shape' your data before being able to insert it into your MS Access tables and fields. But even if it gets complex, it can be handled with 'pure SQL'.
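A slightly fuller sketch of such an INSERT; the .mdb path and all table and column names are placeholders, and 'Ad Hoc Distributed Queries' must be enabled on the instance:

    INSERT INTO OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
        'Data Source=C:\Data\myDatabaseName.mdb')...[myTableName] (Col1, Col2)
    SELECT Col1, Col2
    FROM dbo.MySourceTable;

Note that the Jet 4.0 provider only exists in 32-bit form; on a 64-bit SQL Server instance the Microsoft.ACE.OLEDB.12.0 provider would be used instead.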
Consider setting up your Access db as a linked server in SQL Server. I found instructions and posted them in an answer to another SO question. I haven't tried them myself, so don't know what challenges you may encounter.
But if you can link the Access db, I think you may then be able to execute an insert statement from within SQL Server to add your selected SQL Server data to the Access table.
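A sketch of what that linked-server setup might look like; the file path and names are assumptions (and on 64-bit instances Microsoft.ACE.OLEDB.12.0 replaces the Jet provider):

    EXEC sp_addlinkedserver
        @server = N'ACCESSDB',
        @srvproduct = N'Access',
        @provider = N'Microsoft.Jet.OLEDB.4.0',
        @datasrc = N'C:\Data\myDatabaseName.mdb';

    -- Once linked, the transfer is an ordinary INSERT using the linked server name:
    INSERT INTO ACCESSDB...[myTableName] (Col1, Col2)
    SELECT Col1, Col2 FROM dbo.MySourceTable;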
Here's a nice solution for your question:
http://www.codeproject.com/Articles/13128/Exporting-Data-from-SQL-to-Access-in-Mdb-File

Best way to migrate export/import from SQL Server to Oracle

I'm faced with needing reporting access to some data that lives in Oracle and other data that lives in a SQL Server 2000 database. For various reasons these live on different sides of a firewall. Now we're looking at doing an export/import from SQL Server to Oracle and I'd like some advice on the best way to go about it... The procedure will need to be fully automated and run nightly, so that excludes using the SQL developer tools. I also can't make a live link between databases from our (Oracle) side as the firewall is in the way. The data needs to be transformed in the process from a star schema to a de-normalised table ready for reporting.
What I'm thinking about is writing a monster query for SQL Server (which I mostly have already) that will denormalise and read out the data from SQL Server into a flat file using the SQL Server equivalent of sqlplus (sqlcmd) as a scheduled task, dump it into a Well Known Location, then on the Oracle side have a cron job that copies down the file, loads it with SQL*Loader and rebuilds indexes etc.
This is all doable, but very manual. Is there one, or a combination of, FOSS or standard Oracle/SQL Server tools that could automate this for me? The irreducible complexity is the query on one side and building indexes on the other, but I would love not to have to write the CSV-dumping detail or the SQL*Loader script; just say "dump this view out to CSV" on one side, and on the other "truncate and insert into this table from CSV", without worrying about mapping column names and all the other arcane sqlldr voodoo...
Best practices? Thoughts? Comments?
Edit: I have about 50+ columns, all of varying types and lengths, in my dataset, which is why I'd prefer not to have to write out how to generate and map every single column...
"The data needs to be transformed in the process from a star schema to a de-normalised table ready for reporting."
You are really looking for an ETL tool. If you have no money in the till, I suggest you check out the Open Source Talend and Pentaho offerings.
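If the flat-file route is kept rather than a full ETL tool, one way to avoid hand-writing SQL*Loader control files on the Oracle side is to define the CSV as an external table; a sketch with assumed directory, file, and column names:

    -- data_dir is an Oracle directory object (CREATE DIRECTORY data_dir AS '/path/to/drop');
    -- the column list here is illustrative only.
    CREATE TABLE reporting_stage_ext (
        customer_id   NUMBER,
        customer_name VARCHAR2(100),
        total_amount  NUMBER
    )
    ORGANIZATION EXTERNAL (
        TYPE ORACLE_LOADER
        DEFAULT DIRECTORY data_dir
        ACCESS PARAMETERS (
            RECORDS DELIMITED BY NEWLINE
            FIELDS TERMINATED BY ','
            MISSING FIELD VALUES ARE NULL
        )
        LOCATION ('nightly_dump.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- The nightly Oracle job then reduces to a truncate-and-insert:
    -- TRUNCATE TABLE reporting_table;
    -- INSERT INTO reporting_table SELECT * FROM reporting_stage_ext;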
