Export SQL Server (64-bit) to Excel (32-bit)

I tried to export data from SQL Server to Excel, but it does not work.
My code is:
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0 Xml;HDR=YES;Database=D:\FATXL.xlsx;',
    'SELECT * FROM [Sheet1$]')
SELECT * FROM tabel1
Error
The 32-bit OLE DB provider "Microsoft.ACE.OLEDB.12.0" cannot be
loaded in-process on a 64-bit SQL Server.
I need to know if it's possible to export data from 64-bit SQL Server to 32-bit MS Excel, and how to do it.
Do SQL Server and Excel need to have the same bitness for this to work?

Here are some alternative solutions that never fail.
Alternative solution 1
Caveat: You'd need to run this query manually in SSMS.
Use Export to save your results to a CSV file, which can be opened in Excel and saved as XLS.
Alternative solution 2
Caveat: You'd need to run this query manually in SSMS.
You can change the query results display option from grid to file (the file can be a CSV) and also enable the SET NOCOUNT option, as in the sketch below.
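For instance, a minimal sketch of such a query, reusing the tabel1 table from the question (with results going to a file, SET NOCOUNT ON keeps the rows-affected message out of the CSV):

SET NOCOUNT ON;  -- suppress the "(N rows affected)" line in the output file
SELECT * FROM tabel1;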
Alternative solution 3
Caveat: Once again, you need SSMS.
Right-click the database name and select Tasks > Export Data from the menu, which should open the SQL Server Import and Export Wizard. You can also save the result as a package, deploy it under the SSIS catalog, and run it as a scheduled job.
Alternative solution 4
Write an SSIS package to get data from SQL Server and put it into Excel.

Related

Automation - File Upload - Microsoft SQL Server Management Studio

I have to upload text files weekly from a server location to Microsoft SQL Server Management Studio. I wish to automate the task so that the files are uploaded automatically. Can somebody suggest a way?
Methods I know of:
Via SQL:
Use OPENROWSET to open the file and obtain the records to write into a table.
Use BULK INSERT to open the file and insert directly into a table (you may need to pair it with xp_cmdshell to get a directory listing to loop through); a minimal sketch appears after this list.
Via SSIS:
Create a Data Flow to import from the file.
SSIS makes it easier to do clever things with the import process, but it can be very finicky.
With both of those you can set up an Agent job to run the script / package automatically.
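As promised above, a minimal BULK INSERT sketch; the table name, file path, and tab-delimited layout are assumptions for illustration, not details from the question:

BULK INSERT dbo.StagingWeekly        -- hypothetical staging table
FROM 'D:\Imports\weekly_upload.txt'  -- hypothetical file path
WITH (
    FIELDTERMINATOR = '\t',  -- tab-delimited fields
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2      -- skip a header row, if the file has one
);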
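And a rough sketch of the Agent side (job name, schedule, and command are all hypothetical), wrapping that load in a job that runs weekly:

USE msdb;
-- One T-SQL step that runs the BULK INSERT above.
EXEC dbo.sp_add_job @job_name = N'WeeklyTextImport';
EXEC dbo.sp_add_jobstep
    @job_name  = N'WeeklyTextImport',
    @step_name = N'Load file',
    @subsystem = N'TSQL',
    @command   = N'BULK INSERT dbo.StagingWeekly FROM ''D:\Imports\weekly_upload.txt'' WITH (FIELDTERMINATOR = ''\t'', ROWTERMINATOR = ''\n'', FIRSTROW = 2);';
-- Run once a week, Mondays at 06:00.
EXEC dbo.sp_add_jobschedule
    @job_name = N'WeeklyTextImport',
    @name = N'Weekly',
    @freq_type = 8,              -- 8 = weekly
    @freq_interval = 2,          -- 2 = Monday
    @freq_recurrence_factor = 1, -- every 1 week
    @active_start_time = 060000; -- 06:00:00
EXEC dbo.sp_add_jobserver @job_name = N'WeeklyTextImport';  -- bind to the local server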

SSIS Excel Destination is Empty

I'm having an issue exporting a large dataset (500k+ rows) to Excel via SSIS, where the output file ends up with 0 rows exported. Before saying that I shouldn't be exporting that many records to Excel, let me state that I know, and normally wouldn't. Accounting does not want a CSV and is unwilling to open a CSV in Excel.
Using Visual Studio 2012 SSDT, here are the components involved.
Execute SQL Task -> Creates the empty file with headers
Data Flow Task ->
OLE DB Source -> SQL Query
Excel Destination
While the package is running, you can see records flowing from the source to the destination. The package completes without error, but when you open the file, it's empty. The only thing in there is the header.
If I select the Top 1000 records and export to Excel, it works as intended.
Some things I've tried:
Export to Excel on the network
Export to Excel locally
Export to CSV to Excel on both network and locally
Export to an OLE DB Destination using Office Access Database Engine 12.0 with "Excel 12.0" extended properties.
Tried running as different users
All with the same outcome.
Can anyone provide any insight into why this may be happening and how to proceed?
We experienced a similar behaviour when running the ETL in a SQL Server Agent job. Debugging it in Visual Studio worked, however, so I do not know whether this solution applies to you.
The reason was that the user under which the package ran did not have access to C:\Users\Default.
I found this out by using Sysinternals Process Monitor.
I was inspired by this post: Empty Excel File permissions issue: SSIS Excel Destination buffers large record sets through C:\Users\Default
(I described my search for the bug on my blog, though unfortunately it is in German: https://www.csopro.de/biblog/2018/04/ssis-fehlerbehebung-bei-excel-destination-schreibt-keine-zeilen/)

How to schedule data insertion from dbf to SQL Server on 64-bit Windows Server 2012

I am working on a Windows Server 2012 64-bit. I want to be able to import data from a .dbf file into a SQL Server table. I used the import wizard and it worked correctly. However, I have SQL Server Express and can't schedule this insertion.
Is there another way to schedule the insertion of the .dbf data to the SQL Server tables, without the use of the SSIS package loader?
Update
I ended up using Python and writing a script to import from XML. However, I believe the answer by @Oleg was the most accurate, given the circumstances.
Thank you all!
You can also use DBF Commander Pro for this task:
Create a command line for your insertion - choose 'File -> Export to DBMS'. Specify the transfer options in the window that appears, then copy the command line from the bottom of the window:
Create a text .BAT file and insert the copied command line, e.g.:
"c:\Program Files\DBFCommander\DBFCommander.exe" -edb "D:\Data\customer.dbf" customer_table "Provider=SQLOLEDB.1;User ID=user1;Initial Catalog=test_db;Data Source=test_server"
Make a schedule using Windows Scheduler that will execute this .BAT file.
Additional info that may be useful for you:
Using DBF in batch mode
Export DBF file to SQL database
I suggest the following approach:
Create a C# script which uses the OleDbConnection (to fetch) and SqlConnection (to upload) objects to import data from the .DBF file into a SQL Server database table.
Using LINQPad, the LINQPad command-line utility (lprun.exe), and the Windows Scheduled Tasks service, automate the execution of that script file.
Useful links:
How to get data from DBF file using C#
How to load data into a database using C#
About LINQPad command-line utility
Another way is to create a SQL linked server over an ODBC DSN that points at the DBF. Use Windows Scheduler to call SQLCMD.EXE to run some SQL that copies the data in; a rough sketch follows.
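A hedged sketch of that setup (the DSN, linked server, and table names are hypothetical; customer_table is borrowed from the DBF Commander example above):

-- One-time setup: a linked server over the MSDASQL provider and an ODBC DSN
-- that points at the folder containing the DBF files.
EXEC sp_addlinkedserver
    @server     = 'DBF_LINK',
    @srvproduct = '',
    @provider   = 'MSDASQL',
    @datasrc    = 'MyDbfDsn';

-- The SQL that the scheduled SQLCMD.EXE call would run:
INSERT INTO dbo.customer_table
SELECT * FROM OPENQUERY(DBF_LINK, 'SELECT * FROM customer');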

Importing tab-delimited text file - can't change the column mappings SQL Server 2008

I have a very large tab-delimited text file I'm trying to import into SQL Server 2008. Some of the field names are greater than 50 characters and when I try to change the column mappings using Management Studio, I'm unable to change the data type (default is varchar) or the size (default is 50). The Edit SQL button is also grayed out. What gives with this?
I am importing the data through the import wizard in SSMS (right click on database name, tasks, import)
Sometimes, if you run the wizard and get an error, it creates the table anyway. You have to go into SQL, delete the table, and then start over; otherwise the wizard does not allow you to edit the table once it has been created.

Migrating from Postgres to SQL Server 2008

I need to migrate a database from Postgres 7 to SQL Server 2008. I am familiar with the SSIS Import and Export wizard but I am stumped about how to define the data source or define the data provider.
What is the best way to migrate Postgres to SQL Server, and how do I define data sources/drivers for postgres?
I was having problems using the Import Wizard in SQL Server 2008 R2 to import tables from PostgreSQL. I had the PostgreSQL ODBC driver installed, so for the Data Source in the Import Wizard I chose ".Net Framework Data Provider for Odbc" and supplied the DSN name for my PostgreSQL database. The wizard found the tables okay, but when I went to perform the import I got the error
Column information for the source and destination data could not be retrieved.
“Billing” -> [dbo].[Billing]:
– Cannot find column -1.
I found the solution in the Microsoft blog post here. Apparently the problem is that various ODBC drivers use different attribute names when reporting column metadata. To get the import to work I had to edit the "ProviderDescriptors.xml" file, which was located at
C:\Program Files\Microsoft SQL Server\100\DTS\ProviderDescriptors\ProviderDescriptors.xml
In the ...
<dtm:ProviderDescriptor SourceType="System.Data.Odbc.OdbcConnection">
... element I had to change the attributes from ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "COLUMN_SIZE"
NumericPrecisionColumnName = "COLUMN_SIZE"
NumericScaleColumnName = "DECIMAL_DIGITS"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
... to ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "LENGTH"
NumericPrecisionColumnName = "PRECISION"
NumericScaleColumnName = "SCALE"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
That is, I had to tweak the MaximumLengthColumnName, NumericPrecisionColumnName, and NumericScaleColumnName attribute values to "LENGTH", "PRECISION", and "SCALE", respectively.
Once that change was made the import from PostgreSQL to SQL Server ran successfully.
I wish you the best of luck in trying to import from PostgreSQL into SQL Server using SQL Server Import and Export Wizard. However, I have read numerous message board threads with people having trouble getting it to work. For example:
Import Data from Postgresql to SQL Server 08 Error
Here is the most helpful thread that I have found on the topic:
Import data from postgreSQL into SQL server 2005
To help someone who might be trying to achieve a similar goal to mine: instead of selecting the "PostgreSQL OLE DB Provider" in the data source drop-down menu of the SQL Server Import and Export Wizard, select ".Net Framework Data Provider for Odbc".
Then you have to make a DSN and provide a ConnectionString. The following ConnectionString worked for me:
Driver={PostgreSQL};Server=localhost;Port=5432;Database=TestMasterMap;Uid=postgres;Pwd=;
To make a DSN you have to go into Administrative Tools > Data Sources (ODBC) and create a user DSN. Once this is done, you can supply the DSN name in the DSN text box of the SQL Server Import and Export Wizard.
One commenter claimed that it worked, but that he got "Out of memory while reading tuples" errors on big tables. So for tables with more than 3 million rows, he had to break the import up into 3 million row chunks.
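On the PostgreSQL side, that chunking boils down to something like the following (the table and ordering column are hypothetical), with one wizard run per query:

-- First 3 million rows
SELECT * FROM big_table ORDER BY id LIMIT 3000000 OFFSET 0;
-- Next 3 million rows
SELECT * FROM big_table ORDER BY id LIMIT 3000000 OFFSET 3000000;
-- ...and so on until the table is exhausted.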
Also, there's a link to the native .NET provider for PostgreSQL in that thread.
Personally, if this is something that I only had to do once, and if I understood the schema and the data fairly well, I would try:
export the data from PostgreSQL as flat files
create the schema in SQL Server (without PKs or constraints)
use the SSIS Import/Export Wizard to import the flat files
then create PKs and necessary constraints
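As a one-line illustration of that last step (the key column is hypothetical; the Billing table name is borrowed from the error message quoted earlier):

ALTER TABLE dbo.Billing ADD CONSTRAINT PK_Billing PRIMARY KEY (BillingId);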
It might take you less time to do the above than messing with the SSIS Import/Export Wizard and PostgreSQL for days (but it would be nice if those tools worked!).
As I finished commenting on the answer above, I thought of trying SQL Workbench/J; it has a data pump feature that worked pretty well for me. I managed to export data from my PostgreSQL database to a SQL Server instance.
For those who'd like to run this in batch mode (via shell), here's how to do it: Google Groups Thread. The WbCopy command mentioned in the discussion isn't really documented anywhere I could find, but you can generate one through the data pump interface and then change whatever you need.
To give a more practical example of how you can achieve what's described in the marked answer: you can export from PostgreSQL to flat files, then use the bcp utility to import them into SQL Server.
e.g. in a .bat file, for a single table (and you need to have the table already created in the destination SQL DB):
@echo off
set DbName=YOUR_POSTGRES_DB_NAME
set csvpath=C:\PATH_TO_CSV\CSV_NAME.csv
set username=YOUR_POSTGRES_DB_USERNAME
:: Export to CSV, note we're using a ~ delimiter to avoid issues with commas in fields
psql -U %username% -d %DbName% -c "COPY (select * from SOURCE_TABLE_NAME) TO STDOUT (FORMAT CSV, HEADER TRUE, DELIMITER '~', ENCODING 'UTF8');" > %csvpath%
:: Import CSV to SQL Server
set logpath=C:\bcplog.txt
set errorlogpath=C:\bcperrors.txt
set sqlserver=YOUR_SQL_SERVER
set sqldb=YOUR_DB_NAME
:: code page 65001 = UTF-8; -F2 starts at row 2, skipping the header row written by COPY ... HEADER TRUE
bcp DESTINATION_TABLE_NAME IN %csvpath% -t~ -F2 -c -C65001 -S %sqlserver% -d %sqldb% -T -o %logpath% -e %errorlogpath%
