Import flat file into SQL Server

I'm trying to import a flat file into SQL Server and I'm having some issues. The column delimiter is ;~ and the row delimiter is |~. I'm using the SQL Server Import and Export Wizard but keep getting errors. Have any of you ever had a similar issue? I think I'm doing it wrong from the start of the wizard. Can any of you talk me through the steps? Thanks.
Here is the import error:

It's a familiar-sounding problem, but difficult to be sure without a data file to play with - perhaps one of these posts has your answer:
Text was truncated or one or more characters had no match in the target code page including the primary key in an unpivot
Errors in SQL Server while importing CSV file despite varchar(MAX) being used for each column
SQL Server Import wizard fails with incomprehensible message

Thanks to all of you who responded.
My solution was to first create the tables in SQL Server using the varchar(max) data type for each column and then execute a BULK INSERT statement. Definitely not ideal, but the goal was not to create a database but rather to delimit the data so I could upload it to an application. Thanks again.
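For anyone landing here later, a minimal sketch of that approach (the table name, column names, and file path below are placeholders, not from the original post):

-- Sketch only: StagingImport, its columns, and the file path are placeholders.
CREATE TABLE dbo.StagingImport (
    Col1 varchar(max),
    Col2 varchar(max),
    Col3 varchar(max)
);

-- BULK INSERT accepts multi-character terminators, so the ;~ column
-- delimiter and |~ row delimiter from the question can be used directly.
BULK INSERT dbo.StagingImport
FROM 'C:\data\flatfile.txt'
WITH (
    FIELDTERMINATOR = ';~',
    ROWTERMINATOR   = '|~'
);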

Related

How to generate Insert statement from PGAdmin4 Tool?

We are writing a new application, and while testing, we will need a bunch of dummy data. I've added that data by using MS Access to dump Excel files into the relevant tables in the Postgres database.
What should I do now to generate INSERT statements from the pgAdmin 4 tool, similar to what SQL Server Management Studio allows us to generate for SQL Server? There are no options available to me. I can't use the closest one, which is to export and import the data via CSV.
I understand that you cannot import the CSV file into the actual DB, as this needs to be done through ASP.NET Core EF. However, you could create a test schema and import the CSV file into that test schema. Once you have the data imported into the test schema, you can use it to generate SQL statements using the steps below:
Right-click the target table and select "Backup".
Select a file path to store the backup. You can save the file as data.backup.
Choose "Plain" as the Format.
Open the "Options" tab and check "Use Column Inserts".
Click the Backup button.
Once the file has been generated, you can open it with Notepad++ or VS Code to get the SQL INSERT statements.
You can use the generated statements and then delete the test schema you created.
Here is a resource that might help you in loading data from an Excel file into PostgreSQL, if you still need to take that path: Transfer Data from Excel to PostgreSQL
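For reference, a backup file produced with the "Plain" format and "Use Column Inserts" checked typically looks like the excerpt below (the table and column names here are just placeholders):

-- Illustrative excerpt of the generated data.backup file; names are placeholders.
INSERT INTO public.customers (id, name, email) VALUES (1, 'Alice', 'alice@example.com');
INSERT INTO public.customers (id, name, email) VALUES (2, 'Bob', 'bob@example.com');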

Import DBF file into SQL Server : error "The search key was not found in any record"

I'm trying to import an external vendor's .dbf file into our SQL Server 2016 database for reporting. I've created a linked server and can import from most of the tables, except for a few which produce the error
The Search key was not found in any record
I'm doing a simple select to test the connection
SELECT * FROM [LINKED_SERVER]...[packet]
If I copy the .DBF file to my computer and use "DBF Viewer 2000" to remove empty records, the select query works fine, so I suspect something in the file is corrupted.
I copy the .DBF file to my DB server before I import it. The vendor insists the .DBF file isn't corrupt, so I need to handle this on my side. Is there a way to handle this in SQL? Alternatively, does anyone know of a free command line application that I can use to run a repair and remove empty records?
Thanks
Stephen
A possible cause of this error is a space in a column name. Carefully check for leading spaces, trailing spaces, and spaces inside the column names.
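If you want to inspect the column names the linked server actually reports (to spot any stray spaces), one option is sp_columns_ex; LINKED_SERVER and packet below are the names from the question:

-- List the column metadata the linked server exposes for the packet table.
EXEC sp_columns_ex
    @table_server = N'LINKED_SERVER',
    @table_name   = N'packet';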

Errors while importing data from flat file to database table

I got this error message while importing data from a flat file to a SQL Server table using the Import and Export Wizard in SSMS:
The mapping window:
I know there is an issue regarding a datatype mismatch; how should I overcome it?
As per the error, it indicates that your flat file is not in the proper format; some rows in the file have a missing delimiter (comma or tab).
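If you would rather locate the offending rows than fix them by hand, one workaround outside the wizard is a BULK INSERT with an error file, so rejected rows are written out instead of failing the whole load. The table name, file paths, and delimiters below are assumptions:

-- Placeholder names throughout; adjust the delimiters to match your file.
BULK INSERT dbo.TargetTable
FROM 'C:\data\input.txt'
WITH (
    FIELDTERMINATOR = ',',                        -- or '\t' for tab-delimited files
    ROWTERMINATOR   = '\n',
    MAXERRORS       = 50,                         -- keep loading past a few bad rows
    ERRORFILE       = 'C:\data\input_errors.txt'  -- rejected rows land here for inspection
);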

SQL Server import export assistant error when importing time type

I'm trying to import a table from a comma-separated .txt file; the file was generated by the same SQL Server Import/Export Wizard from my development database server.
Many tables have been imported this way successfully.
The problem occurs when I have a column of type date.
The message is: conversion unknown!
And then I cannot execute the package.
How can I solve this problem?

Migrating from Postgres to SQL Server 2008

I need to migrate a database from Postgres 7 to SQL Server 2008. I am familiar with the SSIS Import and Export Wizard but I am stumped about how to define the data source or the data provider.
What is the best way to migrate Postgres to SQL Server, and how do I define data sources/drivers for postgres?
I was having problems using the Import Wizard in SQL Server 2008 R2 to import tables from PostgreSQL. I had the PostgreSQL ODBC driver installed, so for the Data Source in the Import Wizard I chose ".Net Framework Data Provider for Odbc" and supplied the DSN name for my PostgreSQL database. The wizard found the tables okay, but when I went to perform the import I got the error
Column information for the source and destination data could not be retrieved.
“Billing” -> [dbo].[Billing]:
– Cannot find column -1.
I found the solution in the Microsoft blog post here. Apparently the problem is that various ODBC drivers use different attribute names when reporting column metadata. To get the import to work I had to edit the "ProviderDescriptors.xml" file, which was located at
C:\Program Files\Microsoft SQL Server\100\DTS\ProviderDescriptors\ProviderDescriptors.xml
In the ...
<dtm:ProviderDescriptor SourceType="System.Data.Odbc.OdbcConnection">
... element I had to change the attributes from ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "COLUMN_SIZE"
NumericPrecisionColumnName = "COLUMN_SIZE"
NumericScaleColumnName = "DECIMAL_DIGITS"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
... to ...
<dtm:ColumnSchemaAttributes
NameColumnName = "COLUMN_NAME"
OrdinalPositionColumnName="ORDINAL_POSITION"
DataTypeColumnName = "TYPE_NAME"
MaximumLengthColumnName = "LENGTH"
NumericPrecisionColumnName = "PRECISION"
NumericScaleColumnName = "SCALE"
NullableColumnName="NULLABLE"
NumberOfColumnRestrictions="4"
/>
That is, I had to tweak the MaximumLengthColumnName, NumericPrecisionColumnName, and NumericScaleColumnName attribute values to "LENGTH", "PRECISION", and "SCALE", respectively.
Once that change was made the import from PostgreSQL to SQL Server ran successfully.
I wish you the best of luck in trying to import from PostgreSQL into SQL Server using SQL Server Import and Export Wizard. However, I have read numerous message board threads with people having trouble getting it to work. For example:
Import Data from Postgresql to SQL Server 08 Error
Here is the most helpful thread that I have found on the topic:
Import data from postgreSQL into SQL server 2005
To help someone who might be trying to achieve a similar goal to mine: instead of selecting the “PostgreSQL OLE DB Provider” in the data source drop-down menu of the SQL Server Import and Export Wizard, select “.Net Framework Data Provider for Odbc”.
Then you have to create a DSN and provide a ConnectionString. The following ConnectionString worked for me:
Driver={PostgreSQL};Server=localhost;Port=5432;Database=TestMasterMap;Uid=postgres;Pwd=;
To create a DSN you have to go into Administrative Tools → Data Sources (ODBC) and create a user DSN. Once this is done you can supply the DSN name in the DSN text box of the SQL Server Import and Export Wizard.
One commenter claimed that it worked, but that he got "Out of memory while reading tuples" errors on big tables. So for tables with more than 3 million rows, he had to break the import up into 3 million row chunks.
Also, there's a link to the native .NET provider for PostgreSQL in that thread.
Personally, if this is something that I only had to do once, and if I understood the schema and the data fairly well, I would try:
export the data from PostgreSQL as flat files
create the schema in SQL Server (without PKs or constraints)
use the SSIS Import/Export Wizard to import the flat files
then create PKs and necessary constraints
It might take you less time to do the above than messing with SSIS Import/Export Wizard and PostgreSQL for days (but it would be nice if those tools worked!)
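As a rough sketch of steps 2 and 4 above (the table and column names are made up for illustration):

-- Step 2: create the destination table with no keys or constraints, so the
-- flat-file import cannot fail on constraint violations (names are made up).
CREATE TABLE dbo.Billing (
    BillingId  int           NOT NULL,
    CustomerId int           NOT NULL,
    Amount     decimal(18,2) NULL
);

-- Step 4: once the flat files have been imported, add the keys and constraints.
ALTER TABLE dbo.Billing
    ADD CONSTRAINT PK_Billing PRIMARY KEY (BillingId);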
As I finished commenting on the answer above, I thought of trying SQL Workbench/J; it has a datapump feature that worked pretty well for me. I managed to export data from my PostgreSQL database to a SQL Server instance.
For those who'd like to run this in batch mode (via shell), here's how to do it: Google Groups Thread. The WbCopy command mentioned in the discussion isn't really documented anywhere I could find, but you can generate one through the datapump interface and then change whatever you need.
To give a more practical example of how you can achieve what's described in the marked answer: you can export from PostgreSQL to flat files, then use the bcp utility to import them into SQL Server.
For example, in a .bat file, for a single table (and you need to have the table already created in the destination SQL DB):
@echo off
set DbName=YOUR_POSTGRES_DB_NAME
set csvpath=C:\PATH_TO_CSV\CSV_NAME.csv
set username=YOUR_POSTGRES_DB_USERNAME
:: Export to CSV, note we're using a ~ delimiter to avoid issues with commas in fields
psql -U %username% -d %DbName% -c "COPY (select * from SOURCE_TABLE_NAME) TO STDOUT (FORMAT CSV, HEADER TRUE, DELIMITER '~', ENCODING 'UTF8');" > %csvpath%
:: Import CSV to SQL Server
set logpath=C:\bcplog.txt
set errorlogpath=C:\bcperrors.txt
set sqlserver=YOUR_SQL_SERVER
set sqldb=YOUR_DB_NAME
:: code page 65001 = UTF-8
:: -F2 skips the header row written by the COPY ... HEADER TRUE export above
bcp DESTINATION_TABLE_NAME IN %csvpath% -t~ -F2 -c -C65001 -S %sqlserver% -d %sqldb% -T -o %logpath% -e %errorlogpath%
