SQL Server : loading flat file from Informix - sql-server

How can I load my text file into a SQL Server database if my flat file doesn't contain any row delimiter?
I tried using OPENROWSET, but it put all of the data into a single column.
I also tried generating a format file with
bcp dbname.owner.tablename format nul -c -f tablename-c.fmt -T
but it fails with a network-related or instance-specific error.
And I also can't find the AlwaysCheckForRowDelimiters property.
Are there any other options for loading these types of data?
The text file has only a column delimiter and is not fixed length; it is an unload file from an Informix database.
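If every field in the unload file, including the last one in each record, is followed by the same delimiter (Informix unloads are typically pipe-delimited with a trailing '|'), one workaround is to point both terminators at that delimiter so the final field delimiter doubles as the row break. This is only a sketch with placeholder table and file names, not a verified fix for your file:
-- dbo.TargetTable and the file path are placeholders; '|' is the assumed delimiter
BULK INSERT dbo.TargetTable
FROM 'C:\data\informix_unload.txt'
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '|'    -- the delimiter after the last column stands in for the missing row delimiter
);
The same idea works with bcp's -t and -r switches if you prefer the command line.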

Related

How to export all tables within a Sybase 12.5 database to multiple flat files?

My customer runs a very old (it seems to me) Sybase 12.5.2 database. I want/need to export all tables from a database to multiple flat (text) files, one per table. I have access to the isql command-line prompt with the admin user. I have never worked with a Sybase database before.
Sybase Adaptive Server Enterprise (ASE) allows multiple databases to be hosted. You don't specify whether only one of the databases in the database server needs to be exported or if all of them do.
For each database, the following query will list the names of the tables
select name from sysobjects where type = 'U'
Sybase ASE also comes with a tool called "bcp" which stands for "Bulk Copy". It is an easy way of creating a flat file of a table's contents.
bcp database.schema.table out file_name -c -U username -S server_name
It has more options that may be of interest, especially around field and row terminators. Documentation for the most relevant version (12.5.1) can be found here:
http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.dc30191_1251/html/utility/BABGCCIC.htm
I have been using BCP commands to export data from Sybase environments. bcp is a command-line utility that you can use to export data from several types of databases.
Below is a basic example you can try:
bcp TABLE_NAME out OUTPUT_FILE_PATH\FILENAME.dat -S SERVER_NAME -U USERNAME -P PASSWORD -c -t field_terminator -r row_terminator -e ERROR_FILE_PATH\ERRORS.dat
You can create a batch file with such commands and do multiple exports on one hit.
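One way to build that batch file (a sketch only; the login and server name are placeholders) is to let the server generate the bcp commands for you, then save the isql output as a .bat file:
select 'bcp ' + db_name() + '..' + name + ' out ' + name + '.txt -c -U username -S server_name'
from sysobjects
where type = 'U'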
If you have access to an ETL tool, you can export the data with it as well.

BCP: Unable to use Native formats while exporting

This is how I am using the BCP:
-- Generating format file in native content (-n)
bcp Sales.dbo.Num format nul -n -f D:\Format.fmt -T -S .\sqlexpress
-- Exporting the table using above format file
bcp Sales.dbo.Num out D:\FactNum.csv -f D:\Format.fmt -T -S .\sqlexpress
But whatever I do the output .csv is always gibberish.
P.S.
It works fine with the character version of the format file (the -c parameter).
I have tried the XML native version of the format file, but it did not help.
I have gone through this and the basic BCP documentation on MSDN.
Suggestions please.
"gibberish" as you put it, is what you get when you use native (-n) data. The data is written to the file in SQL Server's native format, not ascii text. I believe you want to see ASCII text? If so, that is what the character (-c) option is for.
The use of a format file has nothing to do with the "format" of the file in this regard.
There is the format of the data (set by the -c, -n and other options) and then there is the format of the columns... that is defined by the format file if you so choose.
You are exporting data in native format, but you want it to look like data that is not in SQL native format. SQL native format is not for consumption by human beings... it's made to be read by a SQL Server.
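For readable output, the question itself already points at the fix: generate and use the character-mode artifacts instead. A minimal sketch based on the commands above (same paths and instance, carried over from the question):
-- Generating the format file in character mode (-c) instead of native (-n)
bcp Sales.dbo.Num format nul -c -f D:\Format.fmt -T -S .\sqlexpress
-- Exporting the table as plain text using that format file
bcp Sales.dbo.Num out D:\FactNum.csv -f D:\Format.fmt -T -S .\sqlexpress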

SQL SERVER - reading excel files content and transfer to sql database using xp_cmdshell

I was shocked when I learned that importing Excel data into a SQL database using OPENROWSET has a downside: it truncates cell values to 255 characters before passing them to the database. I'm now thinking of using xp_cmdshell to read the Excel file's data and transfer it to the database; however, I'm clueless about how I could do that. Could somebody help me achieve that?
Yes, BCP can be used to import data from Excel (.xlsx) files into SQL Server tables. The one thing to remember here, from the MS documentation:
Prerequisite - Save Excel data as text. To use the rest of the methods described on this page - the BULK INSERT statement, the BCP tool, or Azure Data Factory - first you have to export your Excel data to a text file.
In Excel, select File | Save As and then select Text (Tab delimited) (.txt) or CSV (Comma delimited) (.csv) as the destination file type.
A sample BCP command to import data from an Excel file (converted to tab-delimited) into a SQL Server table:
bcp.exe "MyDB_Copy.dbo.Product" in "C:\Users\abhishek\Documents\BCPSample.txt" -c -t"\t" -r"\n" -S ABHISHEK-HP -T -h TABLOCK
Read more about BCP and import here.
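Since the question specifically asks about xp_cmdshell, the same bcp call can also be issued from T-SQL. This is only a sketch reusing the sample names above, and it assumes xp_cmdshell has already been enabled on the instance via sp_configure:
-- Run the bcp import from T-SQL; the paths, server name and table are the sample values from above
EXEC master..xp_cmdshell 'bcp.exe "MyDB_Copy.dbo.Product" in "C:\Users\abhishek\Documents\BCPSample.txt" -c -t"\t" -r"\n" -S ABHISHEK-HP -T -h TABLOCK';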

redirect the records to different folder if the format is incorrect using SSIS

I have created an SSIS package to load comma-delimited flat file data into a SQL Server destination table. To achieve this I used a Data Flow Task, a Flat File Source, and an OLE DB Destination. I configured the flat file connection manager with a comma as the column delimiter and CRLF as the row delimiter.
With this setup, the package successfully imports the comma-separated flat file data into the SQL Server destination table.
Now the question is: is there any option to send records that are pipe-delimited to a different folder containing only the error records?

Sqlcmd script task to validate source file and target table

Let me be very specific about what I need. Problem statement 1: I have a few text files with numerous records delimited by pipes. These files sit on a remote Windows server and are the source files for our daily load into a SQL Server table. I want to write a script that gets the file name from the source file path and the number of records in it.
The next task is to get the count from a SQL Server table that is on another (remote) server, and then validate the counts of the table and the file against each other.
Please help.....
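One possible approach, sketched under heavy assumptions (the UNC path, linked server and table names are hypothetical, and xp_cmdshell must be enabled), is to count the file's lines from T-SQL and compare that with the table's row count:
-- Count the lines in the source file; find /c /v "" prints the total line count
CREATE TABLE #file_count (output_line NVARCHAR(4000));
INSERT INTO #file_count
EXEC master..xp_cmdshell 'find /c /v "" "\\remoteserver\share\daily_load.txt"';
-- The count appears after the colon, e.g. ---------- \\REMOTESERVER\SHARE\DAILY_LOAD.TXT: 1234
SELECT output_line FROM #file_count WHERE output_line LIKE '%:%';
-- Row count of the target table on the other server, via a hypothetical linked server
SELECT COUNT(*) AS table_rows FROM REMOTESQL.TargetDB.dbo.TargetTable;
DROP TABLE #file_count;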
