How to convert a CSV file into bcp formatted file? - sql-server
I'm working on importing a CSV file into a SQL Server table. I'm using the bcp tool because my data can be large. The issue I'm facing with bcp is that the target table can have a mix of data types (date, int, etc.), and if I use bcp in native mode (-n), it expects a native-format file as input, but what I have is a CSV file.
Is there any way to convert a CSV file into a bcp native-format file? Or:
How can I import a CSV file into a SQL Server table given that my table columns can have any data type, not just character types?
Had all the columns been of character type, I would have used the bcp tool with the -c option.
Actually, the safest thing to do when importing data, especially in bulk like this, is to import it into a staging table first, one in which all of the fields are string/varchar. That allows you to scrub and validate the data and make sure it's safe for consumption. Once you've verified it, move or copy it to your production tables, converting it to the proper types as you go. That's typically what I do when dealing with imported data.
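The staging-table idea can be sketched in code: generate an all-VARCHAR CREATE TABLE statement from the CSV header, then load into that before converting types. A minimal Python sketch, assuming a CSV with a header row; the table name and VARCHAR(4000) width are my choices, not from the original:

```python
import csv
import io

def staging_table_ddl(csv_text, table_name):
    """Generate CREATE TABLE DDL with all-VARCHAR columns from a CSV header row."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)  # first row holds the column names
    cols = ",\n".join(f"    [{name}] VARCHAR(4000)" for name in header)
    return f"CREATE TABLE {table_name} (\n{cols}\n);"

# Hypothetical CSV with mixed underlying types; they all land as VARCHAR here.
ddl = staging_table_ddl("id,order_date,amount\n1,2020-01-01,9.99\n", "dbo.Staging_Orders")
print(ddl)
```

After bulk-loading into the staging table, an INSERT ... SELECT with explicit CONVERT/TRY_CONVERT calls moves the validated rows into the real table.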
A CSV file is just a text file delimited by commas. With regard to importing text files, there is no such thing as a "BCP file". BCP has an option to work with SQL Server's native data format (unreadable to the human eye with a text editor), but its default is to work with plain text, the same as what you have in your CSV file. No conversion is needed: when working with textual data, your input is just an ASCII text file.
Whoever created the text file has already converted their native datatypes into text. As others have suggested, you will save yourself some pain later if you load the textual CSV data file into a "load" table of all VARCHAR fields. From that load table you can then manipulate the data into whatever datatypes you require in your final destination table. This is better than making SQL Server do implicit conversions by having BCP insert data directly into the final destination table.
Related
Is it possible to inspect a MS SQL Server dump (.dat-binary, exported with bcp)?
I'm having trouble importing a binary .dat file that was generated with the command-line tool bcp. Besides the unhelpful error message I get ("invalid field size for datatype", with no mention of which column or datatype), I have nothing to go on. I did not generate the bcp file, so there is that. It's a large file and I have not found any way to see its contents. Is it somehow possible to convert the binary .dat file into a flat text .csv file? Or to just extract the first couple of rows as clear text? Thanks for any hints.
BAI2 file needs to be loaded into SSIS
How can I load a BAI2 file into SSIS? BAI2 is an industry-standard format used by banks. Below is one truncated example: 01,021000021,CST_USER,110520,1610,1627,,,2/ 02,CST_USER,089900137,1,110509,1610,,2/ 03,000000370053368,USD,010,782711622,,,015,7620008 12,,,040,760753198,,/ 88,043,760000052,,,045,760010026,,,050,760000040,, ,055,760000045,,/
Use a Flat File Connection Manager
I think you can import these files using a Flat File Connection Manager, because they are similar to comma-separated text; try changing the row delimiter and column delimiter properties to find the appropriate ones. From the example you mentioned, I think you should use , as the column delimiter and / as the row delimiter.
To learn more about how to interpret a BAI2 file, check the following link: EBS – How to interpret a BAI2 file. Based on that link, the BAI2 file is a plain text file (.TXT format) which contains values/texts one after the other. Because the number of columns is not fixed across rows, you must define only one column (DT_STR, 4000) in the Flat File Connection Manager and split the columns using a Script Component:
SSIS ragged file not recognized CRLF
how to check column structure in ssis?
SSIS: Creating a flat file with different row formats
Helpful links
SQL SERVER – Import CSV File into Database Table Using SSIS
Importing Flat Files with Inconsistent Formatting Using SSIS
SSIS Lesson 2: First Package
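The Script Component split described above (one wide column in, / as record terminator, , as field delimiter) can be sketched outside SSIS. A hedged Python illustration of the same parsing logic; the sample line is from the question, everything else is an assumption:

```python
def parse_bai2(text):
    """Split a BAI2 stream into records on '/' and each record into fields on ','."""
    records = []
    for raw in text.split("/"):
        raw = raw.strip()
        if not raw:
            continue  # skip the empty tail after the final '/'
        records.append(raw.split(","))
    return records

# Truncated sample from the question: record code 01 (file header), 02 (group header).
sample = "01,021000021,CST_USER,110520,1610,1627,,,2/02,CST_USER,089900137,1,110509,1610,,2/"
recs = parse_bai2(sample)
for rec in recs:
    print(rec[0], len(rec))  # record-type code and field count vary per record
```

The varying field count per record is exactly why the answer suggests defining a single DT_STR column and splitting downstream rather than fixing the column layout in the connection manager.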
SQL Server : bulk insert from csv string stored in column
I store CSV strings in a database table. I later need to create a temporary table from that CSV, but BULK INSERT only accepts a filename as the data source. Is there any way to import from a string? Thank you and regards, Gabriel
In general, it is not desirable to store unnormalized CSV data in a SQL Server table; it makes the data very hard to query and work with. That being said, sometimes we have to live with bad design decisions, so you could try writing your CSV column to a file. From the Query menu of SSMS choose SQLCMD Mode, then run the following:
:OUT c:\path\to\your\file.csv
SET NOCOUNT ON; SELECT csv_column FROM dbo.yourTable
Now that you have a bona fide CSV file, you should be able to use BULK INSERT. Note that I have assumed the CSV data you want to import is contained within a single column csv_column, and that the data is well formed (e.g. each record has the same number of commas, etc.).
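The same write-to-file-then-bulk-load workaround can be sketched programmatically: dump the column's CSV strings to a temp file that BULK INSERT (or any CSV reader) can consume. A minimal Python sketch; the row values stand in for hypothetical contents of csv_column:

```python
import csv
import io
import os
import tempfile

def csv_column_to_file(rows):
    """Write CSV strings (one per source-table row) to a temp file for bulk loading."""
    fd, path = tempfile.mkstemp(suffix=".csv")
    with os.fdopen(fd, "w", newline="") as f:
        for csv_string in rows:
            f.write(csv_string.rstrip("\n") + "\n")  # one CSV record per line
    return path

# Hypothetical values of dbo.yourTable.csv_column:
path = csv_column_to_file(["1,alice,2020-01-01", "2,bob,2020-02-02"])
with open(path, newline="") as f:
    parsed = list(csv.reader(f))  # verify the file round-trips as valid CSV
print(parsed)
os.remove(path)
```

In production you would point BULK INSERT (or bcp -c) at the written file instead of parsing it back in Python; the round-trip here just checks the file is well formed.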
Guessing/determining the column data types for a CSV?
I have a CSV file (suppose it's an RFC-compliant CSV with no syntax errors), with a header line containing column names. However, I don't have the intended data type of each column. What would be a quick-and-not-so-dirty way to guess those column types? Notes: Motivation: I want to load the CSV into a DB table, but I have to create the table first. I'm interested in a shell-script'y solution, but other alternatives might be relevant.
You can use SQL*Loader to load a CSV file in an accurate way. If you want to know how to create SQL loader
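A quick-and-not-so-dirty type guesser can also be scripted directly. A minimal Python sketch that samples each column and picks the narrowest type that fits every non-empty value; the returned SQL type names and the single date format are my assumptions, not from the original:

```python
import csv
import io
from datetime import datetime

def guess_type(values):
    """Guess a SQL type for a column from its values (narrowest type wins)."""
    def all_match(fn):
        return all(fn(v) for v in values if v != "")  # blanks don't veto a type
    def is_int(v):
        try: int(v); return True
        except ValueError: return False
    def is_float(v):
        try: float(v); return True
        except ValueError: return False
    def is_date(v):  # assumption: ISO dates only; extend with more formats as needed
        try: datetime.strptime(v, "%Y-%m-%d"); return True
        except ValueError: return False
    if all_match(is_int):
        return "INT"
    if all_match(is_float):
        return "FLOAT"
    if all_match(is_date):
        return "DATE"
    return "VARCHAR(4000)"  # fallback when nothing narrower fits

def guess_csv_types(csv_text):
    """Map each header name to a guessed SQL type from the data rows."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, data = rows[0], rows[1:]
    return {name: guess_type([r[i] for r in data]) for i, name in enumerate(header)}

sample = "id,when,amount,name\n1,2020-01-01,9.99,alice\n2,2020-02-02,10,bob\n"
print(guess_csv_types(sample))
```

For large files, sampling the first few thousand rows instead of the whole file keeps this fast, at the risk of guessing too narrow a type.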
Importing a CSV File in MongoDB
I am importing a large file of 9.5 million records into MongoDB using mongoimport. The CSV file was created for SQL Server import operations, and because the data in the SQL table itself contained quotes ("), it was generating errors when importing into SQL Server. That is why it uses the pipe (|) as the text qualifier and as the column delimiter. With mongoimport I can't find anything that lets me set the column delimiter and text qualifier myself. The screenshot contains the error message.
You cannot. Since it is using | as the delimiter, it's no longer valid CSV. Your options are to export the data to JSON, TSV, or proper CSV and then use mongoimport, or to do it programmatically, which is a perfectly valid and fast option since Mongo doesn't wait for each insert to complete, so it will be quite fast.
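The re-export route can be sketched as a small conversion script: read the pipe-delimited file and rewrite it as TSV, which mongoimport accepts via --type tsv. A hedged Python sketch; the field names are made up, and it assumes values contain no embedded tabs:

```python
import csv
import io

def pipe_to_tsv(pipe_text):
    """Re-encode a pipe-delimited file as TSV so mongoimport --type tsv can read it."""
    reader = csv.reader(io.StringIO(pipe_text), delimiter="|")
    out = io.StringIO()
    writer = csv.writer(out, delimiter="\t", lineterminator="\n")
    for row in reader:
        writer.writerow(row)
    return out.getvalue()

# Hypothetical pipe-delimited input with an embedded quote, which TSV handles fine:
sample = 'id|name|quote\n1|alice|she said "hi"\n'
tsv = pipe_to_tsv(sample)
print(tsv)
```

For a 9.5-million-row file you would stream line by line to disk rather than build the result in memory, then run something like mongoimport --type tsv --headerline on the output.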