I'm trying to import a very big .csv file (about 2 GB) into a table in a SQL Server database (through SQL Server Management Studio).
I tried both the "Import Flat File..." wizard and BULK INSERT in the query console; both either take an extremely long time (so I stopped the execution) or simply fail on their own at the end.
BULK INSERT [table name]
FROM 'Path of the CSV'
WITH (FIRSTROW = 1,
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '0x0A')
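For reference, two options that are often suggested for loads this size are BATCHSIZE and TABLOCK, so the file isn't loaded as one huge transaction. A minimal sketch, keeping the placeholders above (the batch size is illustrative):

BULK INSERT [table name]
FROM 'Path of the CSV'
WITH (FIRSTROW = 1,
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '0x0A',
      BATCHSIZE = 100000,   -- commit every 100,000 rows instead of all at once
      TABLOCK);             -- take a table lock, which often speeds up bulk loads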
Maybe I'm doing something wrong?
I couldn't find any solution except splitting the data into smaller files, which I wanted to avoid.
Thank you.
Is there a way to bulk load data from Azure Blob to SQL Server? I'm accessing the files through Azure Storage Explorer. I'm pretty sure it can be done, but I'm getting an error that says the 'file could not be opened'. Here is my code.
DECLARE @cmd varchar(1000)
SET @cmd = 'BULK INSERT [dbo].[dest_table]
FROM ''alldata/2019/06/29/BB/dds_id.out.20190629.gz''
WITH ( FIELDTERMINATOR = ''\n'',
       FIRSTROW = 46,
       ROWTERMINATOR = ''' + CHAR(10) + ''')';
PRINT @cmd
EXEC(@cmd)
The file ends in .gz, so it's compressed. I think that's the problem here; can someone please confirm? More importantly, is there a workaround? All I have is SQL Server; no access to SSIS.
You're right: the compressed file is not a data file, so BULK INSERT can't read it.
The documentation for BULK INSERT (Transact-SQL) says:
Imports a data file into a database table or view in a user-specified format in SQL Server.
And for the data_file argument:
Is the full path of the data file that contains data to import into the specified table or view. BULK INSERT can import data from a disk (including network, floppy disk, hard disk, and so on).
One way is to uncompress the .gz file first, and then use BULK INSERT to load the resulting data file into SQL Server.
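If the decompressed copy stays in Blob Storage, BULK INSERT can also read it directly through an external data source (SQL Server 2017 and later). A minimal sketch, assuming a decompressed copy of the file exists; the credential, account, and container names are placeholders:

-- Sketch only: requires a database master key; names, SAS token, and paths are placeholders.
CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token, without the leading ?>';

CREATE EXTERNAL DATA SOURCE BlobStorage
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://<account>.blob.core.windows.net/<container>',
      CREDENTIAL = BlobCredential);

-- Load the decompressed copy of the file.
BULK INSERT [dbo].[dest_table]
FROM 'alldata/2019/06/29/BB/dds_id.out.20190629'
WITH (DATA_SOURCE = 'BlobStorage',
      FIRSTROW = 46,
      ROWTERMINATOR = '0x0A');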
Another way you can try is Azure Data Factory, which can uncompress the .gz file for you in the dataset settings.
See this tutorial: Copy data from Azure Blob storage to a SQL database by using Azure Data Factory.
Hope this helps.
I want to know how to insert values into a SQL Server database from a Flat File Source in SSIS using a SQL command. I've done the insert using Table view; now I have to do it using a SQL command.
Well, you need a suitable query to put into an Execute SQL Task in SSIS.
You can get help with such queries at the site below:
----here is the link ----
You can also parameterize the query in the Execute SQL Task.
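With an OLE DB connection, for example, the Execute SQL Task uses ? markers that you bind to package variables on the Parameter Mapping page. A minimal sketch with made-up table and column names:

-- Sketch: parameterized INSERT for an Execute SQL Task (OLE DB connection).
-- Each ? is mapped to an SSIS variable on the task's Parameter Mapping page.
INSERT INTO dbo.TargetTable (CustomerName, OrderDate)
VALUES (?, ?);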
BCP
This is one of the most widely used options. One reason is that it has been around for a while, so DBAs have become quite familiar with the command. It lets you both import and export data, but is primarily used for text data formats. It is generally run from a Windows command prompt, but it can also be called from a stored procedure by using xp_cmdshell, or from an SSIS package.
Here is a simple command for importing data from file C:\ImportData.txt into table dbo.ImportTest.
bcp dbo.ImportTest in "C:\ImportData.txt" -c -T -SserverName\instanceName
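And a rough illustration of the stored procedure route mentioned above (a sketch only; xp_cmdshell is disabled by default and has to be enabled by an administrator):

-- Sketch: run the same bcp import from T-SQL via xp_cmdshell.
EXEC xp_cmdshell 'bcp dbo.ImportTest in "C:\ImportData.txt" -c -T -SserverName\instanceName';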
BULK INSERT
BULK INSERT is a T-SQL command, so it lets you import data directly from within SQL Server. This example imports data from the file C:\ImportData.txt into the table dbo.ImportTest.
BULK INSERT dbo.ImportTest
FROM 'C:\ImportData.txt'
WITH ( FIELDTERMINATOR =',', FIRSTROW = 2 )
Forgot to say that you can also use a SELECT query, like the samples above, in an OLE DB Source using the SQL Command data access mode.
I have an SSIS package. Inside it are two Execute Package Task elements; the two packages are linked to run in order.
Package One - downloads a CSV file from an external site (SalesForce.com):
E:\SalesForce\MSSQL_Import\PricebookEntry.csv
E: is a mapped network drive.
Package Two - an Execute SQL Task that BULK INSERTs the contents of the CSV file into an empty table:
BULK INSERT SalesForce.dbo.PriceBookEntry
FROM 'E:\SalesForce\MSSQL_Import\PricebookEntry.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n',
CODEPAGE = 'ACP',
ERRORFILE = 'E:\SalesForce\Data_Loader\Logs\Error\Bulk_Insert\BI_PriceBookEntry.csv',
TABLOCK
)
I am getting the following error when the second SSIS package is triggered:
[Execute SQL Task] Error: Executing the query "BULK INSERT
SalesForce.dbo.PriceBookEntry
FROM..." failed with the following error: "Cannot bulk load. The file "E:\SalesForce\MSSQL_Import\PricebookEntry.csv" does not exist.".
Possible failure reasons: Problems with the query, "ResultSet"
property not set correctly, parameters not set correctly, or
connection not established correctly.
I have checked the location of the file. The path and spelling are good. The file exists while the script runs.
I have tried saving the file on the C: drive just to check. The BULK INSERT package still fails.
I have run the BULK INSERT statement in Management Studio on its own and it works fine.
If I chain the packages together inside another package, the BULK INSERT fails. If I run the SSIS packages separately in order, they work fine.
I have other SSIS packages that do the same thing and work fine.
I've been Googling high and low for a possible solution and still nothing. Anyone have any ideas on what might be the issue?
I found the problem.
The quick answer is that the CSV file download has not completed before the BULK INSERT command fires in SSIS, so SQL Server can't get access to the file.
Previous jobs were only working with very small files (2,000 or so lines). This job works with a lot more (~1 million lines).
The solution will involve holding off the BULK INSERT until the file has finished downloading.
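One rough way to do that in T-SQL, just before the BULK INSERT, is to poll with the undocumented xp_fileexist procedure. A sketch only; note it checks existence, not completeness, so a download-then-rename convention in Package One (or comparing file sizes between polls) would be more robust:

-- Sketch: wait until the file shows up before bulk inserting.
DECLARE @exists int = 0;
WHILE @exists = 0
BEGIN
    EXEC master.dbo.xp_fileexist 'E:\SalesForce\MSSQL_Import\PricebookEntry.csv', @exists OUTPUT;
    IF @exists = 0 WAITFOR DELAY '00:00:10';  -- check again in 10 seconds
END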
Today, when I use BULK INSERT with T-SQL on Microsoft SQL Server 2008 to load data from a local drive, I use the following query:
BULK INSERT tmp_table
FROM 'c:\data\x.csv'
WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0a' );
This works fine. But now I want to be able to read .csv data files from a folder on an FTP server.
As with every other FTP server, I need a username and password to log in first, and then I somehow need to fetch the files. Is this possible with T-SQL?
I do know that with C# this would be a piece of cake for me, but I want to learn how to do it with T-SQL.
I will also need to know how to dynamically get the names of the files in a given folder on the FTP server, but since I'm taking this one step at a time, you don't need to answer that right away.
When the files are on a local drive, I am able to use xp_dirtree to get the names of all the files in a folder.
An excellent guide can be found here: http://www.patrickkeisler.com/2012/11/how-to-use-xpdirtree-to-list-all-files.html
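For reference, a minimal sketch of that local-drive approach (xp_dirtree is undocumented; the arguments are the path, the recursion depth, and a flag that includes files as well as folders):

-- Sketch: list files and folders one level deep under C:\data\.
EXEC master.sys.xp_dirtree 'C:\data\', 1, 1;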
I have a bunch of Excel files that I want to import into a SQL Server database using a script.
I have imported one of the files (BasicCompanyData-2015-02-01-part1_5.csv) using the import wizard to set up the table correctly, and then run
DELETE FROM dbo.temp
But when I run the following query I get the error:
Bulk load data conversion error (type mismatch or invalid character for the specified codepage)
My SQL code:
BULK INSERT [UK DATABASE 2014].dbo.temp
FROM 'D:\The Data Warehouse.co.uk\BasicCompanyData-2015-02-01-part1_5.csv'
WITH
(
FIELDTERMINATOR=',',
ROWTERMINATOR='\n',
FIRSTROW=2
)
Any ideas for what I am doing wrong?
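Not from this thread, but one common way to track down a conversion error like this is to bulk load into an all-VARCHAR staging table first, so nothing is converted during the load, and then probe the columns with TRY_CONVERT to find the offending rows. A sketch with made-up column names, which would need to mirror the real CSV layout:

-- Sketch: stage everything as text, then find rows that fail conversion.
CREATE TABLE dbo.temp_staging (
    CompanyName varchar(500),
    IncorporationDate varchar(50)   -- made-up columns; mirror the actual file
);

BULK INSERT dbo.temp_staging
FROM 'D:\The Data Warehouse.co.uk\BasicCompanyData-2015-02-01-part1_5.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Rows where a supposed date column won't convert are the likely culprits.
SELECT *
FROM dbo.temp_staging
WHERE TRY_CONVERT(date, IncorporationDate) IS NULL
  AND IncorporationDate IS NOT NULL;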