Currently, I'm using a BULK INSERT statement to read CSV files and add all rows to a SQL table:
BULK INSERT tablename
FROM 'D:\Import Files\file.csv'
WITH (
FIRSTROW = 2,
FIELDTERMINATOR = ',',
ROWTERMINATOR='0x0a');
But now I have a dynamic mapping for each file, stored in a table (file field name = database field name).
Mapping Table:
FileId  FileFieldName  DBFieldName
1       Order-Id       orderid
1       Order-Date     orderdate
2       Id             orderid
2       Orderedon      orderdate
I want to map the file field names to the database fields and import the rows into the SQL table.
How can I achieve this dynamic mapping with the BULK INSERT statement in SQL Server?
Write your dynamic map from the table to an XML file, then use this syntax for BULK INSERT:
BULK INSERT tablename
FROM 'D:\Import Files\file.csv'
WITH (FORMATFILE = 'D:\BCP\myFirstImport.xml');
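For example, for FileId 1 (Order-Id -> orderid, Order-Date -> orderdate) the format file could look like the sketch below. The MAX_LENGTH values and the SQLINT/SQLDATE column types are assumptions about your schema; adjust them to match:
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="," MAX_LENGTH="12"/>
    <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="\n" MAX_LENGTH="24"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="orderid" xsi:type="SQLINT"/>
    <COLUMN SOURCE="2" NAME="orderdate" xsi:type="SQLDATE"/>
  </ROW>
</BCPFORMAT>
Since the COLUMN NAME attributes are exactly your DBFieldName values in file-field order, you can generate one such file per FileId from the mapping table before running the BULK INSERT (keep FIRSTROW = 2 in the WITH clause to skip the header row).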
I have a .txt file which is 6.00 GB. It is tab-delimited, so when I load it into SQL Server the column delimiter is a tab.
I need to load that .txt file into the database, but I don't need all the rows from the 6.00 GB file. I need to be able to use a condition like
select *
into <my table>
where column5 in ('ab', 'cd')
but this is a text file, and I am not able to load it into the database with that condition.
Can anyone help me with this?
Have you tried the BULK INSERT command? Take a look at this solution:
--Create temporary table
CREATE TABLE #BulkTemporary
(
Id int,
Value varchar(10)
)
--BULK INSERT has no WHERE clause
BULK INSERT #BulkTemporary FROM 'D:\Temp\File.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')
--Filter results
SELECT * INTO MyTable FROM #BulkTemporary WHERE Value IN ('Row2', 'Row3')
--Drop temporary table
DROP TABLE #BulkTemporary
Hope this helps.
Just do a BULK INSERT into a staging table and from there move the data you actually want into a production table. The WHERE clause is for doing something based on a specific condition inside SQL Server, not for loading data into SQL Server.
I am using SQL Server 2012. I have the same table, with 15 columns, on two servers; the first table has fewer records than the second, and I want to copy the missing records from the second table to the first. Because the tables are on different SQL Servers, I have created a linked server. I was thinking about IF EXISTS, but I want to copy many records at once. I want to create a script for this task, which I will run frequently via Task Scheduler. (Note: I don't want to use replication.)
You can use the EXCEPT operator. For example:
-- Set up two sample tables
IF OBJECT_ID('tempdb..#aa') IS NOT NULL DROP TABLE #aa;
CREATE TABLE #aa (Id INT);
INSERT INTO #aa SELECT 1;

IF OBJECT_ID('tempdb..#bb') IS NOT NULL DROP TABLE #bb;
CREATE TABLE #bb (Id INT);
INSERT INTO #bb
SELECT 1
UNION ALL
SELECT 2;

-- Insert only the rows from #bb that are not already in #aa
INSERT INTO #aa
SELECT * FROM #bb
EXCEPT
SELECT * FROM #aa;
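Applied across your linked server, the same pattern might look like the sketch below; [LinkedServerName], MyDb, dbo.MyTable, and the three-column list are placeholders for your actual names (list all 15 columns explicitly):
-- Run on the first server: insert only the rows that exist
-- on the second (remote) server but not in the local table
INSERT INTO dbo.MyTable (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM [LinkedServerName].MyDb.dbo.MyTable
EXCEPT
SELECT Col1, Col2, Col3
FROM dbo.MyTable;
You can save this as a .sql file and schedule it with Task Scheduler via sqlcmd.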
My SQL Server 2012 table is CUST_TABLE. It already has a lot of customer records (more than 10,000).
I have a CSV whose first column is the customer number (my primary key) and whose second column is the email address. The first row of this CSV contains the column headings custnum and email, and there are 125 data rows. I am using SSMS and want to update just those 125 customer records with their new emails.
The only solution I found was to write UPDATE statements to change the data. Is there an easier way to do this, like using the Import Data function (right-click the database, then hover over Tasks)? Thank you.
Read the csv into a temp table, then update your table using the temp table.
For example:
USE yourdb;
GO
IF OBJECT_ID('tempdb.dbo.#tmp', 'U') IS NOT NULL
DROP TABLE #tmp;
GO
CREATE TABLE #tmp (
t_cust_nr NVARCHAR(MAX),
t_email NVARCHAR(MAX)
)
SET NOCOUNT ON;
-- Read the csv, skip the first row
BULK INSERT #tmp
FROM 'C:\path\to\your.csv'
WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR ='\n');
-- Trim whitespace
UPDATE #tmp
SET t_cust_nr = LTRIM(RTRIM(t_cust_nr)),
t_email = LTRIM(RTRIM(t_email));
-- Add your update statement here...
-- You also might have to cast the t_cust_nr to a diff. data type if needed.
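-- A possible update, assuming CUST_TABLE's columns are named custnum
-- and email (adjust the names and the cast to your actual schema):
UPDATE c
SET c.email = t.t_email
FROM dbo.CUST_TABLE AS c
INNER JOIN #tmp AS t
    ON c.custnum = TRY_CAST(t.t_cust_nr AS INT);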
SET NOCOUNT OFF;
DROP TABLE #tmp;
I want to insert data into Database A, table A, and Database B, table B, at the same time, but with only some columns going from table A to table B. Example from this picture:
http://pic.free.in.th/id/d56133ad2238308e979aa3dbea94436e
E.g. table A has the columns ID, Name, Address, Tel; I want to insert just ID and Name into table B (insert data into table B automatically whenever data is inserted into table A).
If you have any idea, please let me know.
You can do an INSERT with a SELECT:
INSERT INTO DataBaseB.dbo.TableB (ID, Name)
SELECT ID, Name from DatabaseA.dbo.TableA
See here for more details:
http://www.w3schools.com/sql/sql_insert_into_select.asp
This assumes both databases are on the same server; if they're not, you could always export/import the data.
Alternatively, an AFTER INSERT trigger on table A can copy the new rows into table B automatically. Select from the inserted pseudo-table so that only the newly added rows are copied, rather than rescanning the whole source table:
ALTER TRIGGER [dbo].[ticky2]
ON [dbo].[data]
AFTER INSERT
AS
BEGIN
    -- inserted holds just the rows added by the triggering statement
    INSERT INTO [testDatabase].[dbo].[pool] ([ID], [Name], [salary], [date])
    SELECT i.ID, i.Name, i.salary, i.date
    FROM inserted AS i
    WHERE i.ID NOT IN (SELECT ID FROM [testDatabase].[dbo].[pool]);
END