Bulk insert with many files inside folder - sql-server

I want to read XML files with SQL Server. Below is how I do it for a single file:
DECLARE @testxml TABLE (IntCol int, XmlCol xml);
INSERT INTO @testxml(XmlCol)
SELECT * FROM OPENROWSET(
BULK 'C:\XMLs\32056963_0001515351.xml',
SINGLE_BLOB) AS x;
SELECT * FROM @testxml
All is ok. But I need to read many files inside a folder, so I'm using:
EXEC master.sys.xp_dirtree 'C:\XMLs\',0,1;
But how can I do a dynamic bulk insert in order to load all the XML files in the folder into @testxml?

I don't know of a way to do a bulk insert of all the files at once. I would suggest executing your import query once per file, using dynamic queries. But in order to be able to fetch the data from the main query, you should insert the data into a temporary table, because a table variable declared in the outer batch is not visible inside the dynamic query.
-- Get the file names
CREATE TABLE #files (
subdirectory NVARCHAR(255),
depth INT,
[file] BIT
)
INSERT INTO #files
EXEC master.sys.xp_dirtree 'C:\XMLs\',0,1;
-- Iterate through the XML files
DECLARE @filesCursor CURSOR;
SET @filesCursor = CURSOR FOR
SELECT subdirectory
FROM #files
WHERE [file]=1 AND LEN(subdirectory)>4 AND LOWER(RIGHT(subdirectory,4))='.xml'
OPEN @filesCursor;
DECLARE @fileName NVARCHAR(255), @query NVARCHAR(MAX);
FETCH NEXT FROM @filesCursor INTO @fileName;
-- Temporary table to store the data
CREATE TABLE #testxml (IntCol int, XmlCol xml);
WHILE @@FETCH_STATUS = 0
BEGIN
-- Build and execute the query for each file
SET @query = 'INSERT INTO #testxml(XmlCol) SELECT * FROM OPENROWSET(BULK ''C:\XMLs\' + @fileName + ''',SINGLE_BLOB) AS x';
EXECUTE sp_executesql @query;
FETCH NEXT FROM @filesCursor INTO @fileName;
END
-- Closing and deallocating cursor
CLOSE @filesCursor;
DEALLOCATE @filesCursor;
-- Get the data from the temp table into your table variable.
-- If it is not necessary to use a table variable, you could read
-- the data directly from the temp table
DECLARE @testxml TABLE (IntCol int, XmlCol xml);
INSERT INTO @testxml
SELECT * FROM #testxml;
-- Deleting temp tables, as they won't be used anymore
DROP TABLE #testxml;
DROP TABLE #files;
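As an alternative to the cursor, the same per-file INSERT statements can be concatenated into one dynamic SQL string and executed in a single call. This is only a sketch of that variation, assuming the same #files and #testxml temp tables as above (run it before they are dropped):
-- Build one INSERT ... OPENROWSET statement per XML file listed in #files
DECLARE @allQueries NVARCHAR(MAX) = N'';
SELECT @allQueries = @allQueries
    + N'INSERT INTO #testxml(XmlCol) SELECT BulkColumn FROM OPENROWSET(BULK ''C:\XMLs\'
    + subdirectory + N''', SINGLE_BLOB) AS x;' + NCHAR(10)
FROM #files
WHERE [file] = 1 AND LOWER(RIGHT(subdirectory, 4)) = '.xml';
-- Execute all the generated INSERT statements at once
EXEC sp_executesql @allQueries;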

Related

Dynamic bulk insert of multiple CSV files from different location folders

I have multiple CSV files in different folders. I want to do a dynamic bulk insert in SQL Server that loads all of them into a single table.
I did it for a single CSV file. Can someone help me out?
Here's something to get you started. You can read up on xp_dirtree and cursors to see how they work. If your files are spread across different parent folders, or different drives, you'll need an additional cursor to go get them...
---------------------------------------------------------------------------------------------------------------
--Set some variables
---------------------------------------------------------------------------------------------------------------
DECLARE @fileLocation VARCHAR(128) = '\\server\e$\data\' --location of files (parent folder)
DECLARE @sql NVARCHAR(4000) --dynamic sql variable
DECLARE @fileName VARCHAR(128) --full file name variable if you want to use this
---------------------------------------------------------------------------------------------------------------
--Get a list of all the file names in the directory
---------------------------------------------------------------------------------------------------------------
IF OBJECT_ID('tempdb..#FileNames') IS NOT NULL DROP TABLE #FileNames
CREATE TABLE #FileNames (
id int IDENTITY(1,1)
,subdirectory nvarchar(512)
,depth int
,isfile bit)
INSERT #FileNames (subdirectory,depth,isfile)
EXEC xp_dirtree @fileLocation, 1, 1
--Here's all the files and folders. Note isFile field.
select * from #FileNames
---------------------------------------------------------------------------------------------------------------
--Create a cursor to fetch the file names
---------------------------------------------------------------------------------------------------------------
DECLARE c CURSOR FOR
select subdirectory from #FileNames where isfile = 1
OPEN c
FETCH NEXT FROM c INTO @fileName
---------------------------------------------------------------------------------------------------------------
--For each file, bulk insert to the proper view, update the proper table, update the log, etc...
---------------------------------------------------------------------------------------------------------------
WHILE @@FETCH_STATUS = 0
BEGIN
--do your bulk insert work
FETCH NEXT FROM c INTO @fileName
END
CLOSE c
DEALLOCATE c
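To flesh out the "do your bulk insert work" placeholder inside the loop, a minimal sketch could look like the following. It assumes a destination table named dbo.MyCsvTable (a placeholder name) whose columns match the CSV layout, and reuses the @sql, @fileLocation and @fileName variables declared above; the statement is built dynamically because BULK INSERT needs the file path as a literal:
-- Inside the loop, in place of "do your bulk insert work"
SET @sql = N'BULK INSERT dbo.MyCsvTable
FROM ''' + @fileLocation + @fileName + '''
WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');'
EXEC sp_executesql @sql;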

Use Temp Table to merge query results from different DBs

I need to extract data from different DBs into a single table. These DBs are all in the same Server and Instance and they have the same structure. One of the columns will be DB Name, the others come from the same table.
I could write a query that extracts these data with a table for each database, but I would like to merge all results into a single table.
I tried to use a temp table to collect the individual results, but the result is an empty table. It seems that the table #tmpTable is emptied after each query. I post my attempt here:
CREATE TABLE [dbo].#tmpTable ([DbName] VARCHAR(MAX), [Example] VARCHAR(MAX))
EXECUTE sp_MSForEachDB
'USE ?;
DECLARE @ExampleQuery AS NVARCHAR(MAX) =
''SELECT DB_NAME() AS [DbName], [Example]
INTO #tmpTable
FROM [tConfig]''
EXEC sp_executesql @ExampleQuery;'
SELECT * FROM #tmpTable
DROP TABLE #tmpTable
The actual query is more complex and it's using PIVOT and other commands, but I think this example is enough if someone knows how to get the wanted result.
CREATE TABLE [dbo].#tmpTable ([DbName] VARCHAR(MAX))
EXECUTE sp_MSForEachDB
'USE ?;
DECLARE @ExampleQuery AS NVARCHAR(MAX) =
''INSERT INTO #tmpTable SELECT DB_NAME() AS [DbName] ''
EXEC sp_executesql @ExampleQuery;'
SELECT * FROM #tmpTable
DROP TABLE #tmpTable
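A minimal sketch extending this to also carry the [Example] column from the original attempt. It assumes, as the question does, that each relevant database has a [tConfig] table with an [Example] column; the OBJECT_ID check simply skips databases that don't have it:
CREATE TABLE #tmpTable ([DbName] VARCHAR(MAX), [Example] VARCHAR(MAX))
EXECUTE sp_MSForEachDB
'USE ?;
IF OBJECT_ID(''tConfig'') IS NOT NULL
    INSERT INTO #tmpTable ([DbName], [Example])
    SELECT DB_NAME(), [Example] FROM [tConfig];'
SELECT * FROM #tmpTable
DROP TABLE #tmpTable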

Why doesn't this alter after insert statement work?

I have a stored procedure with dynamic SQL embedded in it, as below:
delete from #temp_table
begin tran
set @sql = 'select * into #temp_table from sometable'
exec (@sql)
commit tran
begin
set @sql = 'alter table #temp_table add column1 float'
exec(@sql)
end
update #temp_table
set column1 = column1*100
select *
into Primary_Table
from #temp_table
However, I noticed that all the statements work but the ALTER does not. When I run the procedure, I get the error message: "Invalid Column name column1"
What am I doing wrong here?
EDIT: Realized I didn't mention that the first insert is a dynamic sql as well. Updated it.
Alternate approach tried but throws same error:
delete from #temp_table
begin tran
set @sql = 'select * into #temp_table from sometable'
exec (@sql)
commit tran
alter table #temp_table add column1 float
update #temp_table set column1 = column1*100
Local temporary tables exhibit something like dynamic scope. When you create a local temporary table inside a call to EXEC, it goes out of scope (and out of existence) on the return from EXEC.
EXEC (N'create table #x (c int)')
GO
SELECT * FROM #x
Msg 208, Level 16, State 0, Line 4
Invalid object name '#x'.
The SELECT is parsed after the dynamic SQL that creates #x has run, but #x is no longer there because it was dropped on exit from EXEC.
Update
Depending on the situation there are different ways to work around the issue.
Put everything into the same string:
DECLARE @Sql NVARCHAR(MAX) = N'SELECT 1 AS source INTO #table_name;
ALTER TABLE #table_name ADD target FLOAT;
UPDATE #table_name SET target = 100 * source;';
EXEC (@Sql);
Create the table ahead of the dynamic sql that populates it.
CREATE TABLE #table_name (source INT);
EXEC (N'insert into #table_name (source) select 1;');
ALTER TABLE #table_name ADD target FLOAT;
UPDATE #table_name SET target = 100 * source;
In this option, the ALTER TABLE statement can be removed by adding the additional column to the CREATE TABLE statement. Note also that the ALTER TABLE and UPDATE statements could be in separate invocations of dynamic SQL, if that were beneficial in your context.
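For example, a small sketch of that last point, with the ALTER and UPDATE each in their own dynamic SQL invocation. This works because #table_name is created in the outer batch, so every nested batch sees the same table:
CREATE TABLE #table_name (source INT);
EXEC (N'insert into #table_name (source) select 1;');
EXEC (N'alter table #table_name add target float;');
EXEC (N'update #table_name set target = 100 * source;');
SELECT * FROM #table_name;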
1) It should be ALTER TABLE #temp... Not ALTER #temp.
2) Even if #1 weren't an issue, you're adding column1 as a NULLable column with no default value and, in the next statement, setting its value to itself * 100...
NULL * 100 = NULL
3) Why are you using dynamic sql to alter the #temp table? It can just as easily be done with a regular ALTER TABLE script... or, better yet, can be included in the original table definition.
This is because the #temp_table reference in the outer batch is a different temp table than the one created in dynamic SQL. Consider:
use tempdb
drop table if exists sometable
drop table if exists #temp_table
go
create table sometable(id int, a int)
create table #temp_table(id int, b int)
exec( 'select * into #temp_table from sometable; select * from #temp_table;' )
select * from #temp_table
Outputs
id a
----------- -----------
(0 rows affected)
id b
----------- -----------
(0 rows affected)
A temp table created in a nested batch is scoped to the nested batch and automatically dropped afterwards. A "nested batch" is either a dynamic SQL query or a stored procedure. This behavior is explained in the CREATE TABLE documentation, but it only mentions stored procedures; dynamic SQL behaves the same way.
If you create the temp table in a top-level batch, you can access it in dynamic SQL; you just can't create a new temp table in dynamic SQL and see it in the outer batch or in subsequent same-level dynamic SQL. So try to use INSERT INTO instead of SELECT INTO.
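Applied to the procedure in the question, that means creating #temp_table in the outer batch and switching the dynamic SELECT ... INTO to an INSERT ... SELECT. A sketch, reusing sometable and Primary_Table from the question:
DECLARE @sql NVARCHAR(MAX);
-- Create the temp table in the outer batch so the dynamic SQL and the
-- outer statements all reference the same #temp_table
SELECT * INTO #temp_table FROM sometable WHERE 1 = 0;
SET @sql = N'insert into #temp_table select * from sometable';
EXEC (@sql);
ALTER TABLE #temp_table ADD column1 FLOAT;
-- column1 is NULL at this point; populate it from a real source column first,
-- otherwise NULL * 100 simply stays NULL
UPDATE #temp_table SET column1 = column1 * 100;
SELECT * INTO Primary_Table FROM #temp_table;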

SQL Server: Importing and archiving weekly data

Any ideas/suggestions appreciated....
I've been asked to come up with a simple way to import new data we receive from an outside vendor (text files). We get several text files and each needs to be imported into its own table. Some tables have to have the current/existing data moved into a table called TABLENAME_Previous (to work with various existing reports), then have the current table emptied out and the new data imported into it. Also, any data now in the "previous" table has to be appended to an archive table.
Here's an example:
customer.txt comes in from vendor....
First we move the contents of customers_previous to customers_arch
Next we move the contents of customers to customers_previous
Finally we import the new customers.txt file into the table customers
Has anyone ever written a SQL routine to do this, or knows where to find one, that wouldn't be too painful to modify?
Thanks
you may try something like this:
To copy your previous data to Archive
Insert into customers_arch select * from customers_previous
To Copy your Customer Data to Previous:
truncate table customers_previous;
insert into customers_previous select * from customers
Then, to load your text file, use BULK INSERT to load your customers table after clearing it.
truncate table customers;
bulk insert customers
from 'd:\yourfolder\customers.txt'
WITH
(
FIELDTERMINATOR =',',
ROWTERMINATOR ='\n'
);
UPDATE:
OK, Brian, to answer your other question: how to run it for the multiple files saved in your WeeklyTable.
Suppose your WeeklyTable is like this:
Declare @WeeklyTable TABLE(ID int Identity(1,1), [FileName] varchar(50))
insert into @WeeklyTable Values
('Customers'),('Orders'), ('Order_Details')
You can create a dynamic query to run your script for each file.
Declare @Template varchar(max)
Set @Template = '
-- Start of [[FILENAME]] --------------------
Insert into [FILENAME]_arch select * from [FILENAME]_previous
GO
truncate table [FILENAME]_previous;
insert into [FILENAME]_previous select * from [FILENAME]
GO
truncate table [FILENAME];
bulk insert [FILENAME]
from ''d:\yourfolder\[FILENAME].txt''
WITH
(
FIELDTERMINATOR ='','',
ROWTERMINATOR =''\n''
);
'
Declare @s varchar(max)
Declare @FileName varchar(50)
Declare @ID int =0
Select TOP 1 @ID=ID, @FileName=[FileName] From @WeeklyTable Where ID>@ID order by ID
While @@ROWCOUNT>0 Begin
Set @s = REPLACE(@Template, '[FILENAME]', @FileName)
Print @s
-- EXEC(@s) -- Uncomment to EXEC the script.
Select TOP 1 @ID=ID, @FileName=[FileName] From @WeeklyTable Where ID>@ID order by ID
End

INSERT INTO #TABLE EXEC #query with SQL Server 2000

Is it true that in SQL Server 2000 you cannot insert into a table variable using EXEC?
I tried this script and got an error message:
EXECUTE cannot be used as a source when inserting into a table variable.
declare @tmp TABLE (code varchar(50), mount money)
DECLARE @q nvarchar(4000)
SET @q = 'SELECT coa_code, amount FROM T_Ledger_detail'
INSERT INTO @tmp (code, mount)
EXEC sp_executesql (@q)
SELECT * from @tmp
If that's true, what should I do?
N.B. - this question and answer relate to the 2000 version of SQL Server. In later versions, the restriction on INSERT INTO @table_variable ... EXEC ... was lifted, so it does not apply to those later versions.
You'll have to switch to a temp table:
CREATE TABLE #tmp (code varchar(50), mount money)
DECLARE @q nvarchar(4000)
SET @q = 'SELECT coa_code, amount FROM T_Ledger_detail'
INSERT INTO #tmp (code, mount)
EXEC sp_executesql (@q)
SELECT * from #tmp
From the documentation:
A table variable behaves like a local variable. It has a well-defined scope, which is the function, stored procedure, or batch in which it is declared.
Within its scope, a table variable may be used like a regular table. It may be applied anywhere a table or table expression is used in SELECT, INSERT, UPDATE, and DELETE statements. However, table may not be used in the following statements:
INSERT INTO table_variable EXEC stored_procedure
SELECT select_list INTO table_variable statements.
The documentation is misleading.
I have the following code running in production
DECLARE @table TABLE (UserID varchar(100))
DECLARE @sql varchar(1000)
SET @sql = 'spSelUserIDList'
/* Will also work
SET @sql = 'SELECT UserID FROM UserTable'
*/
INSERT INTO @table
EXEC(@sql)
SELECT * FROM @table
DECLARE @q nvarchar(4000)
SET @q = 'DECLARE @tmp TABLE (code VARCHAR(50), mount MONEY)
INSERT INTO @tmp
(
code,
mount
)
SELECT coa_code,
amount
FROM T_Ledger_detail
SELECT *
FROM @tmp'
EXEC sp_executesql @q
Use this if you want to do everything inside the dynamic query.
