How to bulk insert with derived columns in SQL Server?

I'm a newbie to SQL Server!
I'm trying to perform a bulk insert into SQL Server. I have the following CSV file, named input.csv:
NO,Name,age,Reference,dateTime,Category
1,Stack#mail,23,Kiop,2017-03-02T12:23:00,D
2,OverEnd#Yahoo,22,CSK,2017-03-03T12:23:00,I
I have to move that CSV data into SQL Server using BULK INSERT, into the table schema below:
create table BulkInsertTemp
(
no int,
name nvarchar(50),
age int,
Ref nvarchar(30),
currentDatetime datetime,
Category nvarchar(40)
)
Now I need the data stored in SQL Server like this:
no  Name     age  Ref   currentDatetime       Category
-------------------------------------------------------
1   Stack    23   Kiop  2017-03-02 12:23:00   D
2   OverEnd  22   CSK   2017-03-03 12:23:00   I
I just tried the query below for another table, to move it into SQL Server.
create table bulkInsert(no varchar(50),name varchar(50));
BULK INSERT bulkInsert
FROM 'C:\MyInput\BulkInsert\BulkInsertData.txt'
WITH
(FIRSTROW = 1,
ROWTERMINATOR = '\n',
FIELDTERMINATOR = ',',
ROWS_PER_BATCH = 10000)
My query works when there is no need to modify the data.
But for input.csv I have to change column values, for example if the name is "Stack#mail" it should be stored as "Stack" in SQL.
I am new to the bulk insert option, so I don't know how to derive columns from existing ones.
Can anyone please guide me on how to solve this requirement?

I would recommend building an SSIS package to do this. If you don't know how, or don't have time, you could run the SQL Server Import and Export Wizard, which will actually create an SSIS package for you behind the scenes.
Hope that gets you going in the right direction.
Noel
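If you would rather stay in plain T-SQL instead of SSIS, another common pattern is to bulk load the raw file into a staging table and then derive the final columns with an INSERT ... SELECT. This is only a sketch based on the sample data above; the staging table name and the rule of stripping everything from the '#' onwards are assumptions:
-- Staging table (the name BulkInsertStaging is made up for this sketch).
CREATE TABLE BulkInsertStaging
(
    no              INT,
    name            NVARCHAR(50),
    age             INT,
    Ref             NVARCHAR(30),
    currentDatetime NVARCHAR(30),   -- load the raw ISO 8601 text, convert later
    Category        NVARCHAR(40)
);

-- Load the CSV as-is; FIRSTROW = 2 skips the header row.
BULK INSERT BulkInsertStaging
FROM 'C:\MyInput\BulkInsert\input.csv'
WITH (FIRSTROW = 2,
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n');

-- Derive the final columns: strip the '#' suffix from name and convert the datetime text.
INSERT INTO BulkInsertTemp (no, name, age, Ref, currentDatetime, Category)
SELECT no,
       CASE WHEN CHARINDEX('#', name) > 0
            THEN LEFT(name, CHARINDEX('#', name) - 1)
            ELSE name
       END,
       age,
       Ref,
       CAST(currentDatetime AS DATETIME),
       Category
FROM BulkInsertStaging;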

Related

Inserting data from excel sheet into sql temp table

I have created the following temp table in SQL Server Management Studio:
CREATE TABLE ##LoginMap
(
ObjectId NVARCHAR(50),
UserPrincipleName NVARCHAR(500),
Username NVARCHAR(250),
Email NVARCHAR(500),
Name NVARCHAR(250)
)
These are the same names as the column headers in the Excel sheet I am trying to get the data from.
The file is called Test Loadsheet and the sheet name is AgilityExport_04Aug2022_164839.
I am trying to insert the data into the temp table like so:
INSERT INTO ##LoginMap
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0; Database=C:\temp\Test LoadSheet.xlsx', [AgilityExport_04Aug2022_164839]);
GO
However, I am getting the error:
The OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)" does not contain the table "AgilityExport_04Aug2022_164839". The table either does not exist or the current user does not have permissions on that table.
Where have I gone wrong with this? And what do I need to do in order to successfully get the data from each column into my table?
You have the file name as Test Loadsheet in one spot, but in your query you have it as Test LoadSheet.xlsx. Check whether that mismatch is what is holding it up.
Here is a link on importing data from Excel to SQL Server, if you are interested:
https://learn.microsoft.com/en-us/sql/relational-databases/import-export/import-data-from-excel-to-sql?view=sql-server-ver16#openrowset
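Another thing worth checking: with the ACE provider, a worksheet is usually referenced with a trailing $ (e.g. [AgilityExport_04Aug2022_164839$]). Here is a minimal sketch of that variant, assuming the file path is correct and ad hoc distributed queries are enabled; it is not a verified fix for your setup:
-- Possible variant: reference the worksheet with a trailing $ and pass a query string.
INSERT INTO ##LoginMap
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\temp\Test LoadSheet.xlsx;HDR=YES',
                'SELECT * FROM [AgilityExport_04Aug2022_164839$]');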
I went about this a different way. In the Excel sheet I am using, I made a formula like so:
="INSERT INTO ##LoginMap(ObjectId, DisplayName, AzureEmail, AzureUsername) VALUES('"&A2&"', '"&B2&"', '"&C2&"', '"&D2&"');"
I then repeated this for each row. That gave me an INSERT statement for each row, which I could simply copy and paste into SSMS and run.

Import text file data into SQL Server database

I have a text file in the format below.
I want to import it into a SQL Server database by splitting it into several columns:
Terminal, NetSales, NetAjustment, CancelsCnt, CancelAmount,
CashesCnt, CashesAmount, ClaimsCnt, ClaimsAmount, SalesCommission,
CashCommission, NetDue
I have tried to insert the text file into SQL Server using SSIS, but it inserts everything into one column instead of splitting it. I then used SQL scripting to split it into several columns, but that is not working.
I'm having some difficulty splitting the column from the text file.
Any ideas or help on how I can capture the column data in a proper format?
I would suggest using the SSIS Bulk Insert Task:
Bulk Insert Task in SSIS
It has identical functionality to the T-SQL BULK INSERT statement.
It allows you to specify where the real first row starts via its FIRSTROW parameter.
Here is a conceptual example.
SQL
CREATE TABLE dbo.tbl (
Terminal VARCHAR(20),
NetSales VARCHAR(30),
NetAjustment VARCHAR(100),
CancelsCnt INT
...
);
BULK INSERT dbo.tbl
FROM 'e:\Temp\inputFile.csv'
WITH (FORMAT='CSV'
, DATAFILETYPE = 'char' -- { 'char' | 'native' | 'widechar' | 'widenative' }
, FIELDTERMINATOR = '\t' -- for a TAB
, ROWTERMINATOR = '\n'
, FIRSTROW = 8
, CODEPAGE = '65001');
-- test
SELECT * FROM dbo.tbl;

How to speed up tables transfer between Access and SQL Server using VBA?

I am trying to move tables from Access to SQL Server programmatically.
I have some limitations in the system permissions, i.e. I cannot use OPENDATASOURCE or OPENROWSET.
What I want to achieve is to transfer some tables from Access to SQL Server and then work on those tables through VBA (Excel)/Python and T-SQL.
The problem is the time it takes to move the tables.
My current process is:
I work with VBA macros, importing data from Excel and making some transformations in Access, then importing into SQL Server
destroy the table on the server: "DROP TABLE"
re-import the table with DoCmd.TransferDatabase
What I have noticed is that the operation seems to be done in batches of rows rather than all at once. It is taking a minute and a half per 1,000 rows; the same operation in Access would have taken a few seconds.
I understand that SQL Server imports in batches of 10 rows, probably to allow more access to the data: Microsoft details
But in the above process I just want to copy the table from Access to SQL Server as fast as possible, since then I would avoid cross-platform links and perform operations only on SQL Server.
What would be the fastest way to achieve this goal?
Why are functions like OPENDATASOURCE or OPENROWSET blocked? Do you work in a bank?
I can't say for sure which solution is the absolute fastest, but you may want to consider exporting all Access tables as separate CSV files (or Excel files), and then running a small script to load each of those files into SQL Server.
Here is some VBA code that saves separate tables as separate files.
Dim obj As AccessObject, dbs As Object
Set dbs = Application.CurrentData
For Each obj In dbs.AllTables
    ' Skip the Access system tables (their names start with "MSys")
    If Left(obj.Name, 4) <> "MSys" Then
        ' Export each user table as a CSV file and as an Excel workbook (True = include field names)
        DoCmd.TransferText acExportDelim, , obj.Name, obj.Name & ".csv", True
        DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, obj.Name, obj.Name & ".xls", True
    End If
Next obj
Now you can very easily, and very quickly, load CSV files into SQL Server using BULK INSERT.
Create a test table:
USE TestData
GO
CREATE TABLE CSVTest
(ID INT,
FirstName VARCHAR(40),
LastName VARCHAR(40),
BirthDate SMALLDATETIME)
GO
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
--Check the content of the table.
SELECT *
FROM CSVTest
GO
--Drop the table to clean up database.
DROP TABLE CSVTest
GO
https://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
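If you want to load every exported CSV in one go, a small dynamic-SQL loop is one option. This is only a sketch: the table names, the C:\Export\ folder and the assumption that matching SQL Server tables already exist are all hypothetical.
-- Hypothetical loop: BULK INSERT each exported CSV into a same-named, pre-created table.
DECLARE @tables TABLE (name sysname);
INSERT INTO @tables (name) VALUES (N'Customers'), (N'Orders');  -- assumed table names

DECLARE @name sysname, @sql nvarchar(max);
DECLARE file_cursor CURSOR LOCAL FAST_FORWARD FOR SELECT name FROM @tables;
OPEN file_cursor;
FETCH NEXT FROM file_cursor INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- FIRSTROW = 2 because the VBA export above writes a header row (HasFieldNames = True).
    SET @sql = N'BULK INSERT ' + QUOTENAME(@name) +
               N' FROM ''C:\Export\' + @name + N'.csv''' +
               N' WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'');';
    EXEC sys.sp_executesql @sql;
    FETCH NEXT FROM file_cursor INTO @name;
END
CLOSE file_cursor;
DEALLOCATE file_cursor;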
Also, you may want to consider one of these options.
https://www.online-tech-tips.com/ms-office-tips/ms-access-to-sql-database/
https://support.office.com/en-us/article/move-access-data-to-a-sql-server-database-by-using-the-upsizing-wizard-5d74c0df-c8cd-4867-8d07-e6e759d72924

How to import data from CSV into a SQL Server 2008 table?

I have a file which is full of millions of records, and it looks like below:
20 Apr 2016 21:50:01,022 - username,,ResetGUI,Account,finished,8182819710127A590BAF3E5DFD9AE8B0.tomcat32,7
20 Apr 2016 21:50:01,516 - username,12345678,loadaccount,AccountAdmin,Starts,40A5D891F145C785CD03596AAD07209F.tomcat32,
I want to automate importing the data into a table.
I am not sure how to do this.
Please advise!
If it is a one-time data load, you can use the SQL Server Import Export Wizard.
Right-click your DB in Management Studio -> Tasks -> Import Data
The wizard will guide you through selecting a database to import to, and a data source (other DB or flat file).
Make sure you create the csvtest table first. Make all columns varchar(200) first, to test the import.
If you can give me the column names I'll construct a CREATE TABLE script for you. I just need to know your table name, column names and data types (from the source).
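As a starting point, here is a sketch of such an all-varchar(200) table. The column breakdown is only a guess from the two sample rows; note that the timestamp itself contains a comma before the milliseconds ("21:50:01,022"), so with FIELDTERMINATOR = ',' it will land in two columns.
-- Hypothetical staging table: every column varchar(200), just to get a test import working.
CREATE TABLE CSVTest
(
    Col1 VARCHAR(200),  -- "20 Apr 2016 21:50:01"
    Col2 VARCHAR(200),  -- "022 - username" (milliseconds plus user, because of the extra comma)
    Col3 VARCHAR(200),  -- account number (empty in the first sample row)
    Col4 VARCHAR(200),  -- action, e.g. ResetGUI / loadaccount
    Col5 VARCHAR(200),  -- module, e.g. Account / AccountAdmin
    Col6 VARCHAR(200),  -- status, e.g. finished / Starts
    Col7 VARCHAR(200),  -- session id
    Col8 VARCHAR(200)   -- trailing number (may be empty)
);
GO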
If you plan to regularly import this file, the process can be:
If the table exists, truncate it.
If the table does not exist, create one.
Bulk load the CSV into the new table.
Anyway, here's how to import into an existing table from a CSV:
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO

Easiest way to import CSV into SQL Server 2005

I have several files of CSV data, about 5k each, that I need to import into SQL Server 2005.
This used to be simple with DTS. I tried to use SSIS previously and it seemed to be about 10x as much effort, and I eventually gave up.
What would be the simplest way to import the CSV data into SQL Server? Ideally, the tool or method would create the table as well; since there are about 150 fields in it, this would simplify things.
Sometimes with this data, there will be 1 or 2 rows that need to be manually modified because they are not importing correctly.
Try this:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
Here is a summary of the code from the link.
Create the table:
CREATE TABLE CSVTest
(ID INT,
FirstName VARCHAR(40),
LastName VARCHAR(40),
BirthDate SMALLDATETIME)
GO
Import the data:
BULK
INSERT CSVTest
FROM 'c:\csvtest.txt'
WITH
(
FIELDTERMINATOR = ','
,ROWTERMINATOR = '\n'
--,FIRSTROW = 2
--,MAXERRORS = 0
)
GO
Use the content of the table:
SELECT *
FROM CSVTest
GO