Importing .csv files into DataGrip? - sql-server

I just tested importing 1,321 records (one int column as a key, two text columns as nvarchar(100)) into a MS SQL Server database.
In Navicat this took me 7 seconds with the import wizard.
In DataGrip it took 280 ms per row (370 seconds total). The method I chose was:
1) Open the .csv file
2) Use the SQL Inserts data extractor option
3) Rename MY_TABLE to the appropriate name (this caused lag on my system with 16 GB of RAM)
4) Press Ctrl+A and then execute
I saw it inserting each row one at a time. This is a simple lookup table. After this I am planning on importing records from 2014 until the present (I am creating a new database), which consists of several million rows. Am I inserting .csv files incorrectly? What options do I have here?

Open the context menu of the data source you want to import into.
Choose Import from File....
Customize the table that will be created, check the preview, and press OK.
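For the multi-million-row loads mentioned in the question, a server-side bulk load is usually much faster than row-by-row INSERTs. A minimal sketch, assuming the file is accessible to the server at a hypothetical path C:\data\lookup.csv and the target table already exists:

```sql
BULK INSERT dbo.MyLookup
FROM 'C:\data\lookup.csv'
WITH (
    FIRSTROW = 2,           -- skip the header row
    FIELDTERMINATOR = ',',  -- column delimiter
    ROWTERMINATOR = '\n',   -- row delimiter
    TABLOCK                 -- allows minimally logged, faster loads
);
```

Table name, path, and delimiters here are placeholders; adjust them to your file. This runs entirely on the server, so the client tool's insert speed stops mattering.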

Related

Export table into multiple files using DTS, setup wizard doesn't remember past entries

I have a table that needs to be split up into, say, 30 files.
The table already has a FileNumber column, so it is easy to know which rows should be exported to which file.
Users
-UserID
-FileNumber
-...
So to get file #1 I would select:
SELECT * FROM USERS WHERE FileNumber = 1
and for file #2:
SELECT * FROM USERS WHERE FileNumber = 2
The problem is that the SQL Server Import and Export Wizard doesn't allow me to run it multiple times with the same settings. I have to go through the entire wizard, enter my credentials, query etc. for each file.
Is there a faster way to do this?
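One way to script this instead of rerunning the wizard is to generate one bcp command per file from T-SQL. A minimal sketch, assuming a hypothetical database MyDb, an export folder C:\export, and that xp_cmdshell is enabled on the server:

```sql
DECLARE @n INT = 1;
DECLARE @cmd VARCHAR(500);

WHILE @n <= 30
BEGIN
    -- Export the rows for file number @n with bcp in character mode.
    SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.Users WHERE FileNumber = '
             + CAST(@n AS VARCHAR(10))
             + '" queryout "C:\export\users_' + CAST(@n AS VARCHAR(10))
             + '.csv" -c -t, -T -S localhost';
    EXEC master..xp_cmdshell @cmd;
    SET @n = @n + 1;
END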

How to import only the NEW records during an Excel import from SQL Server

I am currently receiving a push-feed with data into a SQL Server Express database. The feed is updated randomly several times a minute (pushed into the database). I want to import the new updated records into Excel by making a call every three seconds from Excel to the SQL Server database.
I discovered that there are three methods to import the data from SQL Server into Excel:
https://www.excel-sql-server.com/excel-sql-server-import-export-using-vba.htm
import using QueryTable
import using ADO
import using Add-In
However, with these methods the complete table is imported from the database every time. I only want to import the records added since the last import, because the SQL Server database is becoming very large.
I have two questions:
1) How can I import only the newly added records?
2) Which of the above three import methods is most efficient in terms of speed and system load?
Thank you!
I do not think it is correct to say "import to Excel." It would be more correct to say "export from SQL Server to Excel."
There are a few ways to do this. I will rate them from easiest/fastest to most complicated/slowest:
Easiest one: add a new column to your SQL table as an "exported" flag and set it for all exported rows. Next time you can export only the newest records, where the value in that column is still empty. The column can have type BIT, TINYINT, or even DATETIME to indicate when the record was exported. The only problem with this method is the case where you are not allowed to add new columns to the table. Here is sample code to update the flag and extract the values in one statement:
UPDATE #tbl_TMP
SET Flag = 1
OUTPUT DELETED.A, DELETED.B, DELETED.C
WHERE Flag = 0
If you are not allowed to add a new column, you might use existing columns in the table. For instance, if you have a timestamp or incremental value in the table, you can record the latest extracted value; all records greater than the stored value are then the "newest records to extract".
If the table has no timestamp or incremental value, you can create another table that contains the list of key values already extracted. Then, by joining these tables, you can figure out which records are new.
If the table has no key value, you can store a HASH of all or several columns in another table and then join on that HASH to figure out the new ones.
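The timestamp/incremental-value approach can be sketched as follows, assuming a feed table dbo.Feed with an ever-increasing RowID column (all names here are hypothetical):

```sql
-- One-time setup: a watermark table remembering the last extracted ID.
CREATE TABLE dbo.ExtractWatermark (LastRowID BIGINT NOT NULL);
INSERT INTO dbo.ExtractWatermark (LastRowID) VALUES (0);

-- Each poll: pull only the rows added since the previous extract,
-- then advance the watermark.
DECLARE @last BIGINT = (SELECT LastRowID FROM dbo.ExtractWatermark);

SELECT RowID, A, B, C
FROM dbo.Feed
WHERE RowID > @last
ORDER BY RowID;

UPDATE dbo.ExtractWatermark
SET LastRowID = (SELECT MAX(RowID) FROM dbo.Feed);
```

Excel (via QueryTable or ADO) would run the SELECT every three seconds and only ever receive the new rows, which keeps both traffic and load small as the table grows.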
Create a linked server to the Excel data and put an INSERT trigger on the SQL Server table to insert all new rows into the spreadsheet.

Import multiple Excel files to create 1 table in SQL Server 2012

I have a little experience using SQL Server 2012.
All I know about importing an Excel file into a database is the following:
open SQL Server Management Studio
right-click on the "Tables" folder -> Tasks -> Import Data
set the data source to MS Excel.
It seems that only one Excel file is accepted at a time.
But I want to concatenate 6 Excel files (all with the same column layout) to form a single table in SQL Server.
P.S. No need to tell me to concatenate the Excel files manually by copy and paste, because each individual Excel file has about ~50,000 records.
Any ideas / solutions using SQL scripts or any other programming methods?
Thanks a lot.
There's a range of ways to do this, but I'll give you the simplest that comes to mind without requiring any deep technical knowledge on your part.
Given that you're using the wizard: first, on the 'Select Source Tables and Views' page, change the 'Destination' to the name of the table you've previously created.
Then, under the 'Edit Mappings' menu when selecting your sheets, ensure you have 'Append rows to the destination table' selected, rather than Create/Delete. Within reason, this will achieve your goal.
There is a risk in flat-file loading like this that SQL Server will create your table with unsuitable types (e.g. a column is really a text column, but contained only numbers in the first file, so it was created as an INT and won't accept the other files). You'll need to create the tables from scratch with the right structure, or work with the mappings page to fix this.
Another way, for the semi-technical type, as long as the data is equivalent between files, is to simply import into a series of separate tables:
Table1
Table2
Table3
...
Then do:
INSERT INTO Table1
SELECT * FROM Table2
UNION ALL
SELECT * FROM Table3
-- ... add further tables here
You can then use DROP TABLE to remove the extras.
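A script-only alternative is to append each workbook with OPENROWSET. A minimal sketch, assuming the ACE OLE DB provider is installed, ad hoc distributed queries are enabled, and the files live at hypothetical paths like C:\data\file1.xlsx:

```sql
-- Append one workbook's Sheet1 into the combined table;
-- repeat the statement for file2.xlsx ... file6.xlsx.
INSERT INTO dbo.CombinedTable
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\data\file1.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]');
```

This avoids the wizard entirely, so loading all six ~50,000-row files becomes a single script you can rerun.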

Import textfiles into linked SQL Server tables in Access

I have an Access (2010) database (front end) with linked SQL Server tables (back end), and I need to import text files into these tables. These text files are very large (some have more than 200,000 records, and approx. 20 fields).
The problem is that I can't import the text files directly into the SQL tables. Some files contain empty lines at the start, and other lines that I don't want to import into the tables.
So here's what I did in my Access database:
1) I created a link to the text files.
2) I also have a link to the SQL Server tables.
3a) I created an append query that copies the records from the linked text file to the linked SQL Server table.
3b) I created VBA code that opens both tables and copies the records from the text file into the SQL Server table, record by record. (I tried it in different ways: with DAO and with ADODB.)
[Steps 3a and 3b are two different ways I tried to import the data. I use one of them, not both. I prefer option 3b, because I can run a counter in the status bar showing how many records still need to be imported at any moment; I can see how far along it is.]
The problem is that it takes a lot of time to run... and I mean a LOT of time: 3 hours for a file with 70,000 records and 20 fields!
When I do the same with an Access table (from TXT to Access), it's much faster.
I have 15 tables like this (some with even more records), and I need to run these imports every day. The procedure runs automatically every night (between 20:00 and 6:00).
Is there an easier way to do this?
What is the best way to do this?
This feels like a good case for SSIS to me.
You can create a data flow from a flat file (as the data source) to a SQL DB (as the destination).
You can add some validation or selection steps in between.
You can easily find tutorials like this one online.
Alternatively, you can do what Gord mentioned: import the data from the text file into a local Access table and then use a single INSERT INTO LinkedTable SELECT * FROM LocalTable to copy the data to the SQL Server table.
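That append step can be sketched in Access SQL as follows, assuming a local staging table named tbl_Staging and a linked table named dbo_Target (both names hypothetical):

```sql
INSERT INTO dbo_Target (Field1, Field2, Field3)
SELECT Field1, Field2, Field3
FROM tbl_Staging
WHERE Field1 IS NOT NULL;
```

The WHERE clause stands in for whatever filter excludes the empty and unwanted lines; a single set-based append like this is typically far faster than a record-by-record DAO/ADODB loop.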

Importing data from MS Excel to SQL Server 2008 R2

I have a Microsoft Excel file with 25 columns, and I used the data import wizard in SQL Server 2008 R2 Management Studio to import it into an already existing table. But it only maps 14 columns and ignores the rest. Is there a column-number limit, or is there a problem with my data?
Can you give some example of your data?
In the meantime... when you're performing your import, I assume you are reaching the 'Select Source Tables and Views' page.
At this stage, you can edit your column mappings by pressing the Edit Mappings button.
Make sure that all your columns are selected there.
Also, on the review screen that follows, the wizard will warn of any mismatches in the data. Are you seeing any warnings there?
I'm not very familiar with SQL Server, but according to this: Excel file Import to Sql server 2008 - the column limit for Excel is 255, which is far beyond your numbers. The problem is most likely in your data, but without an example it's impossible to say what's wrong.
You can do it the other way around:
Step 1. Import the sheet into a new table.
Step 2. Use a bulk query to insert the rows into your preferred table.
I think this will solve your problem.
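Step 2 can be sketched like this, assuming the wizard created a staging table dbo.ImportedSheet and the existing target is dbo.TargetTable (both names hypothetical):

```sql
-- Copy from the freshly imported staging table into the existing
-- target, listing the columns explicitly so none are silently dropped.
INSERT INTO dbo.TargetTable (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM dbo.ImportedSheet;
```

Extend the column lists to all 25 columns; naming them explicitly also makes it obvious if any column failed to map during the import.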
Try to create the Excel columns and table structure in the same format and structure as the MS SQL table. Open the Excel file, select the rows of data (whole rows, by clicking on the row numbers) from the first row to the last, and press Ctrl+C.
Edit the MS SQL table in Management Studio and go to the last row, where you will find a blank row.
Select that row by clicking on the row button and press Ctrl+V.
Done: your Excel data will be inserted into the SQL table.
I tried this in MS SQL 2008.
