Bulk Insert from CSV file to MS SQL Database - sql-server
I have a working script that I use to BULK INSERT a CSV file.
The code is:
' OPEN DATABASE
Dim objConn, strQuery, objBULK, strConnection
Set objConn = Server.CreateObject("ADODB.Connection")
strConnection = "Driver={SQL Server Native Client 11.0};Server=DemoSrvCld;Trusted_Connection=no;UID=dcdcdcdcd;PWD=blabla;database=demotestTST;"
objConn.Open strConnection

' BUILD AND RUN THE BULK INSERT
Dim strAPPPATH
strAPPPATH = "C:\DEMO001Test.CSV"
strQuery = "BULK INSERT EFS_OlderStyle FROM '" & strAPPPATH & "' WITH (FIRSTROW=1, FIELDTERMINATOR=',', ROWTERMINATOR='\n')"
Set objBULK = objConn.Execute(strQuery)   ' Execute returns its own recordset

objConn.Close
Here is an example of the .CSV file:
Date,Time,Card Number,Driver Id,Driver Name,Unit No,Sub-Fleet,Hub Miles,Odo Miles,Trip No,Invoice,T/S Code,In Dir,T/S Name,T/S City,ST,Total Inv,Fee,PPU,Fuel_UOM,Fuel_CUR,RFuel_UOM,RFuel_CUR,Oil_CUR,Add_CUR,Cash Adv,Tax,Amt Billed,Svc Bill,Chain,Ambest,MPU
10/08/13,03:20,70113531460800693,,,2100,,,,,0454591156,546200,Y,PILOT QUARTZSITE 328,QUARTZSITE,AZ,742.30,1.00,3.749,149.000,558.60,49.00,183.70,0.00,0.00,0.00,0.00,743.30,S, ,N,0.0
10/08/13,07:03,70110535170800735,,,6210,,,,,343723,512227,Y,PETRO WHEELER RIDGE,LEBEC,CA,678.78,1.00,4.169,139.140,580.08,23.68,98.70,0.00,0.00,0.00,0.00,679.78,S, ,N,0.0
But the .CSV file I have now is different from the one above.
Here is an example of the current .CSV file:
"BRANCH","CARD","BILL_TYPE","AUTH_CODE","INVOICE","UNIT","EMP_NUM","TRIP","TRAILER","HUB/SPEED","VEH_LICENSE","DRIVER","DATE","TIME","CHAIN","IN_NETWORK","TS#","TS_NAME","TS_CITY","TS_STATE","PPG","NET_PPG","FUEL_GALS","FUEL_AMT","RFR_GALS","RFR_AMT","CASH","MISC","INV_TOTAL","FEE","DISC","INV_BALANCE",1.00,1.00,"E","004ACS","02812","365","-","-","0",0.00,"-","JOHN S ",11/4/2013,"16:18:49E","IC","N",3257.00,"IRVING HOULTON","HOULTON","ME",3.95,3.95,121.57,480.08,0.00,0.00,0.00,0.00,480.08,1.50,0.00,481.58
"BRANCH","CARD","BILL_TYPE","AUTH_CODE","INVOICE","UNIT","EMP_NUM","TRIP","TRAILER","HUB/SPEED","VEH_LICENSE","DRIVER","DATE","TIME","CHAIN","IN_NETWORK","TS#","TS_NAME","TS_CITY","TS_STATE","PPG","NET_PPG","FUEL_GALS","FUEL_AMT","RFR_GALS","RFR_AMT","CASH","MISC","INV_TOTAL","FEE","DISC","INV_BALANCE",1.00,2.00,"E","014ACI","976234","430","-","-","0",0.00,"-","STACY ",11/4/2013,"00:21:16E","F","Y",8796.00,"PILOT 405","TIFTON","GA",3.77,3.77,172.65,650.73,0.00,0.00,0.00,0.00,650.73,1.50,0.00,652.23
I have edited the MS SQL database fields to reflect the new .CSV fields, but the old and new .CSV files do not store the information in the same way. How do I fix this so that it works?
I was thinking to first remove all of the " characters, and then to remove all but one occurrence of
"BRANCH","CARD","BILL_TYPE","AUTH_CODE","INVOICE","UNIT","EMP_NUM","TRIP","TRAILER","HUB/SPEED","VEH_LICENSE","DRIVER","DATE","TIME","CHAIN","IN_NETWORK","TS#","TS_NAME","TS_CITY","TS_STATE","PPG","NET_PPG","FUEL_GALS","FUEL_AMT","RFR_GALS","RFR_AMT","CASH","MISC","INV_TOTAL","FEE","DISC","INV_BALANCE",
then save the .CSV file and reopen it.
But I think/hope there is another way?
Please help...
Thank you.
Sure: there are a few ways you can work around this problem. The right solution for you will depend on the time and energy you have to dedicate to it, as well as whether this is a one-time import or a process you want to streamline.
A few solutions:
1. Change the formatting of your CSV file to resemble the old version.
This can be done relatively easily:
Download a text editor like Notepad++
Open your CSV file in that editor
Do a Find/Replace operation, searching for:
"BRANCH","CARD","BILL_TYPE","AUTH_CODE","INVOICE","UNIT","EMP_NUM","TRIP","TRAILER","HUB/SPEED","VEH_LICENSE","DRIVER","DATE","TIME","CHAIN","IN_NETWORK","TS#","TS_NAME","TS_CITY","TS_STATE","PPG","NET_PPG","FUEL_GALS","FUEL_AMT","RFR_GALS","RFR_AMT","CASH","MISC","INV_TOTAL","FEE","DISC","INV_BALANCE"
replace with: ""
Finally, add the line above once as your header, quickly reformatting your new file into the same layout as your old file.
Note: this may be the best option if you have a one-time import.
2. Make the above changes in your code programmatically
Since the beginning of each line contains the fields you wish to ignore, you can easily truncate each line based on the number of characters. In classic ASP/VBScript, the Replace function can be used to strip the initial (ignorable) part of the line, replacing it with an empty string before the data is inserted into the DB.
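For example, here is a rough classic ASP/VBScript sketch of that idea. It is only a sketch: the input and output file paths are assumptions, and it locates the end of the repeated prefix by searching for its last column name ("INV_BALANCE",) rather than counting characters. It strips that prefix and the remaining quotes from each line, then writes a cleaned file that the existing BULK INSERT can load.

' Hypothetical pre-processing step before the existing BULK INSERT (file paths are assumptions)
Dim objFSO, objIn, objOut, strLine, strMarker, intCut
strMarker = """INV_BALANCE"","                                  ' last column name of the repeated prefix

Set objFSO = Server.CreateObject("Scripting.FileSystemObject")
Set objIn = objFSO.OpenTextFile("C:\DEMO001New.CSV", 1)         ' 1 = ForReading (assumed input path)
Set objOut = objFSO.CreateTextFile("C:\DEMO001Clean.CSV", True) ' overwrite if it exists (assumed output path)

Do Until objIn.AtEndOfStream
    strLine = objIn.ReadLine
    intCut = InStr(strLine, strMarker)
    If intCut > 0 Then
        strLine = Mid(strLine, intCut + Len(strMarker))         ' keep only the data after the prefix
    End If
    strLine = Replace(strLine, """", "")                        ' drop the remaining quote characters
    objOut.WriteLine strLine
Loop

objIn.Close
objOut.Close

You can then point strAPPPATH in the existing script at the cleaned file; as long as the table's columns match the cleaned column order, the BULK INSERT options themselves (FIELDTERMINATOR=',', ROWTERMINATOR='\n') should not need to change.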
Related
[VB.NET][ACCESS] How do I check if a database exists?
I can't find anything online to help. I want to create a table if it doesn't already exist, or populate a listbox with what is stored in said table if it DOES exist. All I have so far are the populate and create-table subroutines, but I have no idea how to check for the database. Thank you
Checking if an MS Access database exists is pretty simple because it is just a single file, so using File.Exists is enough. Suppose that your MDB file is:

Dim accessFilePath = "D:\temp\myDatabase.mdb"
If File.Exists(accessFilePath) Then
    ' ... file exists
End If

Of course, getting the content of the file (in terms of tables and queries) is a different thing and requires opening the connection and getting the schema information:

Dim cnnString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & accessFilePath
Using con = New OleDbConnection(cnnString)
    con.Open()
    Dim schema = con.GetSchema("Tables")
    For Each row As DataRow In schema.Rows
        Console.WriteLine(row.Field(Of String)("TABLE_NAME"))
    Next
End Using

See how GetSchema works and what its possible parameters and results are.
Automated export of a table/query from SQL SVR 2012 into Excel 2007 and subsequent VBA formatting macro?
I've been given the task of updating some code which isn't working after our SQL SVR 2000 to SQL SVR 2012 (don't laugh) conversion. We have an automated task that creates 500-ish xls files in a line-by-line manner by using the old sp_OACreate command. In 2000, this job always takes hours and is rickety at best. Needless to say, it doesn't work at all in 2012. That's what you get for never upgrading?
I rewrote the job by constructing a table, adding info into the table, and doing a bulk export into xls (using OpenRowSet). The new task ran in less than 15 minutes and I was ecstatic. Then Ops complained that the new files weren't formatted... I looped back and tried to get the formatting into the automation. That's where everything fell apart:
Use OpenRowSet and export into an "xls template file" that has autorun VBA code to do the formatting when the file is opened for the first time. Didn't work.
Use OpenRowSet and export into an "xlsm template file" that has autorun VBA code to do the formatting when the file is opened for the first time. Didn't work.
Use BCP and export into an "xls template file" that has autorun VBA code to do the formatting when the file is opened for the first time. Didn't work.
Use BCP and export into an "xlsm template file" that has autorun VBA code to do the formatting when the file is opened for the first time. Didn't work.
Use BCP/OpenRowSet, export into xls/xlsx, and try to execute VBA code from another workbook... Didn't work.
Use BCP/OpenRowSet, export into a "dummy" xls/xlsx, use a bat file to recopy the file, resave, etc... Didn't work.
Use BCP/OpenRowSet, export into a "dummy" txt/csv, use a bat file to convert the file into Excel, resave, etc... Didn't work.
Etc. Didn't work.
I tried every combination, every export method, every file type, etc. Certain methods allowed me to export but no formatting. Other methods wouldn't even export anything. No method permitted the export/format, though.
Then I discovered the "REAL" problem: when I export using BCP/OpenRowSet and then try to open the file, I always get the "file you are trying to open is in a different format than specified by the file extension" error (FYI: using Excel 2007). I had sort of ignored this error, but after days of banging my head against the wall I can now see that this error was leading me to the real problem all along. The export file (regardless of xls/xlsm/xlsx) is not an actual Excel file; it's really just a bunch of HTML tags. This is why the formatting VBA code won't work; it's not an actual Excel file! No matter what I do, it's still not a real Excel file. Formatting will never work because it's simply not a real Excel file.
So, I need an automated method to export a table to some sort of Excel format (xls/xlsm/xlsx) and then execute VBA code to format (bold, column width, number/date formatting, etc.) the newly exported file. This seems like such a routine task... but I see now that routine <> easy. I've seen references to NPOI and ClosedXML in forums, but I simply can't believe that I need additional 3rd-party software to accomplish this task.
You could write some VBA code in an Excel spreadsheet to go get the data from your database and then run the formatting VBA afterwards. The code below should be a good starting point. Please note that you will have to add a reference to the Microsoft ActiveX Data Objects Library for this code to function. Also, you may need to change the Provider in your connection string depending on your system's configuration. The code below uses SQL Server Native Client 11.

Dim cn As New ADODB.Connection
Dim SQLrs As New ADODB.Recordset
Dim SQLCommand As String

SQLCommand = "SELECT...FROM..."
cn.ConnectionString = "Provider=SQLNCLI11;Server=xxxxxxx;DataBase=xxxxxx;Trusted_Connection=yes;"
cn.Open
cn.Execute "SET NOCOUNT ON" ' required for complex T-SQL queries

SQLrs.CursorLocation = adUseClient
Call SQLrs.Open(SQLCommand, cn, adOpenStatic, adLockBatchOptimistic)
ActiveWorkbook.Sheets("sheet1").Range("a1").CopyFromRecordset SQLrs

SQLrs.Close
Set SQLrs = Nothing
cn.Close
Set cn = Nothing
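If you also want a starting point for the formatting step mentioned above, something along these lines could run right after CopyFromRecordset. The sheet name, ranges, and number formats below are assumptions, not values from your system; adjust them to your actual export layout.

' Assumed formatting pass; the ranges and formats are placeholders for your real layout
With ActiveWorkbook.Sheets("sheet1")
    .Rows(1).Font.Bold = True                    ' assumes row 1 holds (or will hold) headers
    .Columns("A:AF").AutoFit                     ' assumed width of the exported data
    .Columns("C").NumberFormat = "mm/dd/yyyy"    ' example date column (assumption)
    .Columns("D:F").NumberFormat = "#,##0.00"    ' example numeric columns (assumption)
End With
ActiveWorkbook.Save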
Text File to Array in Access VBA
I am looking to load a .txt file into a VBA array (in Access VBA), manipulate it there, and then paste the manipulated array into an Access table. I will then loop this macro through roughly 500 .txt files and populate the database with years of data.
I have done it before using Excel in this way: populating the sheet for one .txt file, manipulating the data, loading it into the database, clearing the sheet, loading the next .txt file, and looping until all files have been processed. However, this takes ages, and it becomes obvious that Excel is not really needed, since everything is stored in Access anyway.
Since then, I have tried to figure out how to do it in Access straight away, without using Excel, but I have been struggling. I found some nice code on this board here: Load csv file into a VBA array rather than Excel Sheet, and tried to amend it so it would work for me. Especially The_Barman's answer at the bottom seems simple and very interesting. I am new to arrays, VBA, SQL, basically everything, so maybe there is some big mistake I am making here that is easy to resolve. See my code below:

Sub LoadCSVtoArray()
    Dim strfile As String
    Dim strcon As String
    Dim cn As ADODB.Connection
    Dim rs As Recordset
    Dim rsARR() As Variant

    Set cn = New ADODB.Connection
    strcon = "Provider=Microsoft.JET.OLEDB.4.0;Data Source=" & "\\filename.txt" & ";Extended Properties=""text;HDR=Yes;FMT=Delimited"";"
    cn.Open strcon

    strSQL = "SELECT * filename.txt;"
    Set rs = cn.Execute(strSQL)
    rsARR = WorksheetFunction.Transpose(rs.GetRows)
    rs.Close
    Set cn = Nothing

    [a1].Resize(UBound(rsARR), UBound(Application.Transpose(rsARR))) = rsARR
End Sub

I don't even know if the bottom part of the code works, because an error message pops up that the file is not found. The interesting thing is, if I debug and copy the value of strcon into Windows "Run", it opens the correct file. So I guess the path is correct? Can I even open a .txt file through an ADODB connection? Right now I am a bit confused about whether this is possible and whether it is the best solution.
Some more background regarding the text files I am trying to save into the array: they are output from another program, so it is always the same structure and very organized. It comes in this format:

Column a   Column b   ....
data 1     data 1
data 2     Data 2

... and so on. If possible, I would like to retain this structure, and even save it as a table with the first row as column headers.
The Data Source is the folder path containing the file, and the file itself is the table (SELECT * FROM ..). Replace "\\filename.txt" in strcon with the folder path. See http://www.connectionstrings.com/textfile/
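A minimal sketch of what that change might look like inside the Sub above. The folder path and file name here are placeholders, not values from the question; note also that the query needs a FROM clause, with the file name used as the table name.

' Adjusted fragment of LoadCSVtoArray: Data Source points at the FOLDER (placeholder path)
strcon = "Provider=Microsoft.JET.OLEDB.4.0;" & _
         "Data Source=C:\MyTextFiles\;" & _
         "Extended Properties=""text;HDR=Yes;FMT=Delimited"";"
cn.Open strcon

strSQL = "SELECT * FROM [filename.txt]"          ' the file itself is the table (placeholder name)
Set rs = cn.Execute(strSQL)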
How to convert data in Excel spreadsheet forms into a format for upload into a database
I need a good way to convert input data from 900 spreadsheets into a format suitable for upload to a relational database (XML or flat file(s)). The spreadsheets are multi-sheet, multi-line Excel 2007, each one consisting of 7 forms (so it's definitely not a simple grid). There will be no formula data to get, just text, dates, and integer data. The 900 spreadsheets are all in the same format. I will need some kind of scripted solution. I'm expecting I should be able to do this with Excel macros (and I expect a fancy scriptable editor could do it too), or possibly SSIS. Can someone tell me how you would approach this if it were yours to do? Can anyone give a link to some technical info on a good way to do this? I'm new to Excel macros but used to programming and scripting languages, SQL, and others.
Why? We're using spreadsheet forms as an interim solution, and I then need to get the data into the database.
You probably want to write the data out to a plain text file. Use the CreateTextFile method of FileSystemObject. Documentation here: http://msdn.microsoft.com/en-us/library/aa265018(v=vs.60).aspx
There are many examples on the web of how to iterate over worksheets, capture the data, and then use the WriteLine method.

Sub ExampleTextFile()
    Dim fso As Object
    Dim oFile As Object
    Dim fullExport As String

    'Script that will capture data from the worksheet belongs here;
    ' use the fullExport string variable to hold this data. For now we will
    ' just create a dummy string for illustration purposes.
    fullExport = "Example string contents that will be inserted in to my text file!"

    Set fso = CreateObject("Scripting.FileSystemObject")

    'In the next line, replace "C:\filename.txt" with the file you want to create
    Set oFile = fso.CreateTextFile("C:\filename.txt", Overwrite:=True, Unicode:=True)

    oFile.WriteLine (fullExport) '<-- inserts the captured string into your new TXT file
    oFile.Close

    Set fso = Nothing
    Set oFile = Nothing
End Sub

If you have character-encoding issues (I recently ran into a problem with UTF-16LE vs. UTF-8 encoding), you will need to use the ADODB.Stream object, but that will require a different method of writing the file.
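As a hedged sketch of that ADODB.Stream alternative (the file path and contents below are placeholders), writing the text as UTF-8 might look like this:

Sub ExampleStreamFile()
    Dim oStream

    Set oStream = CreateObject("ADODB.Stream")
    oStream.Type = 2                                  ' adTypeText
    oStream.Charset = "utf-8"
    oStream.Open
    oStream.WriteText "Example string contents", 1    ' 1 = adWriteLine (appends a line break)
    oStream.SaveToFile "C:\filename.txt", 2           ' 2 = adSaveCreateOverWrite
    oStream.Close

    Set oStream = Nothing
End Sub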
How to Export binary data in SqlServer to file using DTS
I have an image column in a SQL Server 2000 table that is used to store the binary of a PDF file. I need to export the contents of each row in the column to an actual physical file using SQL Server 2000 DTS. I found the following method for VB at http://www.freevbcode.com/ShowCode.asp?ID=1654&NoBox=True

Set rs = conn.execute("select BinaryData from dbo.theTable")
FileHandle = FreeFile
Open ("AFile") For Binary As #FileHandle
ByteLength = LenB(rs("BinaryData"))
ByteContent = rs("BinaryData").GetChunk(ByteLength)
Put #FileHandle, , ByteContent
Close #FileHandle

Unfortunately, the DTS script task is VBScript, not VB, and it throws up on the As keyword in the third line. Any other ideas?
Writing binary files is a notoriously difficult task in VBScript. The only direct file operations exposed by VBScript live in the FileSystemObject object, which only supports writing text files. The only viable option is to use ADO Stream objects, which is still cumbersome because VBScript does not support passing script-created Byte arrays to COM objects, which is required to be able to write arbitrary binary data.
Here's a technique using ADO streams that probably won't work but might put you on the track to the right solution:

adTypeBinary = 1

Set rs = CreateObject("ADODB.Recordset")
rs.Open "SELECT binary_data FROM dbo.data_table", "connection string..."

Set strm = CreateObject("ADODB.Stream")
strm.Type = adTypeBinary
strm.Open
strm.Write rs.Fields("binary_data").GetChunk(LenB(rs.Fields("binary_data").Value))
strm.SaveToFile "binarydata.bin"
strm.Close

rs.Close

I experimented with this code using Access, and unfortunately it didn't seem to work. I didn't get the same binary data that was in the table I created, but I haven't bothered going into a hex editor to see what was different.
If you get errors from an ADO operation, you can get the actual messages by adding an "On Error Resume Next" to the top of the script and using this code:

For Each err In rs.ActiveConnection.Errors
    WScript.Echo err.Number & ": " & err.Description
Next
I would go with SQL Server Integration Services (SSIS) instead of DTS, if at all possible, and use a Script Task which would allow you to use VB.NET. You can connect to your SQL Server 2000 data source and point the exported output to a file.