How to export binary data in SQL Server to a file using DTS

I have an image column in a SQL Server 2000 table that is used to store the binary contents of a PDF file.
I need to export the contents of each row in that column to an actual physical file using SQL Server 2000 DTS.
I found the following method for vb at http://www.freevbcode.com/ShowCode.asp?ID=1654&NoBox=True
Set rs = conn.execute("select BinaryData from dbo.theTable")
FileHandle = FreeFile
Open ("AFile") For Binary As #FileHandle
ByteLength = LenB(rs("BinaryData"))
ByteContent = rs("BinaryData").GetChunk(ByteLength)
Put #FileHandle, ,ByteContent
Close #FileHandle
Unfortunately, the DTS script task is VBScript, not VB, and it throws up on the As keyword in the third line.
Any other ideas?

Writing binary files is a notoriously difficult task in VBScript. The only direct file operations exposed by VBScript live in the FileSystemObject, which only supports writing text files. The only viable option is to use ADO Stream objects, which is still cumbersome because VBScript does not support passing script-created Byte arrays to COM objects, which is what you would need in order to write arbitrary binary data.
Here's a technique using ADO streams that probably won't work but might put you on the track to the right solution.
adTypeBinary = 1

Set rs = CreateObject("ADODB.Recordset")
rs.Open "SELECT binary_data FROM dbo.data_table", "connection string..."

Set strm = CreateObject("ADODB.Stream")
strm.Type = adTypeBinary
strm.Open

' GetChunk returns the column as a byte array created by ADO itself, which is the
' only kind of byte array VBScript can hand back to a COM object such as the stream.
strm.Write rs.Fields("binary_data").GetChunk( _
    LenB(rs.Fields("binary_data").Value))

strm.SaveToFile "binarydata.bin", 2  ' 2 = adSaveCreateOverWrite, so a re-run does not fail
strm.Close
rs.Close
I experimented with this code using Access, and unfortunately it didn't seem to work. I didn't get the same binary data that was in the table I created, but I haven't bothered going into a hex editor to see what was different.
If you get errors from an ADO operation, you can get the actual messages by adding an "On Error Resume Next" to the top of the script and using this code.
Dim adoErr
For Each adoErr In rs.ActiveConnection.Errors  ' "err" itself would collide with VBScript's built-in Err object
    WScript.Echo adoErr.Number & ": " & adoErr.Description
Next

I would go with SQL Server Integration Services (SSIS) instead of DTS, if at all possible, and use a Script Task which would allow you to use VB.NET.
You can connect to your SQL Server 2000 data source and point the exported output to a file.
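For what it's worth, here is a minimal sketch of what the data-access portion of such a Script Task might look like in VB.NET; the key column, table name, connection string and output folder are placeholders of mine, not anything from the question.
Imports System.Data.SqlClient
Imports System.IO

' Inside the Script Task's Main(): read each row and write its image column to disk.
Using cn As New SqlConnection("Data Source=MYSERVER;Initial Catalog=MyDb;Integrated Security=SSPI;")
    cn.Open()
    Dim cmd As New SqlCommand("SELECT Id, BinaryData FROM dbo.theTable", cn)
    Using rdr As SqlDataReader = cmd.ExecuteReader()
        While rdr.Read()
            ' An image column comes back as a Byte() that can be written straight to a file.
            Dim bytes As Byte() = CType(rdr("BinaryData"), Byte())
            File.WriteAllBytes("C:\export\" & rdr("Id").ToString() & ".pdf", bytes)
        End While
    End Using
End Using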

Related

Query data from SQL to MS Access: Local Tables vs Pass-Through Tables

I've created an application that uses the following logic to query data from SQL Server into my MS Access app:
Using an ODBC connection, I execute a stored procedure.
This is assigned as a pass-through query to pull the data locally.
It looks something like this:
strSQL = "EXEC StoredProcedure " & Variable
Call ChangeQueryDef("qryPassThrough", strSQL)
Call SQLPassThrough(strQDFName:="qryPassThrough", _
strSQL:=strSQL, strConnect:=gODBCConn)
Me.frmDataSheet.Form.RecordSource = "qryPassThrough"
But, recently we have upgraded our SQL Server to 2016 using a high availability failover system - hence our connection string has changed to connect to a listener like so:
gODBCConn = "ODBC;Driver= {SQL Server Native Client 11.0};Trusted_Connection=Yes;Regional=Yes;Database=" & varDB & ";MultiSubnetFailover=Yes;IntegratedSecurity=SSPI;Server=tcp:SERVER_LISTENER,1433;"
However, it looks like using SQL Server Native Client in the connection string is not the same as what we originally had, which was the driver simply named SQL Server. Certain data types have changed and do not work in Access.
Is there a better way for me to query data from SQL and persist/display this data in access using ADO or an alternative method?
EDIT Based on Comment:
The issue I'm having is that I have tables in SQL Server using the data type Decimal(12,2). With some testing and experimenting, this seems to fail when used through an ODBC pass-through query, but changing the data type to Float seems to work fine. There are other data types which seem to error too, which I've not managed to pin down yet. It just seems there are a few differences I'm not aware of, and I'm keen to find a better way to load data into my Access app.
EDIT 2
This is the error message I get relating to the data type issue.
Sounds like you're not really interested in making the underlying data structure compatible with Access, so:
How to load an ADODB recordset into a datasheet form
Create the form
First, create a datasheet form. For this example, we're going to name our form frmDynDS. Populate the form with 256 text boxes, named Text0 to Text255. To populate the form with the text boxes, you can use the following helper function while the form is in design view:
' Run this from a standard module while frmDynDS is open in Design view.
Public Sub DynDsPopulateControls()
    Dim i As Long
    Dim myCtl As Control
    For i = 0 To 255
        Set myCtl = Application.CreateControl("frmDynDS", acTextBox, acDetail)
        myCtl.Name = "Text" & i
    Next i
End Sub
VBA to bind a recordset to the form
First, we're going to allow the form to persist by allowing it to reference itself (all of the following goes in the code module for frmDynDS):
Public Myself As Object
Then, we're going to add VBA to make it load a recordset. I'm using Object instead of ADODB.Recordset to allow it to both take DAO and ADODB recordsets.
Public Sub LoadRS(myRS As Object)
    Dim i As Long
    Dim myTextbox As TextBox
    Dim fld As Object
    i = 0
    ' Bind one text box per field and show the field name as the column caption.
    For Each fld In myRS.Fields
        Set myTextbox = Me.Controls("Text" & i)
        myTextbox.Properties("DatasheetCaption").Value = fld.Name
        myTextbox.ControlSource = fld.Name
        myTextbox.ColumnHidden = False
        i = i + 1
    Next fld
    ' Hide the remaining text boxes that have no matching field.
    For i = i To 255
        Set myTextbox = Me.Controls("Text" & i)
        myTextbox.ColumnHidden = True
    Next i
    Set Me.Recordset = myRS
End Sub
Use the form
(all of the following goes in the module of the form that uses frmDynDS)
As an independent datasheet form
Dim frmDS As New Form_frmDynDS
frmDS.Caption = "My ADO Recordset"
frmDS.LoadRS MyAdoRS 'Where MyAdoRS is an open ADODB recordset
Set frmDS.Myself = frmDS
frmDS.Visible = True
frmDS.SetFocus
Note that you're allowed to have multiple instances of this form open, each bound to different recordsets.
As a subform (leave the subform control unbound)
Me.MySubformControl.SourceObject = "frmDynDS"
Me.MySubformControl.Form.LoadRS MyAdoRS 'Where MyAdoRS is an open ADODB recordset
Warning: Access uses the command text when sorting and filtering the datasheet form. If it contains a syntax error for Access (because it's T-SQL), you will get an error when trying to sort/filter. However, when the syntax is valid, but the SQL can't be executed (for example, because you're using parameters, which are no longer available), then Access will hard crash, losing any unsaved changes and possibly corrupting your database. Even if you disable sorting/filtering, you can still trigger the hard crash when attempting to sort. You can use comments in your SQL to invalidate the syntax, avoiding these crashes.
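For example, a hypothetical way to apply that last tip is to append a T-SQL line comment, which SQL Server ignores but Access's own SQL parser treats as invalid:
' The trailing T-SQL comment makes the command text invalid Access SQL, so a
' sort/filter attempt fails with an ordinary error instead of hard-crashing Access.
strSQL = "EXEC dbo.MyStoredProcedure 123 -- pass-through only"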
You previously used the pretty ancient, original ODBC driver for SQL Server, simply named SQL Server. You made the right decision to use a newer driver to support your failover cluster, but I would not recommend using SQL Server Native Client. Microsoft says it "is not recommended to use this driver for new development".
Instead I would use the Microsoft ODBC Driver 13.1 for SQL Server. This is the most recent and recommended (by Microsoft) ODBC Driver for SQL Server.
Your main issue seems to be a translation issue between Access and SQL Server via the ODBC layer, so using the more modern driver might very well make this problem go away. I do not know if it solves your problem, but it is the very first thing I would try.
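As a sketch, the same connection string with the newer driver might look like the following, keeping the listener name and database variable from the question:
gODBCConn = "ODBC;Driver={ODBC Driver 13 for SQL Server};" & _
            "Server=tcp:SERVER_LISTENER,1433;Database=" & varDB & ";" & _
            "Trusted_Connection=Yes;MultiSubnetFailover=Yes;"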

Excel data connection to SQL DB error

I crafted a macro in an Excel workbook to extract a subset of data from a SQL database based on user input.
The macro prompts the user for a parameter and inserts that parameter into a ready-made stored procedure configured in an Excel data connection - see my VBA below:
Sub RefreshDBQuery()
    Dim Val As Integer
    Application.ScreenUpdating = False
    Worksheets("Adjustable CF").Select
    Val = InputBox("Enter valid 4 digit number", , 1907)
    Sheets("TestData").Visible = True
    Worksheets("TestData").Select
    Worksheets("TestData").Range("A1").Select
    ActiveCell.Value = Val
    With ActiveWorkbook.Connections("MacroExtraction 2Server").OLEDBConnection
        .CommandText = "EXEC dbo.prV_FlowExtract '" & Range("A1").Value & "'"
    End With
    ActiveWorkbook.Connections("MacroExtraction 2Server").Refresh
    Sheets("TestData").Visible = False
End Sub
When I run it, it works fine. Additionally, since it's modifying an existing data connection (the one I previously configured), I notice an .odc file in a folder called "My Data Sources" under My Documents:
However, when I send this workbook over to a colleague to run the macro and extract data, the macro runs up to a point and then she receives an error:
I ask her to open up her "My Data Sources" folder, and there is no .odc file there:
My question is: what am I missing? Or rather, what is my colleague missing in order to get the macro to work on her local machine?
I checked with the DB administrator, who said that she has the permissions necessary to access the server, which is why I suspect the missing .odc file is the cause. Should I copy my .odc file and send it to her to put in her "My Data Sources" folder? Should I rewrite the macro and re-set up the data connection on her local machine? Anyone with experience to comment would be much appreciated! Thanks!
The macro alone does not contain all the necessary info (the server name, for example). Try running "NewSQLServerConnection.odc" in your colleague's "My Data Sources" folder, complete the necessary details, make sure the connection name is the same as in your macro, and then the macro should work.
Hope this helps!
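Alternatively - and this is only a sketch of an assumption on my part, not part of the original answer - you could embed the full connection string in the macro itself, so the workbook no longer depends on an .odc file in the local "My Data Sources" folder. The provider, server and database names below are placeholders:
With ActiveWorkbook.Connections("MacroExtraction 2Server").OLEDBConnection
    ' Setting .Connection here means the colleague's machine never needs the local .odc file.
    .Connection = "OLEDB;Provider=SQLOLEDB;Data Source=MYSERVER;" & _
                  "Initial Catalog=MyDatabase;Integrated Security=SSPI;"
    .CommandText = "EXEC dbo.prV_FlowExtract '" & Range("A1").Value & "'"
End With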

Automated export of a table/query from SQL SVR 2012 into Excel 2007 and subsequent VBA formatting macro?

I've been given the task of updating some code which isn't working after our SQL Server 2000 to SQL Server 2012 (don't laugh) conversion. We have an automated task that creates 500ish xls files in a line-by-line manner by using the old sp_OACreate command. In 2000, this job always took hours and was rickety at best. Needless to say, it doesn't work at all in 2012. That's what you get for never upgrading?
I rewrote the job by constructing a table, adding info into the table, & doing a bulk export into xls (using OpenRowSet). The new task ran in less than 15 minutes and I was ecstatic. Then Ops complained that the new files weren’t formatted…
I looped back and tried to get the formatting into the automation. That’s where everything fell apart.
Use OpenRowSet and export into an “xls template file” that has an autorun vba code to do the formatting when the file is opened for the first time. Didn’t work.
Use OpenRowSet and export into an “xlsm template file” that has an autorun vba code to do the formatting when the file is opened for the first time. Didn’t work.
Use BCP and export into an “xls template file” that has an autorun vba code to do the formatting when the file is opened for the first time. Didn’t work.
Use BCP and export into an “xlsm template file” that has an autorun vba code to do the formatting when the file is opened for the first time. Didn’t work.
Use BCP/OpenRowSet and export into xls/xlsx and try to execute a vba code from another workbook…Didn’t work.
Use BCP/OpenRowSet, export into a “dummy” xls/xlsx, use a bat file to recopy the file, resave, etc…Didn’t work.
Use BCP/OpenRowSet, export into a “dummy” txt/csv, use a bat file to convert the file into excel, resave, etc…Didn’t work.
Etc. Didn't work.
I tried every combination, every export method, every filetype, etc. Certain methods allowed me to export but no formatting. Other methods wouldn’t even export anything. No method permitted the export/format, though.
Then I discovered the “REAL” problem ==> When I export using BCP/OpenRowSet and then try to open the file, I always get the “file you are trying to open is in a different format than specified by the file extension” error (FYI Using Excel 2007). I had sort of ignored this error, but after days of banging my head against the wall I can now see that this error was leading me to the real problem all along. The export file (regardless of xls/xlsm/xlsx) is not an actual excel file; it’s really just a bunch of html tags. This is why the formatting vba code won’t work; it’s not an actual excel file! No matter what I do, it’s still not a real excel file. Formatting will never work because it's simply not a real excel file.
So, I need an automated method to export a table to some sort of excel format (xls/xlsm/xlsx) and then execute a vba code to format (bold, column width, number/date formatting, etc.) the newly-exported file. This seems like such a routine task…but I see now that routine <> easy. I’ve seen references to NPOI and ClosedXML in forums, but I simply can’t believe that I need additional 3rd party software to accomplish this task.
You could write some VBA code in an Excel spreadsheet to go get the data from your database and then run your formatting VBA afterwards. The code below should be a good starting point.
Please note that you will have to add a reference to the Microsoft ActiveX Data Objects Library for this code to function. Also, you may need to change the Provider in your connection string depending on your system's configuration. The code below uses SQL Server Native Client 11.
Dim cn As New ADODB.Connection
Dim SQLrs As New ADODB.Recordset
Dim SQLCommand As String

SQLCommand = "SELECT...FROM..."
cn.ConnectionString = "Provider=SQLNCLI11;Server=xxxxxxx;DataBase=xxxxxx;Trusted_Connection=yes;"
cn.Open
cn.Execute "SET NOCOUNT ON" ' required for complex T-SQL queries

SQLrs.CursorLocation = adUseClient
Call SQLrs.Open(SQLCommand, cn, adOpenStatic, adLockBatchOptimistic)

' Dump the result set starting at A1, then clean up.
ActiveWorkbook.Sheets("sheet1").Range("a1").CopyFromRecordset SQLrs
SQLrs.Close
Set SQLrs = Nothing
cn.Close
Set cn = Nothing
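Once CopyFromRecordset has finished, the formatting can be applied in the same routine. Here is a rough sketch; the column ranges and number formats are just examples, not anything from the question:
With ActiveWorkbook.Sheets("sheet1")
    .Rows(1).Font.Bold = True                   ' bold the header row
    .Columns("A:Z").AutoFit                     ' size columns to fit their contents
    .Columns("C:C").NumberFormat = "#,##0.00"   ' example numeric format
    .Columns("D:D").NumberFormat = "mm/dd/yyyy" ' example date format
End With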

Bulk Insert from CSV file to MS SQL Database

I have this working script that I use to bulk insert a CSV file.
The code is:
' OPEN DATABASE
Dim objConn, strQuery, objBULK, strConnection
Set objConn = Server.CreateObject("ADODB.Connection")
strConnection = "Driver={SQL Server Native Client 11.0};Server=DemoSrvCld;Trusted_Connection=no;UID=dcdcdcdcd;PWD=blabla;database=demotestTST;"
objConn.Open strConnection
Set objBULK = Server.CreateObject("ADODB.Recordset")
Set objBULK.ActiveConnection = objConn
Dim strAPPPATH
strAPPPATH = "C:\DEMO001Test.CSV"
strQuery = "BULK INSERT EFS_OlderStyle FROM '" & strAPPPATH & "' WITH (FIRSTROW=1, FIELDTERMINATOR=',', ROWTERMINATOR='\n')"
Set objBULK = objConn.Execute(strQuery)
objConn.Close
Here is an example of the .csv file:
Date,Time,Card Number,Driver Id,Driver Name,Unit No,Sub-Fleet,Hub Miles,Odo Miles,Trip No,Invoice,T/S Code,In Dir,T/S Name,T/S City,ST,Total Inv,Fee,PPU,Fuel_UOM,Fuel_CUR,RFuel_UOM,RFuel_CUR,Oil_CUR,Add_CUR,Cash Adv,Tax,Amt Billed,Svc Bill,Chain,Ambest,MPU
10/08/13,03:20,70113531460800693,,,2100,,,,,0454591156,546200,Y,PILOT QUARTZSITE 328,QUARTZSITE,AZ,742.30,1.00,3.749,149.000,558.60,49.00,183.70,0.00,0.00,0.00,0.00,743.30,S, ,N,0.0
10/08/13,07:03,70110535170800735,,,6210,,,,,343723,512227,Y,PETRO WHEELER RIDGE,LEBEC,CA,678.78,1.00,4.169,139.140,580.08,23.68,98.70,0.00,0.00,0.00,0.00,679.78,S, ,N,0.0
But the .csv file I have now is different from the one above.
Here is an example of the current .csv file:
"BRANCH","CARD","BILL_TYPE","AUTH_CODE","INVOICE","UNIT","EMP_NUM","TRIP","TRAILER","HUB/SPEED","VEH_LICENSE","DRIVER","DATE","TIME","CHAIN","IN_NETWORK","TS#","TS_NAME","TS_CITY","TS_STATE","PPG","NET_PPG","FUEL_GALS","FUEL_AMT","RFR_GALS","RFR_AMT","CASH","MISC","INV_TOTAL","FEE","DISC","INV_BALANCE",1.00,1.00,"E","004ACS","02812","365","-","-","0",0.00,"-","JOHN S ",11/4/2013,"16:18:49E","IC","N",3257.00,"IRVING HOULTON","HOULTON","ME",3.95,3.95,121.57,480.08,0.00,0.00,0.00,0.00,480.08,1.50,0.00,481.58
"BRANCH","CARD","BILL_TYPE","AUTH_CODE","INVOICE","UNIT","EMP_NUM","TRIP","TRAILER","HUB/SPEED","VEH_LICENSE","DRIVER","DATE","TIME","CHAIN","IN_NETWORK","TS#","TS_NAME","TS_CITY","TS_STATE","PPG","NET_PPG","FUEL_GALS","FUEL_AMT","RFR_GALS","RFR_AMT","CASH","MISC","INV_TOTAL","FEE","DISC","INV_BALANCE",1.00,2.00,"E","014ACI","976234","430","-","-","0",0.00,"-","STACY ",11/4/2013,"00:21:16E","F","Y",8796.00,"PILOT 405","TIFTON","GA",3.77,3.77,172.65,650.73,0.00,0.00,0.00,0.00,650.73,1.50,0.00,652.23
I have edited the MS SQL database fields to reflect the new .csv fields, but the old and new .csv files do not store the info in the same way. How do I fix this so that it works?
I was thinking to first remove all of the " characters, and then to remove all but one occurrence of "BRANCH","CARD","BILL_TYPE","AUTH_CODE","INVOICE","UNIT","EMP_NUM","TRIP","TRAILER","HUB/SPEED","VEH_LICENSE","DRIVER","DATE","TIME","CHAIN","IN_NETWORK","TS#","TS_NAME","TS_CITY","TS_STATE","PPG","NET_PPG","FUEL_GALS","FUEL_AMT","RFR_GALS","RFR_AMT","CASH","MISC","INV_TOTAL","FEE","DISC","INV_BALANCE",
then save the .csv file and reopen it.
But I think/hope there is another way?
Please help...
Thank you.
Sure: there are a few ways you can work around this problem. The right solution for you will depend on the time and energy you have to dedicate to this problem, as well as whether this is a one time import or a process you want to streamline.
A few solutions:
1. Change the formatting of your CSV file to resemble the old version.
This can be done relatively easily:
Download a text editor like notepad++
Open your CSV file in this editor
Do a find/Replace operation for:
"BRANCH","CARD","BILL_TYPE","AUTH_CODE","INVOICE","UNIT","EMP_NUM","TRIP","TRAILER","HUB/SPEED","VEH_LICENSE","DRIVER","DATE","TIME","CHAIN","IN_NETWORK","TS#","TS_NAME","TS_CITY","TS_STATE","PPG","NET_PPG","FUEL_GALS","FUEL_AMT","RFR_GALS","RFR_AMT","CASH","MISC","INV_TOTAL","FEE","DISC","INV_BALANCE"
replace with: ""
Finally, add the line above as your header, quickly reformatting your new file into the same format as your old file.
Note: this may be the best option if you have a one-time import.
2. Make the above changes in your code programmatically
Since the beginning of each line contains the fields you wish to ignore, you can easily truncate each line based on the number of characters. A string Replace can be used to swap the initial (ignorable) part of each line for an empty string before it is inserted into the DB; see the sketch below.
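Here is a rough VBScript sketch of that approach, since the existing import script is classic ASP/VBScript. The file paths are placeholders, and strHeader must be assigned the full field-name prefix shown above:
Dim fso, fIn, fOut, strLine, strHeader
strHeader = ""  ' assign the full "BRANCH","CARD",... prefix shown above
Set fso = CreateObject("Scripting.FileSystemObject")
Set fIn = fso.OpenTextFile("C:\DEMO001New.CSV", 1)          ' 1 = ForReading
Set fOut = fso.CreateTextFile("C:\DEMO001Clean.CSV", True)  ' overwrite if it exists
Do Until fIn.AtEndOfStream
    strLine = fIn.ReadLine
    strLine = Replace(strLine, strHeader & ",", "")          ' strip the per-row field-name prefix
    strLine = Replace(strLine, """", "")                     ' strip the remaining quote characters
    fOut.WriteLine strLine
Loop
fIn.Close
fOut.Close
' Then run the existing BULK INSERT against C:\DEMO001Clean.CSV instead of the raw file.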

Bulk importing text files / VB2005 / SQL Server 2005

I've inherited a .NET app to support and enhance which reads in a couple of files with high hundreds of thousands of rows each, and one with millions of rows.
The original developer left me code like this:
For Each ModelListRow As String In ModelListDataArray
    If ModelListRow.Trim.Length = 0 Or ModelListRow.Contains(",") = False Then
        GoTo SKIP_ROW
    End If
    Dim ModelInfo = ModelListRow.Split(",")
    Dim ModelLocation As String = UCase(ModelInfo(0))
    Dim ModelCustomer As String = UCase(ModelInfo(1))
    Dim ModelNumber As String = UCase(ModelInfo(2))
    If ModelLocation = "LOCATION" Or ModelNumber = "MODEL" Then
        GoTo SKIP_ROW
    End If
    Dim MyDataRow As DataRow = dsModels.Tables(0).NewRow
    MyDataRow.Item("location") = ModelLocation.Replace(vbCr, "").Replace(vbLf, "").Replace(vbCrLf, "")
    MyDataRow.Item("model") = ModelNumber.Replace(vbCr, "").Replace(vbLf, "").Replace(vbCrLf, "")
    dsModels.Tables(0).Rows.Add(MyDataRow)
SKIP_ROW:
Next
and it takes an age (well, nearly half an hour) to import these files.
I suspect there's a MUCH better way to do it. I'm looking for suggestions.
Thanks in advance.
Take a look at BULK INSERT.
http://msdn.microsoft.com/en-us/library/ms188365(v=SQL.90).aspx
Basically you point SQL Server at a text file in CSV format and it does all the logic of pulling the data into a table. If you need to massage it more than that, you can pull the text file into a staging location in SQL Server, and then run a stored proc to massage it into the format you are looking for.
The main options (apart from writing your own code from scratch) are:
BULK INSERT or bcp.exe, which work well if your data is cleanly formatted
SSIS, if you need workflow, data type transformations, data cleansing etc.
.NET SqlBulkCopy API
jkohlhepp's suggestion about pulling data into a staging table then cleaning it is a good one and a very common pattern in ETL processes. But if your "massaging" isn't easy to do in TSQL then you will probably need some .NET code anyway, whether it's in SSIS or in a CLR procedure.
Personally I would use SSIS in your case, because it looks like the data is not cleanly formatted so you will probably need some custom code to clean/re-format the data on its way to the database. However it does depend on what you're most comfortable/productive with and what existing tools and standards you have in place.
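If the SqlBulkCopy route appeals, here is a rough VB.NET sketch along the lines of the existing loop; the destination table, column names and connection string are placeholders, and the header/cleanup handling from the original code is omitted for brevity:
Imports System.Data
Imports System.Data.SqlClient

' Build the rows in memory, then push them to SQL Server in one bulk operation
' instead of adding them to a DataSet row by row.
Dim table As New DataTable()
table.Columns.Add("location", GetType(String))
table.Columns.Add("model", GetType(String))

For Each csvLine As String In ModelListDataArray
    If csvLine.Trim().Length > 0 AndAlso csvLine.Contains(",") Then
        Dim parts As String() = csvLine.Split(","c)
        table.Rows.Add(UCase(parts(0)), UCase(parts(2)))
    End If
Next

Using bulk As New SqlBulkCopy("Data Source=MYSERVER;Initial Catalog=MyDb;Integrated Security=SSPI;")
    bulk.DestinationTableName = "dbo.Models"   ' placeholder table name
    bulk.ColumnMappings.Add("location", "location")
    bulk.ColumnMappings.Add("model", "model")
    bulk.WriteToServer(table)
End Using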
Dim ExcelConnection As New System.Data.OleDb.OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\MyExcelSpreadsheet.xlsx;Extended Properties=""Excel 12.0 Xml;HDR=Yes""")
ExcelConnection.Open()
