In SQL Server 2008, I have a strongly typed data set with a table:
TABLE
ID (Guid)
Value (varchar(50))
In this table, Value actually represents an encrypted value in the database, which is decrypted after being read from this table on my server.
In Visual Studio, I have a DataSet with my table, which looks like:
TABLE
ID (Guid)
Value (float)
I want to know if there is a way, in a DataSet, to call my decryption methods on Value when I am calling the Fill query on the TableAdapter for this table.
Is there any way to extend the DataSet XSD to support this sort of data massaging when reading data?
In addition, is there a way, when inserting/updating records in this table, to write strings as encrypted values?
NOTE:
All encryption/decryption code runs on the client, not on the database itself.
The Fill() method is going to execute whatever SQL is in the SelectCommand property of the DataAdapter. It's certainly possible to customize the SQL to "massage" data as it comes in.
Your issue is made more complex by the need to execute some .NET decryption. If you really want to do this and it is of high value to you, you could install a .NET assembly in the SQL Server database. Once that is done, you should be able to specify a custom SelectCommand that calls the code in your .NET assembly to decrypt the data at select time.
But that seems like an awful lot of work for very little reward. It's probably easier and more efficient to simply post-process the dataset and decrypt there. :)
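To make that post-processing concrete, here is a minimal VB.NET sketch. DecryptToSingle is a hypothetical stand-in for the asker's own client-side decryption routine, and dbo.MyTable is a placeholder table name; the Single column type just mirrors the DataSet's float column.
Imports System.Data
Imports System.Data.SqlClient

Module DecryptAfterFill
    ' Stub for the asker's own client-side decryption routine (assumed to exist).
    Function DecryptToSingle(cipherText As String) As Single
        Throw New NotImplementedException() ' real decryption goes here
    End Function

    ' Fill the raw (encrypted) rows, then decrypt in a second pass.
    Sub FillAndDecrypt(connStr As String, table As DataTable)
        Using conn As New SqlConnection(connStr)
            Dim adapter As New SqlDataAdapter("SELECT ID, Value FROM dbo.MyTable", conn)
            adapter.Fill(table) ' Value arrives as the encrypted varchar(50)
        End Using

        ' Surface the decrypted number in its own typed column.
        table.Columns.Add("DecryptedValue", GetType(Single))
        For Each row As DataRow In table.Rows
            row("DecryptedValue") = DecryptToSingle(CStr(row("Value")))
        Next
    End Sub
End Module
On the write side the same idea runs in reverse: encrypt the value before assigning it to the InsertCommand/UpdateCommand parameters.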
Related
I have looked around and found some instances where something similar is being done for websites, etc.
I have a SQL table that I am accessing in FileMaker Pro (through ESS) via an ODBC connection to the SQL database, and I have everything I need except for one field (LNL_BLOB) in one table (duo.MMOBJS), which is an image ("image, null") that cannot be accessed via the ODBC connection.
What I am hoping to accomplish is to find a way so that when an image is placed in that field, it is ALSO converted to Base64 in another field in the same table. Also, the database creator has a "View" (a foreign concept to us FileMaker developers) with this same data, called "dbo.VW_BLOB_IMAGES", if that is helpful.
If there is a field with Base64 text, within FileMaker I can decode it to get the image.
What thoughts do you all have? Is there an even better way?
NOTE: I am using many tables and lots of the data in the app that I have made, this image is not the only reason I have created the ODBC connection.
Well, one way to get Base64 out of SQL is to trick the XML engine in SQL Server into converting your column to Base64, then strip out the XML:
SELECT SUBSTRING(Q.Base64Data, 7, LEN(Q.Base64Data) - 9)
FROM (
    SELECT (
        SELECT LNL_BLOB AS B
        FROM duo.MMOBJS
        FOR XML RAW('r'), BINARY BASE64
    ) AS [Base64Data]
) AS [Q]
You'd probably want to add that to your SELECT statement or a view rather than add it to the table, but you could also write a trigger that maintains the extra field using that definition.
I am new to SSIS and am on my third package. We are taking data from Oracle into SQL Server. In my Oracle table, the unique key is called recnum and is numeric(12,0). In this particular package, I am trying to take each record from Oracle, look it up in a SQL Server table to see if that unique key is found, and if not, add the record to the SQL Server table. My issue was that it wouldn't find a match. After much testing, I came up with the following method that works, but I don't understand why I had to do this.
How I currently have it working:
I get the data from Oracle. In the next step, I added a derived column that uses the Oracle column (the expression is just that field, no other formatting). Then, in the lookup, I use the derived column instead of the column from Oracle.
We had already done this on another table, where the unique key was numeric(8,0), and it worked fine without needing a derived column.
SSIS is very fussy about data types; lookups only work nicely if the data types match.
Double-click the data path lines between Data Flow objects to check the data types. I use Data Conversion tasks or CAST statements to force matching data types when I use lookups.
Hope this helps.
I am processing data from a database with millions of rows. I am pulling batches of 1000 items from the database and processing them without a problem. I am not loading the whole entity; I am just pulling down a few columns of data for the batch.
What I want to do is mark the 1000 rows as processed with a single SQL command.
Something like:
UPDATE dbo.Clients
SET HasProcessed = 1
WHERE ClientID IN (...)
The ... is just a list of integers.
Context:
Azure SQL Server 2012
Entity Framework
Database.ExecuteSqlCommand
Ideas:
I know I could build the command as a pure string, but this would mean not using any SqlParameters and not benefiting from query plan reuse.
Also, I found some information about table-valued parameters, but this requires creating a table type and some overhead, which I would like to avoid. This is just a list of integers, after all.
Question:
Is there an easy (performant) way to do this that I am overlooking, either with Entity Framework or ExecuteSqlCommand?
If not, and table-valued parameters are the best way, could you provide a complete example of how to convert an integer list into the simplest type and run it with the above query?
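For what it's worth, here is a minimal VB.NET sketch of the table-valued-parameter route. It assumes a user-defined table type, named dbo.IntList here (a hypothetical name), has already been created once on the database; dbo.Clients, ClientID, and HasProcessed come from the question.
' One-time setup on the database (assumed):
'   CREATE TYPE dbo.IntList AS TABLE (Id INT NOT NULL)

Imports System.Collections.Generic
Imports System.Data
Imports System.Data.SqlClient

Module MarkProcessedExample
    Sub MarkProcessed(connStr As String, clientIds As IEnumerable(Of Integer))
        ' Pack the integers into a DataTable shaped like dbo.IntList.
        Dim ids As New DataTable()
        ids.Columns.Add("Id", GetType(Integer))
        For Each id In clientIds
            ids.Rows.Add(id)
        Next

        Using conn As New SqlConnection(connStr)
            conn.Open()
            Using cmd As New SqlCommand(
                "UPDATE c SET c.HasProcessed = 1 " &
                "FROM dbo.Clients c JOIN @Ids i ON i.Id = c.ClientID", conn)
                Dim p = cmd.Parameters.AddWithValue("@Ids", ids)
                p.SqlDbType = SqlDbType.Structured
                p.TypeName = "dbo.IntList" ' must match the table type's name
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module
The same SqlParameter can be passed to Database.ExecuteSqlCommand, so this also works from Entity Framework.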
I've got an application which has been working fine for quite a while, but there is an annoying item that continues to get in the way on occasion.
Let's say that I use an object such as OracleDataReader or MySqlDataReader to pass the data to the SqlBulkCopy object for insert. Let's assume that all the columns map just fine and, for the most part, it all works well.
Granted, I don't have control over the source application or database (which is either MySQL or Oracle). So some goof goes into a different application and puts a date on the invoice table of 5/31/0210. He really meant to put in 5/31/2010, but the application he's using is not validating the data very tightly, and the Oracle database accepts it. For all intents and purposes, the date of 5/31/0210 is a valid date for the Oracle db. It might be stupid in terms of data entry, but it is what it is at this point.
Now our OracleDataReader comes along and transfers this invoice table over to SQL Server via SqlBulkCopy, passing the data to a perfectly matched table with the right column names and data types. You can see what is going to happen. This date of 05/31/0210 from Oracle is not accepted by the SQL Server engine, as the DATETIME type only allows dates from 1/1/1753 to 12/31/9999.
When it encounters this record, it simply fails with an overflow error. It doesn't skip the record; it kills the feed. So if it happens a thousand records into a million-record table, you don't get the remaining 999,000 records.
Is there any way to get around this issue so that the feed will continue?
Ideally, I'd like to move the receiving SQL Server DB to 2008 and use DATETIME2, which would allow for these goofy dates, but unfortunately not all my clients are ready to move to this version yet, so I'm stuck with DATETIME in SQL 2000/2005/2008.
Any ideas on how to get around this without changing the SQL? Ideally, I wouldn't mind if it just skipped the record. I know I could do this in the SQL for the data reader, but that would be extremely complicated when you have twenty date fields in a single query. It would be a maintenance nightmare.
Any thoughts would be appreciated.
One option would be to change the datetime column type to varchar, then add a derived column that converts the string to datetime. The trick would be to use a function in the derived column to validate the date and substitute an arbitrary datetime if the conversion would fail. If you do heavy date comparisons, persist the computed column and/or index it.
I say all of this under the impression that SqlBulkCopy is not able to do transforms. Maybe it can. Hopefully, someone will chime in with a way to do it.
SSIS would be great in this situation, as you could do the transform and also get the performance benefits of the bulk update lock.
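If a purely client-side workaround is acceptable, another option (a different technique from the varchar/derived-column idea above, offered only as a sketch) is to stage each batch in a DataTable and scrub any date DATETIME cannot hold before handing it to SqlBulkCopy. Because it loops over every DateTime column, it covers the twenty-date-field case without per-column SQL; the destination table name is a placeholder.
Imports System.Data
Imports System.Data.SqlClient
Imports System.Data.SqlTypes

Module DateScrubExample
    ' Scrub DATETIME-incompatible dates from a staged batch, then bulk copy it.
    Sub CopyBatch(batch As DataTable, destConnStr As String)
        Dim minDate As DateTime = SqlDateTime.MinValue.Value ' 1753-01-01

        For Each row As DataRow In batch.Rows
            For Each col As DataColumn In batch.Columns
                ' Handle every DateTime column generically, however many there are.
                If col.DataType Is GetType(DateTime) AndAlso Not row.IsNull(col) Then
                    If CDate(row(col)) < minDate Then
                        row(col) = DBNull.Value ' or log and remove the row instead
                    End If
                End If
            Next
        Next

        Using bulk As New SqlBulkCopy(destConnStr)
            bulk.DestinationTableName = "dbo.Invoices" ' placeholder
            bulk.WriteToServer(batch)
        End Using
    End Sub
End Module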
I need to copy data from one database to another using a VB.NET program.
The target database is SQL Server; the source database is some proprietary ODBC-compliant database.
I need to loop through a list of tables to copy: read the data from the source database table for a given modified date, delete the corresponding records from the target database table, and insert the records from the source table. The databases have the same structure, i.e. table names and field names, but the data types may differ (although they are compatible, e.g. double in the source, float in the target). No primary keys exist.
Here's how I might do it:
First, execute a Delete command against the target.
I could then use a DataReader to obtain the data from the source, loop through the items, and create an Insert command for each row, adding parameters to the command with the appropriate values and executing it, with the whole thing wrapped in a transaction.
I was just wondering if I am missing a trick here. Any suggestions?
I think you should use the right tool for the job, and I'm guessing that that is SSIS in this case, but I could be wrong, and perhaps you have already explored that path.
In that case, yes, a DataReader would do, depending on how much data you have. A DataTable might be even easier and faster to program (no need to worry about data types, since the adapter should take care of that).
The trick would be to use set-based operations and not the "row at a time" approach which we programmers were first taught :)
Here's some pseudocode
INSERT INTO DestTable (column1, column2, ...)
SELECT column1, column2, ...
FROM SourceTable
WHERE ModifiedDate = @ModifiedDate
Perhaps your requirements are more complicated and may need the row-by-row approach, but this is normally not the case.
I'd opt to put this code in a job step and schedule it on SQL Server. It could also be a stored procedure run from .NET.
Also, using SSIS for a db to db transfer is most likely overkill unless you are going to be using some of the special transformations in there.
Take a look at the SqlBulkCopy class. If you can get the source into a DataTable or read it with an IDataReader, then it's eligible. It will also attempt to convert between compatible types. See Single Bulk Copy Operations for more details.
This would be more desirable than using INSERT statements for each row.
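As a rough illustration of that approach, here is a minimal VB.NET sketch that streams rows from the ODBC source straight into SQL Server; the connection strings and table name are placeholders.
Imports System.Data.Odbc
Imports System.Data.SqlClient

Module BulkCopyExample
    ' Stream one table from the ODBC source into the matching SQL Server table.
    Sub CopyTable(sourceConnStr As String, destConnStr As String, tableName As String)
        Using srcConn As New OdbcConnection(sourceConnStr)
            srcConn.Open()
            Using cmd As New OdbcCommand("SELECT * FROM " & tableName, srcConn)
                Using reader As OdbcDataReader = cmd.ExecuteReader()
                    Using bulk As New SqlBulkCopy(destConnStr)
                        bulk.DestinationTableName = tableName
                        bulk.BatchSize = 5000 ' commit in chunks
                        bulk.WriteToServer(reader) ' streams rows; compatible types are converted
                    End Using
                End Using
            End Using
        End Using
    End Sub
End Module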
Dim dataFolder As System.IO.DirectoryInfo
dataFolder = My.Computer.FileSystem.GetDirectoryInfo("c:\program Files\Microsoft SQL Server\MSSQL.1\mssql\data")

' Test the read-only bit on the data directory's attributes.
If (dataFolder.Attributes And System.IO.FileAttributes.ReadOnly) = System.IO.FileAttributes.ReadOnly Then
    MsgBox("Database folder is read-only!")
Else
    MsgBox("Database folder is not read-only protected")
End If
Check all the tables first
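Relatedly, the read-only attribute often sits on the individual data files rather than the folder, so a per-file variant of the same check may be more telling. This is just an illustrative extension of the snippet above, using the same placeholder path:
Imports System.IO

' Flag any .mdf data file in the SQL data directory that is marked read-only.
For Each dataFile As FileInfo In New DirectoryInfo(
        "c:\program Files\Microsoft SQL Server\MSSQL.1\mssql\data").GetFiles("*.mdf")
    If (dataFile.Attributes And FileAttributes.ReadOnly) = FileAttributes.ReadOnly Then
        MsgBox(dataFile.Name & " is read-only!")
    End If
Next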