I am migrating SQL DMO logic to SQL SMO and I am not sure how to achieve the same functionality for a few of the attributes.
This is the DMO:
oBCP = New SQLDMO.BulkCopy2
With oBCP
.ImportRowsPerBatch = ImportRowsPerBatch
.DataFilePath = Path.Combine(gtSysTempDir, "file.dat")
.LogFilePath = Path.Combine(gtSysTempDir, "file.log")
.ErrorFilePath = Path.Combine(gtSysTempDir, "file.err")
.MaximumErrorsBeforeAbort = 1
.DataFileType = SQLDMO.SQLDMO_DATAFILE_TYPE.SQLDMODataFile_TabDelimitedChar
End With
This is the SMO so far:
trans = New Transfer
With trans
.ImportRowsPerBatch = ImportRowsPerBatch
.TargetDatabaseFilePath = Path.Combine(gtSysTempDir, "file.dat")
.TargetLogFilePath = Path.Combine(gtSysTempDir, "file.log")
'.ErrorFilePath = Path.Combine(gtSysTempDir, "file.err")
.MaximumErrorsBeforeAbort = 1
.DataFileType = SQLDMO.SQLDMO_DATAFILE_TYPE.SQLDMODataFile_TabDelimitedChar
End With
What are the equivalents, if they exist, of ImportRowsPerBatch (BatchSize?), ErrorFilePath, MaximumErrorsBeforeAbort, and DataFileType?
Thanks.
Despite what the MSDN says, Transfer is not actually the equivalent of BulkCopy2; it's intended for scripting/copying entire databases whereas BulkCopy2 is only for bulk copying data. The settings don't correspond at all -- in particular, Transfer.TargetDatabaseFilePath is supposed to be the location of a database file, whereas BulkCopy2.DataFilePath is the location of the file where imported/exported data goes. Some bulk copy scenarios are handled by Transfer, but most aren't.
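For contrast, here is a minimal sketch of what Transfer is actually intended for: copying a whole database's objects and data. The server and database names are placeholders, error handling is omitted, and it assumes the destination database already exists:
// Minimal sketch of SMO's Transfer copying an entire database.
// "SourceServer", "SourceDb", "TargetServer", "TargetDb" are placeholders.
using Microsoft.SqlServer.Management.Smo;

Server server = new Server("SourceServer");
Database sourceDb = server.Databases["SourceDb"];

Transfer transfer = new Transfer(sourceDb);
transfer.CopyAllObjects = true;              // copy the schema objects...
transfer.CopyData = true;                    // ...and move the data as well
transfer.DestinationServer = "TargetServer";
transfer.DestinationDatabase = "TargetDb";
transfer.TransferData();                     // performs the copy
Nothing in this model maps onto BulkCopy2's per-file settings, which is the point: it solves a different problem.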
If you're not actually copying an entire database, you have several alternatives, none of which use SMO (which has no class for bulk copying as such):
Directly invoking a BULK INSERT statement, which is probably what BulkCopy2 does in the background (but I haven't verified this);
Using the SqlBulkCopy class, which will require some extra code (but which also offers far more flexibility as the data to be imported can come from any source);
Building a command line and invoking bcp.
Of these, BULK INSERT is the most straightforward.
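As a rough illustration of the BULK INSERT route, here is a hedged sketch that maps the old BulkCopy2 settings onto BULK INSERT options. dbo.TargetTable and the file paths are placeholders, and note that BULK INSERT resolves file paths on the server, not the client:
// Sketch: running BULK INSERT from .NET. Table name and file paths are
// placeholders; the server's file system must be able to see the files.
using System.Data.SqlClient;

using (var con = new SqlConnection("your_connection_string"))
using (var cmd = con.CreateCommand())
{
    cmd.CommandText =
        @"BULK INSERT dbo.TargetTable
          FROM 'C:\Temp\file.dat'
          WITH (FIELDTERMINATOR = '\t',  -- tab-delimited, like SQLDMODataFile_TabDelimitedChar
                ROWTERMINATOR   = '\n',
                BATCHSIZE       = 1000,  -- roughly ImportRowsPerBatch
                MAXERRORS       = 1,     -- roughly MaximumErrorsBeforeAbort
                ERRORFILE       = 'C:\Temp\file.err')"; // roughly ErrorFilePath
    con.Open();
    cmd.ExecuteNonQuery();
}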
Using Delphi 10.3.3, mainly under Windows 10, I have a number of [FireDAC] TFDTable components accessing an SQL Server DB.
Everything worked fine in Delphi 10.2.3, but in the Rio version I find that those TFDTables which have an IndexName set, and which access a DB whose name has a space in it, fail with the message "[FireDAC][DatS]-2 Object [PK_DEPT] is not found."
An example of the table definition in the dfm is:
object tbHODEPT: TFDTable
IndexName = 'PK_DEPT'
ConnectionName = 'HEADOFFICE'
FetchOptions.AssignedValues = [evMode]
FetchOptions.Mode = fmAll
UpdateOptions.UpdateTableName = 'DEPT'
TableName = 'DEPT'
The connection, "HEADOFFICE", is a TFDConnection descendant which just adds the credentials for the relevant DB at open-time. I have noticed that if I supply a fully qualified TableName property (e.g. '[Some DBname].dbo.[DEPT') then the problem does not occur. But setting the DBname at design time is undesirable as we allow the DB names to be configurable.
Barring a solution, my options seem to be
Renaming the problematic DB, and
Changing all of the affected TFDTables to use IndexFieldNames instead (this works OK).
Ideas welcome.
We have some large Excel files (often in excess of 100MB each) that need to be regularly brought in to a SQL Server database and I am looking for options to automate this process.
It sounds like Microsoft is no longer supporting its Business Intelligence Development Studio (BIDS) and has replaced it with something called SQL Server Data Tools - Business Intelligence (SSDT-BI).
Because of this, it looks like my options are SSDT-BI or an SSIS package, but I am not sure which one to use. Can someone give a little bit of information about these two solutions? Any advice for finding a solution to this problem would be appreciated.
I know this is an old post and it has nothing to do with SSIS or SSDT, but the OP was seemingly inquiring about other options that may be available, so I wanted to add this for anyone who is looking for an easy way to import extract files without having to use either of those tools. I have created literally hundreds of processes where I'm required to import data from an extract file (CSV, Access, Excel, FoxPro, etc.). The following is a PowerShell snippet that loads every sheet in an Excel spreadsheet and bulk copies each one into a SQL Server table of the same name; you should be able to easily adapt the logic to your own import. Constructive criticism is always welcome!
Clear-Host;
## You May Need to Download and Install the Microsoft Access Database Engine 2010 Redistributable: https://www.microsoft.com/en-us/download/details.aspx?id=13255
[String]$ExcelPath = "C:\Temp\TestSheet.xlsx";
[String]$TargetServer = "(local)";
[String]$TargetDatabase = "TestDB";
[String]$SourceConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source={0};Extended Properties='Excel 12.0 Xml;HDR=YES';" -f $ExcelPath;
[String]$TargetConnectionString = "Data Source={0};Initial Catalog={1};Trusted_Connection=True;" -f $TargetServer, $TargetDatabase;
$SourceFactory = [System.Data.Common.DbProviderFactories]::GetFactory("System.Data.OleDb");
$TargetFactory = [System.Data.Common.DbProviderFactories]::GetFactory("System.Data.SqlClient");
$SourceConnection = $SourceFactory.CreateConnection();
$SourceConnection.ConnectionString = $SourceConnectionString;
$SourceConnection.Open();
$SourceCommand = $SourceFactory.CreateCommand(); ## CreateCommand() already returns a command object; New-Object is not needed
$SourceCommand.Connection = $SourceConnection;
$SourceCommand.CommandTimeout = [Int32]::MaxValue;
$TargetConnection = $TargetFactory.CreateConnection();
$TargetConnection.ConnectionString = $TargetConnectionString;
$TargetConnection.Open();
$TargetCommand = $TargetFactory.CreateCommand();
$TargetCommand.Connection = $TargetConnection;
$TargetCommand.CommandTimeout = [Int32]::MaxValue;
foreach($table in $SourceConnection.GetSchema("Tables").Rows){
try{
## Source
[String]$TabName = $table["TABLE_NAME"];
[String]$sqlString = "SELECT * FROM [{0}];" -f $TabName;
$SourceCommand.CommandText = $sqlString;
[System.Data.Common.DbDataReader]$SourceDataReader = $SourceCommand.ExecuteReader();
$dtData = New-Object System.Data.DataTable;
$dtData.Load($SourceDataReader);
$SourceDataReader.Close(); ## Close the reader so the next ExecuteReader on this connection succeeds
## Target -- Bulk Insert data
if($dtData.Rows.Count -gt 0){
$TabName = "[{0}]" -f $TabName;
$sqlBulkCopy = New-Object System.Data.SqlClient.SqlBulkCopy($TargetConnection);
$sqlBulkCopy.DestinationTableName = $TabName;
foreach ($Column in $dtData.Columns){
[void]$sqlBulkCopy.ColumnMappings.Add($Column.ColumnName, $Column.ColumnName); ## map source to target by column name
};
$sqlBulkCopy.WriteToServer($dtData);
}
}catch{
$table["TABLE_NAME"];
$_.Exception.Message;
$_.Exception.ItemName;
};
};
#Housekeeping
if($sqlBulkCopy){ $sqlBulkCopy.Close(); };
$SourceDataReader.Dispose();
$SourceCommand.Dispose();
$SourceConnection.Close();
$SourceConnection.Dispose();
$TargetCommand.Dispose();
$TargetConnection.Close();
$TargetConnection.Dispose();
[System.GC]::Collect();
The difference between SSIS and SSDT is not quite that: SSIS is one of the components of SSDT, meaning that SSDT contains SSIS, SSRS, and SSAS.
I have the following SQL Server 2012 query:
var sql = #"Select question.QuestionUId
FROM Objective,
ObjectiveDetail,
ObjectiveTopic,
Problem,
Question
where objective.examId = 1
and objective.objectiveId = objectiveDetail.objectiveId
and objectiveDetail.ObjectiveDetailId = ObjectiveTopic.ObjectiveDetailId
and objectiveTopic.SubTopicId = Problem.SubTopicId
and problem.ProblemId = question.ProblemId";
var a = db.Database.SqlQuery<string>(sql).ToList();
Can someone help explain to me if it would be a good idea to put this into a
stored procedure and if so then how could I do that and then call it from my C# code. It was
suggested to me that if it is in a stored procedure then it would run more
efficiently as it would not be recompiled often. Is that the case?
Yes, there is. For starters, a stored procedure is precompiled and stored within your database. Being precompiled, the database engine can execute it more efficiently, since no on-the-fly compilation is necessary. Also, database optimizations can be added to support a precompiled procedure. A stored procedure also allows business logic to be encapsulated within the database.
If you decide to go the stored procedure route, then consider the following:
First of all, you will need to create a stored procedure that encapsulates your existing SQL query.
CREATE PROCEDURE ListQuestionIds
@ExamId int
AS
BEGIN
SELECT Question.QuestionUId
FROM Objective
INNER JOIN ObjectiveDetail
ON ( Objective.objectiveId = ObjectiveDetail.objectiveId )
INNER JOIN ObjectiveTopic
ON ( ObjectiveDetail.ObjectiveDetailId = ObjectiveTopic.ObjectiveDetailId )
INNER JOIN Problem
ON ( ObjectiveTopic.SubTopicId = Problem.SubTopicId )
INNER JOIN Question
ON ( Problem.ProblemId = Question.ProblemId )
WHERE Objective.examId = @ExamId;
END;
Please make sure that the tables called by your procedure (Objective, Problem, etc,) have all of the relevant primary keys and indexes in place to enhance the performance of your query.
Next, you will need to call that stored procedure from within your C# code. One way--but by no means the only way--is to create a connection to your database using the SqlConnection object and then executing your procedure via the SqlCommand object.
I would recommend that you take a look at How to execute a stored procedure within C# program for some on-topic examples. But a simple example of such might look like:
string connectionString = "your_connection_string";
using (var con = new SqlConnection(connectionString))
{
using (var cmd = new SqlCommand("ListQuestionIds", con)) {
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add(new SqlParameter("@ExamId", examId));
con.Open();
using (SqlDataReader rdr = cmd.ExecuteReader())
{
while (rdr.Read())
{
// Loop through the returned SqlDataReader object (aka. rdr) and
// then evaluate & process the returned question id value(s) here
}
}
}
}
Please note that this sample code does not (intentionally) include any error handling. I leave that up to you to integrate into your application.
Finally, just as an FYI... many of the more modern ORMs (e.g., Entity Framework, NHibernate, etc.) allow you to execute stored procedure-like queries from your C# code without requiring an explicit stored procedure. If you are already using an ORM in your application, then you may want to forgo the stored procedure altogether. Whatever you decide to do, a little research on your end will help you make an informed decision.
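For example, with Entity Framework (which the question's db.Database.SqlQuery call suggests is already in use), the procedure could be invoked directly. A sketch, assuming EF6 and the ListQuestionIds procedure above:
// Sketch: calling the stored procedure through Entity Framework instead of
// raw SqlConnection/SqlCommand code. Assumes EF6 and the procedure above.
using System.Data.SqlClient;

var questionIds = db.Database
    .SqlQuery<string>("EXEC ListQuestionIds @ExamId",
                      new SqlParameter("@ExamId", examId))
    .ToList();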
I hope this helps you get started. Good luck.
I have a console app in C# that extracts 20 fields from an Oracle DB with the code below, and I wanted an efficient way to insert them into SQL 2005.
I don't want to insert each one of the 20,000 rows within the while loop, obviously. I was thinking of changing the code to use a DataSet to cache all the records and then do a bulk insert...
Thoughts?
Pseudo code would be nice since I am new to Oracle.
This is my code where I was testing getting a connection to Oracle and seeing if I can view the data... now that I can view it, I want to get it out and into SQL 2005... what do I do from here?
static void getData()
{
string connectionString = GetConnectionString();
using (OracleConnection connection = new OracleConnection())
{
connection.ConnectionString = connectionString;
connection.Open();
OracleCommand command = connection.CreateCommand();
string sql = "SELECT * FROM BUG";
command.CommandText = sql;
OracleDataReader reader = command.ExecuteReader();
while (reader.Read())
{
//string myField = (string)reader["Project"];
string myField = reader[0].ToString();
Console.WriteLine(myField);
}
}
}
You can create a CSV file and then use BULK INSERT to insert the file into SQL Server. Have a look here for an example.
The "bulk" insert with the cached Dataset will work exactly like the while loop you are not wanting to write! The problem is that you'll lose control of the process if you try to use the "bulk" insert of the Dataset class. It is extraneous work in the end.
Maybe the best solution is to use a DataWriter so that you have complete control and no Dataset overhead.
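Another way to keep full control without caching a DataSet is SqlBulkCopy, which can stream the Oracle reader straight into SQL Server. A hedged sketch, assuming a compatible dbo.BUG table already exists on the target and reusing GetConnectionString() from the question (the SQL Server connection string is a placeholder):
// Sketch: stream rows from Oracle into SQL Server without caching them
// in a DataSet. Assumes dbo.BUG exists on the target with compatible
// columns; "your_sql_connection_string" is a placeholder.
static void CopyBugTable()
{
    using (OracleConnection source = new OracleConnection(GetConnectionString()))
    using (SqlConnection target = new SqlConnection("your_sql_connection_string"))
    {
        source.Open();
        target.Open();

        OracleCommand command = source.CreateCommand();
        command.CommandText = "SELECT * FROM BUG";

        using (OracleDataReader reader = command.ExecuteReader())
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(target))
        {
            bulkCopy.DestinationTableName = "dbo.BUG";
            bulkCopy.BatchSize = 1000;      // rows sent per batch
            bulkCopy.WriteToServer(reader); // streams the reader row by row
        }
    }
}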
You can actually do 100-1000 inserts per SQL batch. Just generate multiple INSERT statements, then submit them as one command. Pregenerate the next batch from the SELECT while the first executes.
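A hedged sketch of that batching idea, reusing the reader from the question; the column name, batch size, and the sqlCommand variable are examples, and real code should parameterize rather than splice strings:
// Sketch: accumulate several hundred INSERT statements and submit them as
// one command to cut round trips. "Project" and 500 are example choices;
// sqlCommand is assumed to be a SqlCommand on the SQL 2005 connection.
var batch = new System.Text.StringBuilder();
int rowsInBatch = 0;
while (reader.Read())
{
    batch.AppendFormat("INSERT INTO dbo.BUG (Project) VALUES ('{0}');",
                       reader[0].ToString().Replace("'", "''"));
    if (++rowsInBatch == 500)
    {
        sqlCommand.CommandText = batch.ToString();
        sqlCommand.ExecuteNonQuery(); // one round trip, 500 inserts
        batch.Length = 0;             // reset for the next batch
        rowsInBatch = 0;
    }
}
if (rowsInBatch > 0) // flush the remainder
{
    sqlCommand.CommandText = batch.ToString();
    sqlCommand.ExecuteNonQuery();
}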
I am new to PowerBuilder.
I want to retrieve the data from MSAccess tables and update it to corresponding SQL tables. I am not able to create a permanent DSN for MSAccess because I have to select different MSAccess files with same table information. I can create a permanent DSN for SQL server.
Please help me to create DSN dynamically when selecting the MSAccess file and push all the tables data to SQL using PowerBuilder.
Also, please give the full PowerBuilder code to solve the problem if it's possible.
In Access, we strongly suggest not using DSNs at all, as it is one less thing for someone to have to configure and one less thing for the users to screw up (see Using DSN-Less Connections). You should see if PowerBuilder has a similar option.
Create the DSN manually in the ODBC administrator
Locate the entry in the registry
Export the registry syntax into a .reg file
Read and edit the .reg file dynamically in PB
Write it back to the registry using PB's RegistrySet ( key, valuename, valuetype, value )
Once you've got your DSN set up, there are many options to push data from one database to the other.
You'll need two transaction objects in PB, each pointing to its own database. Then, you could use a Data Pipeline object to manage the actual data transfer.
You want to do the DSNLess connection referenced by Tony. I show an example of doing it at PBDJ and have a code sample over at Sybase's CodeXchange.
I am using this code, try it!
//// Profile access databases accdb format
SQLCA.DBMS = "OLE DB"
SQLCA.AutoCommit = False
SQLCA.DBParm = "PROVIDER='Microsoft.ACE.OLEDB.12.0',DATASOURCE='C:\databasename.accdb',DelimitIdentifier='No',CommitOnDisconnect='No'"
Connect using SQLCA;
If SQLCA.SQLCode = 0 Then
Open ( w_rsre_frame )
else
MessageBox ("Cannot Connect to Database", SQLCA.SQLErrText )
End If
or
//// Profile access databases mdb format
transaction aTrx
long resu
string database
database = "C:\databasename.mdb"
aTrx = create transaction
aTrx.DBMS = "OLE DB"
aTrx.AutoCommit = True
aTrx.DBParm = "PROVIDER='Microsoft.Jet.OLEDB.4.0',DATASOURCE='"+database+"',PBMaxBlobSize=100000,StaticBind='No',PBNoCatalog='YES'"
connect using aTrx ;
if aTrx.SQLDBCode = 0 then
MessageBox("", "Connection success to database")
else
MessageBox("Error code: " + string(aTrx.SQLCode), aTrx.SQLErrText + " DB Code Error: " + string(aTrx.SQLDBCode))
end if
// do stuff...
destroy aTrx