Can SqlBulkCopy be used with a SQL Server Compact database? - sql-server

I have a live database and a local database, and I want to copy a large amount of data from my live database (accessed with SqlClient) to my local database (SqlServerCe, a SQL Server Compact database). How can I do that?

Yes, I have a SqlCeBulkCopy library/NuGet package that you can use:
https://github.com/ErikEJ/sqlcebulkcopy
Sample code:
using ErikEJ.SqlCe;

private static void DoBulkCopy(bool keepNulls, IDataReader reader)
{
    SqlCeBulkCopyOptions options = new SqlCeBulkCopyOptions();
    if (keepNulls)
    {
        options |= SqlCeBulkCopyOptions.KeepNulls;
    }
    // connectionString is the connection string of the target SQL Server Compact database
    using (SqlCeBulkCopy bc = new SqlCeBulkCopy(connectionString, options))
    {
        bc.DestinationTableName = "tblDoctor";
        bc.WriteToServer(reader);
    }
}
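To copy the data from the live database, pass the method an IDataReader obtained from a SqlClient query against the source table. A minimal sketch, assuming the source table is tblDoctor and liveConnectionString points at the live SQL Server database:
using System.Data.SqlClient;

private static void CopyDoctorsToLocal(string liveConnectionString)
{
    // Stream rows from the live SQL Server table straight into the SQL Server Compact table
    using (var sourceConnection = new SqlConnection(liveConnectionString))
    using (var command = new SqlCommand("SELECT * FROM tblDoctor", sourceConnection))
    {
        sourceConnection.Open();
        using (var reader = command.ExecuteReader())
        {
            DoBulkCopy(true, reader); // keepNulls = true
        }
    }
}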

Related

EF4: First SaveChanges to remote server takes ~15 min

I have a small app that writes some basic data to SQL Server (2019) on a remote server.
When trying to save the first record, SaveChanges hangs for ~15 minutes, then it succeeds and continues writing the next record with no delay.
In this code I try to write a single value to a configuration table (columns: name, value):
internal static bool Insert(string Name, string value)
{
    using (myDBEntities db = new myDBEntities(ConnectionString))
    {
        dbConfTable currentObj = db.Configurations.Create();
        currentObj.Value = value;
        currentObj.Name = Name;
        db.Configurations.Add(currentObj);
        db.SaveChanges();
    }
    return true;
}
Things I tried:
Pre-generating the views - did not help.
Calling SaveChanges once with no changes - the empty save succeeded, EF still hangs on the real insert.
Writing directly to SQL using SqlCommand and then accessing the data through EF - the SqlCommand write succeeded, EF hangs (a sketch of that direct write is below).
Writing to a local SQL Server - works correctly.
Any idea?
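For reference, the direct SqlCommand write that succeeds might look roughly like this (a sketch only; the table name, columns, and connection string are assumptions based on the EF code above, not the exact code used):
internal static void InsertDirect(string name, string value)
{
    // Sketch: insert one name/value pair with plain ADO.NET instead of EF.
    // "Configurations" and "ConnectionString" mirror the assumptions in the EF code above.
    using (var connection = new SqlConnection(ConnectionString))
    using (var command = new SqlCommand(
        "INSERT INTO Configurations (Name, Value) VALUES (@name, @value)", connection))
    {
        command.Parameters.AddWithValue("@name", name);
        command.Parameters.AddWithValue("@value", value);
        connection.Open();
        command.ExecuteNonQuery(); // completes immediately, unlike the first EF SaveChanges
    }
}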

Getting an error while trying to connect to SQL Server Analysis Services through ADOMD

Error:
Either the user, 'myName\user', does not have access to the 'Sample' database, or the database does not exist.
I have the Sample database in SQL Server and also a sample cube in Analysis Services; however, I'm getting the error while trying to run the code below, which just checks the connection.
AdomdConnection conn = new AdomdConnection(@"Data Source=myName\MSSQLSERVER16;Catalog=Sample");
AdomdCommand cmd = new AdomdCommand("SELECT NON EMPTY { [Measures].[Sales Count] } ON COLUMNS FROM [Sample] CELL PROPERTIES VALUE", conn);
AdomdDataReader rdr;
int count = 0;
conn.Open();
rdr = cmd.ExecuteReader();
// Count the rows returned by the MDX query
while (rdr.Read())
{
    count++;
}
rdr.Close();
conn.Close();
Console.WriteLine("Count: " + count);
Is there anything wrong in my code, or is it a security/access issue? I have already added myName\user as a server administrator under Security in Microsoft Analysis Services. May I get some help please?
I've solved it: you need to add yourself/the user in Analysis Services through Properties > Security to get access to the database. (Also, another mistake was using the SQL Server database in the connection string instead of the Analysis Services database.) Thanks!
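Putting the two fixes together, the connection string has to point at the Analysis Services instance and its cube database rather than the relational SQL Server database. A brief sketch (the instance and catalog names are placeholders; use the actual SSAS database name):
// Sketch: connect to the Analysis Services database, not the relational one.
// "myName\MSSQLSERVER16" and "SampleSsasDb" are placeholder names.
AdomdConnection conn = new AdomdConnection(
    @"Data Source=myName\MSSQLSERVER16;Catalog=SampleSsasDb");
conn.Open();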

SQL Server Output multiple CSV files from one query

I am trying to get SQL Server to create multiple CSV files from one query. At this time we have 7 separate long-running (2+ hours each) queries that need to be output to separate CSV files. I can create one query that calls all of them, but that generates one giant CSV. Is there a way to tell SQL Server to create a separate CSV after each internal query has completed?
This becomes more important as our next production run will have 52 of those long-running queries, and my boss does not want to have to run each of them separately.
EXEC dbo.Get_Result_Set1;
EXEC dbo.Get_Result_Set2;
EXEC dbo.Get_Result_Set3;
EXEC dbo.Get_Result_Set4;
EXEC dbo.Get_Result_Set5;
EXEC dbo.Get_Result_Set6;
EXEC dbo.Get_Result_Set7;
You may want to create an SSIS package as the wrapper around executing these stored procedures, rather than your current query.
Each stored procedure can then be linked to a flat-file output.
This has the advantage that you can express any required dependencies between the different invocations and, where there are no dependencies between some or all of them, exploit some parallelism.
Could you create an Agent Job to do it? You could use a separate job step for each of the queries. Under the Advanced tab of the job step, there is an output file option.
Not the answer I was looking for, but I wound up having someone help me write a C# program in Visual Studio that calls each of my stored procedures and outputs each result to a CSV file. It works and I can reuse it in the future.
using System.Collections.Generic;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Linq;
using System.Text;

namespace StoredProcedureRunner
{
    class Program
    {
        public static void Main(string[] args)
        {
            var storedProcs = new List<string>();
            storedProcs.Add("dbo.Get_Result_Set1");
            storedProcs.Add("dbo.Get_Result_Set2");
            storedProcs.Add("dbo.Get_Result_Set3");
            storedProcs.Add("dbo.Get_Result_Set4");
            storedProcs.Add("dbo.Get_Result_Set5");
            storedProcs.Add("dbo.Get_Result_Set6");
            storedProcs.Add("dbo.Get_Result_Set7");

            // Run each procedure and write its result set to its own CSV file
            foreach (var storedProc in storedProcs)
            {
                var table = GetDataTable(storedProc);
                WriteDataTableToCSV(storedProc + ".csv", table);
            }
        }

        public static DataTable GetDataTable(string storedProc)
        {
            DataTable table = new DataTable();
            using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["ConStrg"].ConnectionString))
            using (var command = new SqlCommand(storedProc, connection))
            using (var adapter = new SqlDataAdapter(command))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.CommandTimeout = 0; // no timeout: the procedures run for hours
                adapter.Fill(table);
            }
            return table;
        }

        public static void WriteDataTableToCSV(string filename, DataTable table)
        {
            StringBuilder sb = new StringBuilder();
            // Header row from the column names
            var columnNames = table.Columns.Cast<DataColumn>().Select(col => col.ColumnName);
            sb.AppendLine(string.Join(",", columnNames));
            // Data rows (note: fields are not quoted/escaped, so embedded commas would break the CSV)
            foreach (DataRow row in table.Rows)
            {
                var fields = row.ItemArray.Select(field => field.ToString());
                sb.AppendLine(string.Join(",", fields));
            }
            File.WriteAllText(filename, sb.ToString());
        }
    }
}
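If the result sets can contain commas, quotes, or line breaks, the fields need quoting before they are joined. A minimal escaping helper that could replace the plain field.ToString() calls above (a sketch, not part of the original program):
// Sketch: wrap a value in quotes and double any embedded quotes whenever the
// value contains a comma, quote, or newline, so the CSV stays parseable.
public static string EscapeCsvField(object value)
{
    string text = value == null ? string.Empty : value.ToString();
    if (text.Contains(",") || text.Contains("\"") || text.Contains("\n") || text.Contains("\r"))
    {
        return "\"" + text.Replace("\"", "\"\"") + "\"";
    }
    return text;
}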

connection string without server name

I am developing a C# application with a SQL Server Express database that will run on a local network. I want to build a setup for my project with InstallAware.
I want to know how to set the connection string for clients when I don't know the server name; in other words, I want to connect to the database knowing only the instance name.
ConnectionString = @"Data Source=ServerName\InstanceName;Initial Catalog=Accounting;Persist Security Info=True;User ID=sa;Password=password";
public static string GetServerName()
{
    // Uses SqlDataSourceEnumerator (System.Data.Sql) to list the SQL Server instances visible on the network:
    // https://msdn.microsoft.com/en-us/library/a6t1z9x2%28v=vs.110%29.aspx?f=255&MSPPError=-2147217396
    DataTable dt = SqlDataSourceEnumerator.Instance.GetDataSources();
    DataRow[] dr = dt.Select("InstanceName='myInstanceName'");
    if (dr.Length == 0)
        return null;
    return dr[0]["ServerName"].ToString();
}
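Putting it together, the client can build its connection string from the discovered server name at startup. A brief sketch, reusing the placeholder instance, catalog, and credentials from the question:
// Sketch: discover the server hosting the known instance, then build the connection string.
public static string BuildConnectionString()
{
    string serverName = GetServerName();
    if (serverName == null)
        return null; // instance not found on the network

    return string.Format(
        @"Data Source={0}\myInstanceName;Initial Catalog=Accounting;Persist Security Info=True;User ID=sa;Password=password",
        serverName);
}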

SQL Server restore stuck in "restoring" state

If I restore from SQL Server directly there is no problem, but if I do it through my application, the database gets stuck in the "restoring" state.
I found some advice saying to set NoRecovery = false, but this didn't change anything.
If I remove the "with move" option, it works: after the restore the database is in a normal state.
The thing that I would like to understand is: does "with move" modify a SQL Server table?
Because if I launch the restore the first time without "with move", it says that it could not find the specified path. But if I launch the restore with this option, and then a second time without it, it works. So there must be some table that SQL Server uses to map the logical name to a physical path; how can I modify this table?
Here is the code:
SqlConnection sqlConnection = new SqlConnection(string.Format("Data Source={0};Initial Catalog={1};Integrated Security=True", database.SqlServerId, database.Name));
ServerConnection connection = new ServerConnection(sqlConnection);
Server sqlServer = new Server(connection);

Restore rstDatabase = new Restore();
rstDatabase.Action = RestoreActionType.Database;
rstDatabase.Database = backupFile.Name;
BackupDeviceItem bkpDevice = new BackupDeviceItem(backupFile.FileName, DeviceType.File);
rstDatabase.Devices.Add(bkpDevice);
rstDatabase.ReplaceDatabase = true;
rstDatabase.NoRecovery = false;

// Read the logical data/log file names from the backup
string dbLogicalName = "";
string logLogicalName = "";
sqlConnection.Open();
SqlCommand command = new SqlCommand(string.Format("RESTORE FILELISTONLY FROM DISK = '{0}'", backupFile.FileName), sqlConnection);
SqlDataReader reader = command.ExecuteReader();
if (reader.HasRows)
{
    while (reader.Read())
    {
        if (reader.GetString(2) == "D")
            dbLogicalName = reader.GetString(0);
        if (reader.GetString(2) == "L")
            logLogicalName = reader.GetString(0);
    }
}
reader.Close();

// Relocate the data and log files to the paths chosen by the application
rstDatabase.RelocateFiles.Add(new RelocateFile(dbLogicalName, backupFile.DatabaseFile));
rstDatabase.RelocateFiles.Add(new RelocateFile(logLogicalName, backupFile.LogsFile));

//Restore
rstDatabase.SqlRestore(sqlServer);
rstDatabase.Devices.Remove(bkpDevice);
sqlConnection.Close();
connection.Disconnect();
The thing that I would like to understand is: does "with move" modify a sql server table?
Not necessarily, unless you modify the default database paths for the instance, which you can do in SSMS following these steps:
https://learn.microsoft.com/en-us/sql/database-engine/configure-windows/view-or-change-the-default-locations-for-data-and-log-files
But as someone said, try to fix that in your own application. Do not attempt to modify the system databases directly; it could bring unexpected results.
I think you know this, but WITH MOVE tells SQL Server to relocate the restored database files to a different path than the default one.
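As for where SQL Server keeps the logical-name-to-physical-path mapping: it is exposed through the sys.master_files catalog view, which should be treated as read-only. A small sketch for inspecting it from the application (assuming an open SqlConnection like the one in the code above):
// Sketch: list each logical file name and its physical path for a database
// by reading the sys.master_files catalog view (query it, never modify it).
SqlCommand fileListCommand = new SqlCommand(
    "SELECT name, physical_name FROM sys.master_files WHERE database_id = DB_ID(@dbName)",
    sqlConnection);
fileListCommand.Parameters.AddWithValue("@dbName", database.Name);
using (SqlDataReader fileReader = fileListCommand.ExecuteReader())
{
    while (fileReader.Read())
    {
        Console.WriteLine("{0} -> {1}", fileReader.GetString(0), fileReader.GetString(1));
    }
}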
