SQL Server: converting XML to varbinary and parsing it in .NET (C#)

Consider the following code:
[Test]
public void StackOverflowQuestionTest()
{
    const string connectionString = "enter your connection string if you wanna test this code";
    byte[] result = null;
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var sqlCommand = new SqlCommand("declare @xml as xml = '<xml/>' SELECT convert(varbinary(max), @xml) as value"))
        //using (var sqlCommand = new SqlCommand("SELECT convert(varbinary(max), N'<xml/>') as value"))
        {
            sqlCommand.Connection = connection;
            using (SqlDataReader reader = sqlCommand.ExecuteReader())
            {
                while (reader.Read())
                {
                    result = (byte[])reader["value"];
                }
                reader.Close();
            }
        }
    }
    string decodedString = new UnicodeEncoding(false, true).GetString(result);
    var document = XElement.Parse(decodedString);
}
If I run this test, I get an XmlException with the message: "Data at the root level is invalid. Line 1, position 1." As it turns out, the problem is the 0xFFFE preamble (byte order mark), which the XML parser treats as an invalid character.
Note that if I use the commented-out query instead, everything works fine, which seems strange to me. It looks like SQL Server stores XML strings in UCS-2 with a BOM, while it stores nvarchar values without one.
The main question is: how can I decode this byte array to a string that does not contain this preamble (BOM)?

In case anyone needs this in the future, the following code works:
using (var ms = new MemoryStream(result))
{
    // detectEncodingFromByteOrderMarks: true makes the reader consume the BOM
    // instead of returning it as part of the decoded string.
    using (var sr = new StreamReader(ms, Encoding.Unicode, true))
    {
        decodedString = sr.ReadToEnd();
    }
}
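Alternatively, a minimal sketch that strips the preamble by hand before decoding, assuming result holds the varbinary bytes fetched by the test above:
// Encoding.Unicode.GetPreamble() returns the UTF-16 LE BOM: { 0xFF, 0xFE }.
byte[] bom = Encoding.Unicode.GetPreamble();
int offset = (result.Length >= 2 && result[0] == bom[0] && result[1] == bom[1]) ? 2 : 0;
string decoded = Encoding.Unicode.GetString(result, offset, result.Length - offset);
var document = XElement.Parse(decoded);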

Related

Exporting all tables in SQL Server to .txt using C#

I have an application that tries to extract all the data from the different tables in one database. First, I stored all the queries in a .txt file, then read the file to retrieve the table names and stored them in a List.
[Here's my .txt file (screenshot omitted)]
string script = File.ReadAllText(@"D:\Schooldb\School.txt");
List<string> strings = new List<string>();
strings.Add(script);
using (SqlConnection connection = new SqlConnection(constring))
{
    foreach (string x in strings)
    {
        using (SqlCommand cmd = new SqlCommand(x, connection))
        {
            using (SqlDataAdapter adapter = new SqlDataAdapter())
            {
                cmd.Connection = connection;
                adapter.SelectCommand = cmd;
                using (DataTable dt = new DataTable())
                {
                    adapter.Fill(dt);
                    string txt = string.Empty;
                    foreach (DataColumn column in dt.Columns)
                    {
                        // Add the header row for the text file.
                        txt += column.ColumnName + "\t\t";
                    }
                    // Add a new line after the column names.
                    txt += "\r\n";
                    foreach (DataRow row in dt.Rows)
                    {
                        foreach (DataColumn column in dt.Columns)
                        {
                            // Add the data rows.
                            txt += row[column.ColumnName].ToString() + "***";
                        }
                        // Add a new line.
                        txt += "\r\n";
                    }
                    int y = 0;
                    StreamWriter file = new StreamWriter($@"D:\SchoolOutput\{x}_{DateTime.Now.ToString("yyyyMMdd")}.txt");
                    file.WriteLine(txt.ToString());
                    file.Close();
                    y++;
                }
            }
        }
    }
}
Expected:
teachers_datetoday
students_datetoday
subjects_datetoday
But in reality my output is just:
datetoday txt
Can someone tell me which part I got wrong?
Thanks in advance!
There are other approaches for extracting data directly using SSMS.
In this case, your code reads the entire file as a single string, so the foreach loop runs only once and the whole script ends up in the file name.
Instead of reading the entire file as one string, treat each line as one command and read the commands like the following:
foreach (string line in System.IO.File.ReadLines(@"D:\Schooldb\School.txt"))
{
    // Each line contains one command
    // Write your logic here
}
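For completeness, a minimal sketch of that approach, assuming School.txt holds one table name per line; constring and the txt-building logic come from the question, and the interpolated SELECT is only safe if the file's contents are trusted:
foreach (string table in System.IO.File.ReadLines(@"D:\Schooldb\School.txt"))
{
    using (var connection = new SqlConnection(constring))
    using (var cmd = new SqlCommand($"SELECT * FROM [{table}]", connection)) // each line is assumed to be a table name
    using (var adapter = new SqlDataAdapter(cmd))
    using (var dt = new DataTable())
    {
        adapter.Fill(dt);
        string txt = string.Empty;
        // ... build txt from dt exactly as in the question ...
        File.WriteAllText($@"D:\SchoolOutput\{table}_{DateTime.Now:yyyyMMdd}.txt", txt);
    }
}
Because the loop variable is now the table name, the output files come out as teachers_datetoday, students_datetoday, and so on, as expected.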

Unexpected end of JSON input: ASP.NET Core

When the select query
select * from Permissions where AppId='yZVwUoxKQCu' FOR JSON PATH,INCLUDE_NULL_VALUES
is executed directly in SQL Server, it returns the full data with the array wrapper and no unexpected end. When the same query is executed from C# using the method below, the result has an unexpected end.
The query is built like this:
var sql = $"select * from Permissions where AppId='{AppId}' FOR JSON PATH,INCLUDE_NULL_VALUES";
var res = Connection.ExecuteScalarCommand(sql);
public static String ExecuteScalarCommand(string sql)
{
    string CS = DbConnectionString;
    SqlConnection con = new SqlConnection(CS);
    string val = "";
    try
    {
        con.Open();
        SqlCommand cmd = new SqlCommand(sql, con);
        val = cmd.ExecuteScalar().ToString();
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message.ToString());
    }
    finally
    {
        con.Close();
    }
    return val;
}
The output given by SQL Server directly is shown below; this is the expected output:
[{"Id":49,"IsCreatable":0,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Season","UserId":60,"PermissionTitleId":1,"AppId":"yZVwUoxKQCu"},{"Id":50,"IsCreatable":0,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Categories","UserId":60,"PermissionTitleId":2,"AppId":"yZVwUoxKQCu"},{"Id":51,"IsCreatable":2,"IsViewable":2,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Tabs","UserId":60,"PermissionTitleId":3,"AppId":"yZVwUoxKQCu"},{"Id":52,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Fields","UserId":60,"PermissionTitleId":4,"AppId":"yZVwUoxKQCu"},{"Id":53,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"ContentBlock","UserId":60,"PermissionTitleId":5,"AppId":"yZVwUoxKQCu"},{"Id":54,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"User","UserId":60,"PermissionTitleId":6,"AppId":"yZVwUoxKQCu"},{"Id":55,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Roles","UserId":60,"PermissionTitleId":7,"AppId":"yZVwUoxKQCu"},{"Id":56,"IsCreatable":0,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"Season","UserId":60,"PermissionTitleId":1,"AppId":"yZVwUoxKQCu"},{"Id":57,"IsCreatable":0,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"Categories","UserId":60,"PermissionTitleId":2,"AppId":"yZVwUoxKQCu"},{"Id":58,"IsCreatable":2,"IsViewable":2,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"Tabs","UserId":60,"PermissionTitleId":3,"AppId":"yZVwUoxKQCu"},{"Id":59,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"Fields","UserId":60,"PermissionTitleId":4,"AppId":"yZVwUoxKQCu"},{"Id":60,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"ContentBlock","UserId":60,"PermissionTitleId":5,"AppId":"yZVwUoxKQCu"},{"Id":61,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"User","UserId":60,"PermissionTitleId":6,"AppId":"yZVwUoxKQCu"},{"Id":62,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"Roles","UserId":60,"PermissionTitleId":7,"AppId":"yZVwUoxKQCu"},{"Id":42,"IsCreatable":0,"IsViewable":0,"IsDeletable":1,"IsUpdatable":0,"RoleId":28,"Title":"Season","UserId":60,"PermissionTitleId":1,"AppId":"yZVwUoxKQCu"},{"Id":43,"IsCreatable":1,"IsViewable":0,"IsDeletable":1,"IsUpdatable":1,"RoleId":28,"Title":"Categories","UserId":60,"PermissionTitleId":2,"AppId":"yZVwUoxKQCu"},{"Id":44,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":28,"Title":"Tabs","UserId":60,"PermissionTitleId":3,"AppId":"yZVwUoxKQCu"},{"Id":45,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":28,"Title":"Fields","UserId":60,"PermissionTitleId":4,"AppId":"yZVwUoxKQCu"},{"Id":46,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":28,"Title":"ContentBlock","UserId":60,"PermissionTitleId":5,"AppId":"yZVwUoxKQCu"},{"Id":47,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":28,"Title":"User","UserId":60,"PermissionTitleId":6,"AppId":"yZVwUoxKQCu"},{"Id":48,"IsCreatable":0,"IsViewable":0,"IsDeletable":1,"IsUpdatable":0,"RoleId":28,"Title":"Roles","UserId":60,"PermissionTitleId":7,"AppId":"yZVwUoxKQCu"}]
but the output from ASP.NET Core is truncated:
[{"Id":49,"IsCreatable":0,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Season","UserId":60,"PermissionTitleId":1,"AppId":"yZVwUoxKQCu"},{"Id":50,"IsCreatable":0,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Categories","UserId":60,"PermissionTitleId":2,"AppId":"yZVwUoxKQCu"},{"Id":51,"IsCreatable":2,"IsViewable":2,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Tabs","UserId":60,"PermissionTitleId":3,"AppId":"yZVwUoxKQCu"},{"Id":52,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Fields","UserId":60,"PermissionTitleId":4,"AppId":"yZVwUoxKQCu"},{"Id":53,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"ContentBlock","UserId":60,"PermissionTitleId":5,"AppId":"yZVwUoxKQCu"},{"Id":54,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"User","UserId":60,"PermissionTitleId":6,"AppId":"yZVwUoxKQCu"},{"Id":55,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":29,"Title":"Roles","UserId":60,"PermissionTitleId":7,"AppId":"yZVwUoxKQCu"},{"Id":56,"IsCreatable":0,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"Season","UserId":60,"PermissionTitleId":1,"AppId":"yZVwUoxKQCu"},{"Id":57,"IsCreatable":0,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"Categories","UserId":60,"PermissionTitleId":2,"AppId":"yZVwUoxKQCu"},{"Id":58,"IsCreatable":2,"IsViewable":2,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"Tabs","UserId":60,"PermissionTitleId":3,"AppId":"yZVwUoxKQCu"},{"Id":59,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"Fields","UserId":60,"PermissionTitleId":4,"AppId":"yZVwUoxKQCu"},{"Id":60,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"ContentBlock","UserId":60,"PermissionTitleId":5,"AppId":"yZVwUoxKQCu"},{"Id":61,"IsCreatable":1,"IsViewable":1,"IsDeletable":1,"IsUpdatable":1,"RoleId":30,"Title":"User","UserId":60,"PermissionT
ExecuteScalar has a character limit. From the documentation, its return value is:
The first column of the first row in the result set, or a null reference (Nothing in Visual Basic) if the result set is empty. Returns a maximum of 2033 characters.
To overcome that, you will need to use a different method, such as ExecuteReader.
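A minimal sketch of such a method (the name is illustrative), reusing DbConnectionString from the question. FOR JSON returns long results as multiple chunks of roughly 2033 characters, one row per chunk, so the reader concatenates them:
public static string ExecuteJsonCommand(string sql)
{
    var json = new StringBuilder();
    using (var con = new SqlConnection(DbConnectionString))
    using (var cmd = new SqlCommand(sql, con))
    {
        con.Open();
        using (var reader = cmd.ExecuteReader())
        {
            // Each row holds one chunk of the JSON document; append them all.
            while (reader.Read())
            {
                json.Append(reader.GetString(0));
            }
        }
    }
    return json.ToString();
}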

SqlDataReader doesn't return rows even when the stored procedure does

I am trying to get data using a SqlDataReader. When I look at the result, I do see the expected number of columns, but the rows are missing; the debugger shows only blanks with "the enumeration yielded no results".
After spending a lot of time going through similar issues on the net, I couldn't resolve the problem. I then tried the same stored procedure with a DataAdapter, and all the rows were returned. I am not sure what is causing the issue. I tried rewriting my method in different ways, but to no avail.
public static SqlDataReader ExecuteDataReader(string connection, string storedProcedure)
{
    using (var sqlCon = new SqlConnection(connection))
    {
        // sqlCon.Open();
        var cmd = new SqlCommand(storedProcedure, sqlCon)
        {
            CommandType = CommandType.StoredProcedure
        };
        sqlCon.Open();
        using (var reader = cmd.ExecuteReader())
        {
            var count = reader.FieldCount;
            while (reader.Read())
            {
                for (var i = 0; i < count; i++)
                {
                    Console.WriteLine(reader.GetValue(i));
                }
            }
        }
        //reader.Close();
        return null; // for testing purposes
    }
}
-- stored procedure (kept simple)
CREATE PROC [dbo].GetTheData
AS
SELECT '1' AS one, '2' AS two, '3' AS three, '4' AS four, '5' AS five

Npgsql: inserting the bit datatype using BeginBinaryImport for bulk data insertion

I have been trying to implement a bulk insert operation for a Postgres database using Npgsql version 3.1.2, but I am facing one issue ('insufficient data left in message') caused by a datatype mismatch on the column paymentdone, which is bit(1) in the Postgres table. I have tried bool, char, and int on the C# side, but each gets the same error.
Code for the bulk data insertion:
public void BulkInsert(string connectionString, DataTable dataTable)
{
    using (var npgsqlConn = new NpgsqlConnection(connectionString))
    {
        npgsqlConn.Open();
        var commandFormat = string.Format(CultureInfo.InvariantCulture, "COPY {0} {1} FROM STDIN BINARY", "logging.testtable", "(firstName,LastName,LogDateTime,RowStatus,active,id,paymentdone)");
        using (var writer = npgsqlConn.BeginBinaryImport(commandFormat))
        {
            foreach (DataRow item in dataTable.Rows)
            {
                writer.WriteRow(item.ItemArray);
            }
        }
        npgsqlConn.Close();
    }
}
The DataTable setup:
private static void BulkInsert()
{
    DataTable table = new DataTable();
    table.Columns.Add("firstName", typeof(String));
    table.Columns.Add("LastName", typeof(String));
    table.Columns.Add("LogDateTime", typeof(DateTime));
    table.Columns.Add("RowStatus", typeof(int));
    table.Columns.Add("active", typeof(bool));
    table.Columns.Add("id", typeof(long));
    table.Columns.Add("paymentdone", typeof(bool));
    var dataRow = table.NewRow();
    dataRow[0] = "Test";
    dataRow[1] = "Temp";
    dataRow[2] = DateTime.Now;
    dataRow[3] = 1;
    dataRow[4] = true;
    dataRow[5] = 10;
    dataRow[6] = true;
    table.Rows.Add(dataRow);
    BulkInsert(ConfigurationManager.ConnectionStrings["StoreEntities"].ConnectionString, table);
}
This is probably happening because when Npgsql sees a boolean, its default is to send a PostgreSQL boolean, not a BIT(1). When using binary COPY, you must write exactly the types PostgreSQL expects.
One solution is probably to use a .NET BitArray instead of a boolean; Npgsql will infer PostgreSQL BIT() from that type and everything should work.
But a safer solution is simply to call StartRow() and then use the overload of Write() which accepts an NpgsqlDbType. This allows you to specify unambiguously which PostgreSQL type you want to send.
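A minimal sketch of that inside the BulkInsert loop above; the per-column NpgsqlDbType values are assumptions based on the DataTable definition in the question, and NpgsqlTypes must be imported:
using (var writer = npgsqlConn.BeginBinaryImport(commandFormat))
{
    foreach (DataRow item in dataTable.Rows)
    {
        writer.StartRow();
        writer.Write((string)item["firstName"], NpgsqlDbType.Text);
        writer.Write((string)item["LastName"], NpgsqlDbType.Text);
        writer.Write((DateTime)item["LogDateTime"], NpgsqlDbType.Timestamp);
        writer.Write((int)item["RowStatus"], NpgsqlDbType.Integer);
        writer.Write((bool)item["active"], NpgsqlDbType.Boolean);
        writer.Write((long)item["id"], NpgsqlDbType.Bigint);
        writer.Write((bool)item["paymentdone"], NpgsqlDbType.Bit); // sent as BIT(1), not boolean
    }
}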

SQL Server: better to batch statements or foreach?

Hypothetically, is it better to send N statements to Sql Server (2008), or is it better to send 1 command comprising N statements to Sql Server? In either case, I am running the same statement over a list of objects, and in both cases I would be using named parameters. Suppose my use case is dumping a cache of log items every few hours.
foreach example
var sql = "update blah blah blah where id = @id";
using (var conn = GetConnection())
{
    foreach (var obj in myList)
    {
        var cmd = new SqlCommand { CommandText = sql, Connection = conn };
        // add params from obj
        cmd.ExecuteNonQuery();
    }
}
batch example
var sql = @"
update blah blah blah where id = @id1
update blah blah blah where id = @id2
update blah blah blah where id = @id3
-etc";
using (var conn = GetConnection())
{
    var cmd = new SqlCommand { CommandText = sql, Connection = conn };
    for (int i = 0; i < myList.Count; i++)
    {
        // add params: "id" + i from myList[i]
    }
    cmd.ExecuteNonQuery();
}
In time tests, the batch version took 15% longer than the foreach version for large inputs. I figure the batch version takes longer to execute because the server has to parse a huge statement and bind up to 2000 parameters. Supposing Sql Server is on the LAN, is there any advantage to using the batch method?
Your tests would seem to have given you the answer; however, let me add another consideration. It is preferable to encapsulate the update into a separate function and call it from a foreach:
private void UpdateFoo(int id)
{
    const string sql = "Update Foo Where Id = @Id";
    using (var conn = GetConnection())
    {
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@Id", id);
            cmd.ExecuteNonQuery();
        }
    }
}
private void UpdateLotsOfFoo()
{
    foreach (var foo in myList)
    {
        UpdateFoo(foo.Id);
    }
}
In this setup you are leveraging connection pooling, which mitigates the cost of opening and closing connections.
@Thomas - this design can increase the overhead of opening and closing connections in a loop. That is not a preferred practice and should be avoided. The code below iterates over the statements while using one connection and will be easier on resources (both client- and server-side).
private void UpdateLotsOfFoo()
{
    const string sql = "Update Foo Where Id = @Id";
    using (var conn = GetConnection())
    {
        conn.Open();
        foreach (var foo in myList)
        {
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@Id", foo.Id);
                cmd.ExecuteNonQuery();
            }
        }
        conn.Close();
    }
}
