C# Filestream to SQL Server database

I want to create a file in SQL Server from a string, but I can't figure out how to put it into the database. From what I've read, it seems to have something to do with FileStream. If so, once the stream is created, how do I save it to my DB as a file?
FileStream fs1 = new FileStream("somefilename", FileMode.Create, FileAccess.Write);
StreamWriter writer = new StreamWriter(fs1);
writer.WriteLine("file content line 1");
writer.Close();
What I am trying to achieve is to create a file from a string. I believe my DB is already set up for files, since we have a SaveFile method that works:
HttpPostedFile file = uploadedFiles[i];
if (file.ContentLength < 30000000)
{
    //DOFileUpload File = CurrentBRJob.SaveFile(CurrentSessionContext.Owner.ContactID, Job.JobID, fileNew.PostedFile);
    DOFileUpload File = CurrentBRJob.SaveFile(CurrentSessionContext.Owner.ContactID, Job.JobID, file, file.ContentLength, CurrentSessionContext.CurrentContact.ContactID);
    DOJobFile jf = CurrentBRJob.CreateJobFile(CurrentSessionContext.Owner.ContactID, Job.JobID, File.FileID);
    CurrentBRJob.SaveJobFile(jf);
}
What I want to do is this: instead of the user selecting a file for us to save to the DB, I want to create that file internally from strings and then save it to the DB.

Create a column of one of the types below, then use an ADO.NET SqlCommand to write it to the database:
varbinary(max) - for binary data
nvarchar(max) - for Unicode text data (i.e. if the text involves Unicode characters)
varchar(max) - for non-Unicode text data
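For example, here is a minimal sketch of writing an in-memory string to a varbinary(max) column with a parameterized SqlCommand. The table name JobFiles, its columns, and the connection string are placeholders, not from the original code:
// Requires: using System.Data; using System.Data.SqlClient; using System.Text;
string connectionString = "...";   // placeholder connection string
string content = "file content line 1";
byte[] bytes = Encoding.UTF8.GetBytes(content);

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "INSERT INTO JobFiles (FileName, FileData) VALUES (@name, @data)", conn))
{
    cmd.Parameters.AddWithValue("@name", "GeneratedFile.txt");
    // Size -1 marks the parameter as varbinary(max)
    cmd.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = bytes;

    conn.Open();
    cmd.ExecuteNonQuery();
}
If you store the text in an nvarchar(max) or varchar(max) column instead, pass the string itself as the parameter value and skip the encoding step.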

Related

pyodbc - Image from Microsoft SQL Server writing in binary mode error

Hopefully I'm missing something obvious here...
Whenever I try to use pyodbc to convert and save images from a Microsoft SQL Server, I generate files that are not recognised as image files.
Python code:
cursor = conn.cursor()
q = "SELECT top 1 * from reactorA.dbo.ProcessedImageData"
records = cursor.execute(q).fetchall()
for row in records:
    data = row.AnalysedImage
    with open('test.bmp', 'wb') as f:
        f.write(data)
When I open the newly created image file, it's just repeating ÿ symbols.
When I view the record, the data seems to be in the correct bytes format:
b'\xff\xff\xff .......
Any help on fixing this will be greatly appreciated.

Save base64 encoded string image data in Image type column in SQL Server with vbscript

I have a column named IMGDATA of type Image in my MyImages table in SQL Server.
I have a string variable named MyImageStringBase64Encoded that contains a base64 encoded image. I'm trying to save this image to the database:
Set Command1 = Server.CreateObject ("ADODB.Command")
Command1.ActiveConnection = MyConnection_STRING
Command1.CommandText = "INSERT INTO MyImages (IMGNAME,IMGDATA) VALUES (?,?)"
Command1.Parameters(0) = "My Image Name"
Command1.Parameters(1) = MyImageStringBase64Encoded
Command1.CommandType = 1
Command1.Execute()
The code above saves corrupted image data in the DB. Maybe I have to save bytes instead?
These are possibly related, but I'm not sure how to apply them:
https://stackoverflow.com/a/24925145/934967
MS Access: Sending image as varbinary to stored procedure
I'm new to VBScript.

finding latest excel file from folder using ssis

I have a group of Excel files in a folder. The file names look like:
ABC 2014-09-13.xlsx
ABC 2014-09-14.xlsx
ABC 2014-09-15.xlsx
I need to get the data from the latest Excel file and load it into a table using an SSIS package.
This may not be the shortest answer, but it will help you.
Steps:
Create a Foreach Loop to pick up all the Excel files, and insert the file names into a table.
Create a variable and assign it the MAX() of the dates taken from those file names.
Add a 2nd Foreach Loop. Just like the 1st loop, pick up the Excel files one by one, compare each file name with the variable value, and load the one that matches. (A standalone sketch of this date comparison follows these steps.)
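For reference, here is a minimal standalone C# sketch of that date comparison. The folder path and the "ABC " file-name prefix are assumptions based on the names shown in the question:
// Requires: using System;
string folder = @"C:\ExcelFiles";   // placeholder folder path
string latestFile = null;
DateTime latestDate = DateTime.MinValue;

foreach (string path in System.IO.Directory.GetFiles(folder, "ABC *.xlsx"))
{
    // "ABC 2014-09-15.xlsx" -> "2014-09-15"
    string name = System.IO.Path.GetFileNameWithoutExtension(path);
    string dateText = name.Substring(name.IndexOf(' ') + 1);

    DateTime fileDate;
    if (DateTime.TryParseExact(dateText, "yyyy-MM-dd",
            System.Globalization.CultureInfo.InvariantCulture,
            System.Globalization.DateTimeStyles.None, out fileDate)
        && fileDate > latestDate)
    {
        latestDate = fileDate;
        latestFile = path;
    }
}
// latestFile now holds the full path of the newest workbook, or null if none matched.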
Although this is a duplicate question, I will put an answer here anyway with some changes and additional info.
You should already have created the destination table for the Excel import and added a Connection Manager to the package.
Create 2 variables: MainDir, which points to where the Excel files exist, and ExcelFile, to hold the full name of the latest file.
Add a Script Task to the package. Open it and, in the Script tab, set ReadOnlyVariables = User::MainDir and ReadWriteVariables = User::ExcelFile.
Press the Edit Script... button and, in the new window, paste this code into Main():
string fileMask = "*.xlsx";
string mostRecentFile = string.Empty;
string rootFolder = Dts.Variables["User::MainDir"].Value.ToString();

System.IO.DirectoryInfo directoryInfo = new System.IO.DirectoryInfo(rootFolder);
System.IO.FileInfo mostRecent = null;

// Get all .xlsx files in the folder and sort them by name;
// with names like "ABC 2014-09-15.xlsx" the last entry is the latest date.
System.IO.FileInfo[] legacyArray = directoryInfo.GetFiles(fileMask, System.IO.SearchOption.TopDirectoryOnly);
Array.Sort(legacyArray, (f2, f1) => f2.Name.CompareTo(f1.Name));
if (legacyArray.Length > 0)
{
    mostRecent = legacyArray[legacyArray.Length - 1];
}
if (mostRecent != null)
{
    mostRecentFile = mostRecent.FullName;
}

Dts.Variables["User::ExcelFile"].Value = mostRecentFile;
Dts.TaskResult = (int)ScriptResults.Success;
Create an Excel Connection Manager and, in Edit mode, point the Excel file path to one of the workbooks, choose the Excel version, and, if needed, keep First row has column names checked.
In the properties of the Excel Connection Manager, find Expressions and add the property ExcelFilePath with the value #[User::ExcelFile].
Add a Data Flow Task and connect it to the Script Task.
Add an Excel Source to the Data Flow Task. Open its editor, select the Excel Connection Manager you created before, change Data access mode to SQL command and add this line (making sure the worksheet in the Excel files is named Sheet1): SELECT * FROM [Sheet1$]. Also check that all necessary columns are selected in the Columns tab.
The last component is an OLE DB Destination, which you must connect to the Excel Source component. Add its connection manager, then select the destination table and the column mappings for the table you want to insert into.
That's all you need to do to insert the Excel data...

How can I import a PostgreSQL .pgc file into a Postgres DB

I am expecting a data set to be supplied for data migration into a new system. The legacy vendor has supplied me with a .pgc file.
What is this? Is this a data file? Google tells me it's an embedded SQL program.
How can I import this to my local Postgres DB to get at the data set?
The output of the command file filename.pgc is:
file energyresourcingprod.pgc
energyresourcingprod.pgc: PostgreSQL custom database dump - v1.12-0
The first few lines, viewed in a text editor, are:
PGDMPrenergyresourcingprod9.2.49.2.4∑±00ENCODINGENCODINGSET client_encoding = 'UTF8';
false≤00
STDSTRINGS
STDSTRINGS)SET standard_conforming_strings = 'off';
false≥126214581287energyresourcingprodDATABASErCREATE DATABASE energyresourcingprod WITH TEMPLATE = template0 ENCODING = 'UTF8' LC_COLLATE = 'C' LC_CTYPE = 'C';
$DROP DATABASE energyresourcingprod;
carerixfalse26152200publicSCHEMACREATE SCHEMA public;
DROP SCHEMA public;
The file is 300 MB, and the majority of it contains what looks like hashed/base64 content:
ßû+˜)™yä⁄%(»j9≤\§^¸S∏Cîó|%ëflsfi∆†p1ñºúíñ Í∆î≈3õµ=qn
Mµ¢©]Q,uÆ<*Å™ííP’ÍOõ…∫U1Eu͡ IîfiärJ¥›•$ø...
...
Many Thanks
It's just a plain PostgreSQL dump.
Use pg_restore to load it into a database.
It's weird that they used that filename, but ultimately insignificant.
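For example, assuming the target database already exists (the database name here is a placeholder), something like pg_restore -d targetdb energyresourcingprod.pgc should load it. If you instead want pg_restore to create the database from the dump's own CREATE DATABASE statement, pg_restore -C -d postgres energyresourcingprod.pgc should do it.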

C# 20,000 records selected from oracle need to be inserted into SQL2005

I have a console app in C# that extracts 20 fields from an Oracle DB with the code below, and I want an efficient way to insert them into SQL 2005.
I obviously don't want to insert each one of the 20,000 records inside the while loop. I was thinking of changing the code to use a DataSet to cache all the records and then do a bulk insert...
Thoughts?
Pseudo code would be nice since I am new to Oracle.
This is my code where I was testing getting a connection to Oracle and seeing if I could view the data... now that I can view it, I want to get it out and into SQL 2005... what do I do from here?
static void getData()
{
    string connectionString = GetConnectionString();
    using (OracleConnection connection = new OracleConnection())
    {
        connection.ConnectionString = connectionString;
        connection.Open();

        OracleCommand command = connection.CreateCommand();
        string sql = "SELECT * FROM BUG";
        command.CommandText = sql;

        OracleDataReader reader = command.ExecuteReader();
        while (reader.Read())
        {
            //string myField = (string)reader["Project"];
            string myField = reader[0].ToString();
            Console.WriteLine(myField);
        }
    }
}
You can create a CSV file and then use BULK INSERT to insert the file into SQL Server. Have a look here for an example.
The "bulk" insert with the cached Dataset will work exactly like the while loop you are not wanting to write! The problem is that you'll lose control of the process if you try to use the "bulk" insert of the Dataset class. It is extraneous work in the end.
Maybe the best solution is to use a DataWriter so that you have complete control and no Dataset overhead.
You can actually do 100-1000 inserts per SQL batch. Just generate multiple INSERT statements, then submit the batch. Pre-generate the next batch from the Oracle SELECT while the first one executes.
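As another option not spelled out above, here is a minimal sketch using SqlBulkCopy to stream rows straight from the OracleDataReader into SQL Server, with no DataSet caching. The destination table name and the SQL Server connection string are placeholders:
// Requires: using System.Data.SqlClient; plus the Oracle provider namespace already used above.
string sqlConnectionString = "...";   // placeholder SQL Server connection string

using (OracleConnection source = new OracleConnection(GetConnectionString()))
using (SqlConnection destination = new SqlConnection(sqlConnectionString))
{
    source.Open();
    destination.Open();

    OracleCommand command = source.CreateCommand();
    command.CommandText = "SELECT * FROM BUG";

    using (OracleDataReader reader = command.ExecuteReader())
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destination))
    {
        bulkCopy.DestinationTableName = "dbo.Bug";   // placeholder destination table
        bulkCopy.BatchSize = 1000;                   // rows sent per round trip
        bulkCopy.WriteToServer(reader);              // streams rows as they are read from Oracle
    }
}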
