Method to merge SPFile into an existing SPFile

My problem is that I will have 1..n files to merge into a single file. I have a target SPFile on the SharePoint server. I have written a method based on web articles about OpenXML; it runs without errors, but the resulting document is blank when I review it.
Here is the method:
private void InsertSPFileInto(SPFile target, SPFile source, int index)
{
    Stream targetStream = target.OpenBinaryStream();
    using (WordprocessingDocument myDoc = WordprocessingDocument.Open(targetStream, true))
    {
        string altChunkId = "AltChunkId" + index.ToString();
        MainDocumentPart mainPart = myDoc.MainDocumentPart;
        AlternativeFormatImportPart chunk =
            mainPart.AddAlternativeFormatImportPart(
                AlternativeFormatImportPartType.WordprocessingML,
                altChunkId);
        Stream sourceStream = source.OpenBinaryStream();
        chunk.FeedData(sourceStream);
        AltChunk altChunk = new AltChunk();
        altChunk.Id = altChunkId;
        mainPart.Document
            .Body
            .InsertAfter(altChunk, mainPart.Document.Body.LastChild);
        mainPart.Document.Save();
    }
}
Again, it just returns a blank document, but it doesn't corrupt it either.
Thanks.
Tim Daniels

After playing around with it, I found the problem. I need to add the following three lines to the method after the call to mainPart.Document.Save():
myDoc.Close();
targetStream.Flush();
target.SaveBinary(targetStream);
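
Putting it all together, here is a sketch of the corrected method (based on the fix above; the WordprocessingDocument has to be closed before SaveBinary so the stream holds the finished package):

private void InsertSPFileInto(SPFile target, SPFile source, int index)
{
    Stream targetStream = target.OpenBinaryStream();
    using (WordprocessingDocument myDoc = WordprocessingDocument.Open(targetStream, true))
    {
        string altChunkId = "AltChunkId" + index.ToString();
        MainDocumentPart mainPart = myDoc.MainDocumentPart;

        // Import the source document as an alternative-format chunk.
        AlternativeFormatImportPart chunk =
            mainPart.AddAlternativeFormatImportPart(
                AlternativeFormatImportPartType.WordprocessingML, altChunkId);
        using (Stream sourceStream = source.OpenBinaryStream())
        {
            chunk.FeedData(sourceStream);
        }

        // Reference the chunk at the end of the target body.
        AltChunk altChunk = new AltChunk { Id = altChunkId };
        mainPart.Document.Body.InsertAfter(altChunk, mainPart.Document.Body.LastChild);
        mainPart.Document.Save();

        // Close the package so its contents are flushed to the stream,
        // then write the stream back into the SPFile.
        myDoc.Close();
        targetStream.Flush();
        target.SaveBinary(targetStream);
    }
}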

Related

Gmail API .NET: Get full message

How do I get the full message, and not just the metadata, using the Gmail API?
I have a service account and I am able to retrieve a message, but only in the metadata, raw, and minimal formats. How do I retrieve the message in the full format? The following code works fine:
var request = service.Users.Messages.Get(userId, messageId);
request.Format = UsersResource.MessagesResource.GetRequest.FormatEnum.Metadata;
Message message = request.Execute();
However, when I omit the format (and hence use the default, which is FULL), or change the format to UsersResource.MessagesResource.GetRequest.FormatEnum.Full,
I get the error: Metadata scope doesn't allow format FULL
I have included the following scopes:
https://www.googleapis.com/auth/gmail.readonly,
https://www.googleapis.com/auth/gmail.metadata,
https://www.googleapis.com/auth/gmail.modify,
https://mail.google.com/
How do I get the full message?
I had to remove the scope for the metadata to be able to get the full message format.
The user in that SO post had the same error.
Try this out first.
Go to https://security.google.com/settings/security/permissions
Choose the app you are working with.
Click Remove > OK
Next time, just request exactly the permissions you need; for example, only the read-only scope if reading is all you do, as in the sketch below.
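A minimal sketch of building the service with a single scope (the key file name, user, and application name are placeholders, and a service account with domain-wide delegation is assumed):

using Google.Apis.Auth.OAuth2;
using Google.Apis.Gmail.v1;
using Google.Apis.Services;

// Hypothetical key file and delegated user; replace with your own.
var credential = GoogleCredential.FromFile("service-account-key.json")
    .CreateScoped(GmailService.Scope.GmailReadonly)
    .CreateWithUser("user@yourdomain.com");

var service = new GmailService(new BaseClientService.Initializer
{
    HttpClientInitializer = credential,
    ApplicationName = "GmailFullMessageDemo" // hypothetical name
});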
Another thing: try using gmailMessage.payload.parts[0].body.data and, to decode it into readable text, do the following from the SO post:
import org.apache.commons.codec.binary.Base64;
import org.apache.commons.codec.binary.StringUtils;
System.out.println(StringUtils.newStringUtf8(Base64.decodeBase64(gmailMessage.payload.parts[0].body.data)));
You can also check this for further reference.
Try something like this:
public string GetMessage(string userId, string messageId)
{
    Message temp = service.Users.Messages.Get(userId, messageId).Execute();
    var parts = temp.Payload.Parts;
    string s = "";
    foreach (var part in parts)
    {
        byte[] data = FromBase64ForUrlString(part.Body.Data);
        s += Encoding.UTF8.GetString(data);
    }
    return s;
}
public static byte[] FromBase64ForUrlString(string base64ForUrlInput)
{
    int padChars = (base64ForUrlInput.Length % 4) == 0 ? 0 : (4 - (base64ForUrlInput.Length % 4));
    StringBuilder result = new StringBuilder(base64ForUrlInput, base64ForUrlInput.Length + padChars);
    result.Append(String.Empty.PadRight(padChars, '='));
    result.Replace('-', '+');
    result.Replace('_', '/');
    return Convert.FromBase64String(result.ToString());
}
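
Hypothetical usage of the helpers above; "me" addresses the authenticated user, and messageId would come from an earlier Users.Messages.List call:

string body = GetMessage("me", messageId); // messageId obtained elsewhere
Console.WriteLine(body);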

FAL insertion into sys_file TYPO3

I'm trying to insert a file into the TYPO3 database through the frontend, using core functions or the FileRepository; specifically, into the sys_file table.
While investigating I've seen a few solutions like:
$storageRepository = \TYPO3\CMS\Core\Utility\GeneralUtility::makeInstance('TYPO3\\CMS\\Core\\Resource\\StorageRepository');
$storage = $storageRepository->findByUid(1);
$fileObject = $storage->addFile('/tmp/myfile', $storage->getRootLevelFolder(), 'newFile');
echo $fileObject->getIdentifier(); // Should output "/newFile"
But I still can't find this addFile() in the StorageRepository class. Am I missing something here?
The line $storageRepository->findByUid(1) returns a ResourceStorage object, which has the addFile() method.
Here is the documentation for that class:
https://typo3.org/api/typo3cms/class_t_y_p_o3_1_1_c_m_s_1_1_core_1_1_resource_1_1_resource_storage.html
@mario Thanks. By the way, I've achieved what I planned. Here's what I did:
public function uploadFile($uploadedfile) {
    $storage = GeneralUtility::makeInstance('TYPO3\\CMS\\Core\\Resource\\StorageRepository');
    $filePath = 'uploads/tx_fileupload/' . $uploadedfile['updata']['name'];
    $title = $uploadedfile['updata']['name'];
    $size = $uploadedfile['updata']['size'];
    // Moving the physical file to the destination folder
    \TYPO3\CMS\Core\Utility\GeneralUtility::upload_copy_move($uploadedfile['updata']['tmp_name'], $filePath);
    // Adding a record in sys_file_storage
    $fileObject = $storage->createLocalStorage($uploadedfile['updata']['name'], $uploadedfile['updata']['tmp_name'], $filePath, '');
    // Inserting the file into sys_file
    $repositoryFileObject = \TYPO3\CMS\Core\Resource\ResourceFactory::getInstance()->retrieveFileOrFolderObject($filePath);
    return $repositoryFileObject;
}
Now moving on to adding the corresponding sys_file_reference record.

Bulk copy CSV files using NPGSQL 3.0.5 BinaryImporter

I have upgraded to Npgsql 3.0.5 and realized that NpgsqlCopyIn is no longer available. With the older version I could process a CSV file with NpgsqlCopyIn, which is really fast and efficient for bulk copying huge amounts of data. I used:
var copyStr = "COPY tablename (col1,col2,etc) FROM STDIN WITH DELIMITER ',' CSV HEADER";
NpgsqlCommand dbCommand = new NpgsqlCommand(copyStr, _DataStoreConnection);
NpgsqlCopyIn copyIn = new NpgsqlCopyIn(dbCommand, _DataStoreConnection, stream);
copyIn.Start();
But with the 3.0 version, I couldn't find a way to bulk copy by simply telling the binary importer that the data is CSV. Instead I use the code below:
StreamReader streamReader = null;
try
{
    streamReader = new StreamReader(fileStream);
    {
        var copyStr = string.Format("COPY {0} ({1}) FROM STDIN (FORMAT BINARY)", _DataStoreName, string.Join(",", _DataStoreColumns.Select(a => a.ToLower())));
        if (_DataStoreConnection.State == ConnectionState.Closed)
            _DataStoreConnection.Open();
        string csvLine = string.Empty;
        while ((csvLine = streamReader.ReadLine()) != null)
        {
            if (lineCount > 0)
            {
                using (var importWriter = _DataStoreConnection.BeginBinaryImport(copyStr))
                {
                    importWriter.WriteRow(csvLine.Split(','));
                }
            }
            else
            {
                lineCount++; // This is to skip the first line from the CSV file. First line will be header, so skip it.
            }
        }
    }
}
Is there a way I can tell the BinaryImporter that the input data is CSV, so that it takes care of the delimiter and inserts the data into the datastore as NpgsqlCopyIn did?
You can't use the binary importer to import CSV because, well, CSV isn't binary :)
The binary importer is a more efficient way to import your data; see the example in the Npgsql docs. If you absolutely have to import CSV, you can still do that via BeginTextImport, but you have to format the text yourself, i.e. put in the delimiters and so forth. See the docs for that as well.
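A minimal sketch of the BeginTextImport route (the table and column names are placeholders); passing FORMAT csv in the COPY command lets the PostgreSQL server do the CSV parsing, so the file's rows can be passed through as-is:

using (var conn = new NpgsqlConnection(connString))
{
    conn.Open();
    // The server parses each line as CSV, so no client-side splitting is needed.
    using (var writer = conn.BeginTextImport("COPY mytable (col1, col2) FROM STDIN (FORMAT csv)"))
    using (var reader = new StreamReader("data.csv"))
    {
        reader.ReadLine(); // skip the header row
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            writer.WriteLine(line);
        }
    }
}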

Performing bulk data transactions with Salesforce using .NET C#

I am new to Salesforce (3 months).
Thus far I have been able to create an application in C# that I can use to perform inserts and updates to the Salesforce database. These transactions are done one at a time.
Now I need to perform large-scale transactions, for example updating thousands of records at a time. Doing them one by one would quickly put us over our allotted API calls per 24-hour period.
I want to utilize the available bulk transaction process to cut down on the number of API calls. Thus far I have not had much luck coding this, nor have I found much documentation on it.
If anyone could either provide some generic examples or steer me to reliable documentation on the subject I would greatly appreciate it.
FYI, the data I need for the updates and inserts comes from an IBM Unidata database sitting on an AIX machine, so direct web-service communication is not really possible. Getting the data out of Unidata was my headache; I have that worked out. Now the bulk API to Salesforce is my new headache.
Thanks in advance.
Jeff
You don't mention which API you're currently using, but with the SOAP partner or enterprise APIs you can write records to Salesforce 200 at a time (the create/update/upsert calls all take an array of SObjects).
Using the bulk API you can send data in chunks of thousands of rows at a time.
You can find the documentation for both sets of APIs here
The answers already given are a good start; however, are you sure you need to write a custom app that uses the bulk API? The Salesforce Data Loader is a pretty robust tool, includes a command-line interface, and can use either the "normal" or the bulk data API. Unless you need to do fancy logic as part of your inserts/updates, or some sort of more real-time/on-demand loading, the Data Loader is going to be a better option than a custom app.
(This is the SOAP code though, not the Salesforce "Bulk API"; be careful not to confuse the two.)
Maybe the code below will provide clearer insight into how to do bulk insertion.
/// Demonstrates how to create one or more Account records via the API
public void CreateAccountSample()
{
    Account account1 = new Account();
    Account account2 = new Account();
    // Set some fields on the account1 object. Name field is not set
    // so this record should fail as it is a required field.
    account1.BillingCity = "Wichita";
    account1.BillingCountry = "US";
    account1.BillingState = "KA";
    account1.BillingStreet = "4322 Haystack Boulevard";
    account1.BillingPostalCode = "87901";
    // Set some fields on the account2 object
    account2.Name = "Golden Straw";
    account2.BillingCity = "Oakland";
    account2.BillingCountry = "US";
    account2.BillingState = "CA";
    account2.BillingStreet = "666 Raiders Boulevard";
    account2.BillingPostalCode = "97502";
    // Create an array of SObjects to hold the accounts
    sObject[] accounts = new sObject[2];
    // Add the accounts to the SObject array
    accounts[0] = account1;
    accounts[1] = account2;
    // Invoke the create() call
    try
    {
        SaveResult[] saveResults = binding.create(accounts);
        // Handle the results
        for (int i = 0; i < saveResults.Length; i++)
        {
            // Determine whether create() succeeded or had errors
            if (saveResults[i].success)
            {
                // No errors, so retrieve the Id created for this record
                Console.WriteLine("An Account was created with Id: {0}",
                    saveResults[i].id);
            }
            else
            {
                Console.WriteLine("Item {0} had an error updating", i);
                // Handle the errors
                foreach (Error error in saveResults[i].errors)
                {
                    Console.WriteLine("Error code is: {0}",
                        error.statusCode.ToString());
                    Console.WriteLine("Error message: {0}", error.message);
                }
            }
        }
    }
    catch (SoapException e)
    {
        Console.WriteLine(e.Code);
        Console.WriteLine(e.Message);
    }
}
Please find below a small piece of code which may help you insert data into Salesforce objects using C# and the WSDL APIs. I struggled quite a bit to write it in C#. After splitting, I assign fields using direct indexes; you can use your own approach.
I split the columns using | (the pipe sign). You may change this, and also the row and column separators (<br>, \n, etc.).
This means you can enter N rows from your HTML/text file. I wrote the program to add orders placed by my designers on another website; it fetches the data from the e-commerce site, which has no interface to Salesforce for adding/viewing the order records. I created one custom object for this and added the following columns to it.
Your suggestions are welcome.
private SforceService binding; // declare the Salesforce service using your access credentials
try
{
    string stroppid = "111111111111111111";
    System.Net.HttpWebRequest fr;
    Uri targetUri = new Uri("http://abc.xyz.com/test.html");
    fr = (System.Net.HttpWebRequest)System.Net.HttpWebRequest.Create(targetUri);
    var response = fr.GetResponse();
    if (response.ContentLength > 0)
    {
        System.IO.StreamReader str = new System.IO.StreamReader(response.GetResponseStream());
        string allrow = str.ReadToEnd();
        string stringSeparators = "<br>";
        string[] row1 = Regex.Split(allrow, stringSeparators);
        // Rows 1 .. row1.Length - 2 carry data, so size the array to match and
        // index it with i - 1 (sizing it row1.Length - 1 and indexing with i
        // would leave cord[0] null and make create() fail).
        CDI_Order_Data__c[] cord = new CDI_Order_Data__c[row1.Length - 2];
        for (int i = 1; i < row1.Length - 1; i++)
        {
            string colstr = row1[i].ToString();
            string[] allcols = Regex.Split(colstr, "\\|");
            var order = new CDI_Order_Data__c(); // Very important to create the object
            order.Opportunity_Job_Order__c = stroppid;
            order.jobid__c = stroppid;
            order.order__c = allcols[0].ToString();
            order.firstname__c = allcols[1].ToString();
            order.name__c = allcols[2].ToString();
            DateTime dtDate = Convert.ToDateTime(allcols[3]);
            order.Date__c = new DateTime(dtDate.Year, dtDate.Month, dtDate.Day, 0, 0, 0);
            order.clientpo__c = allcols[4].ToString();
            order.billaddr1__c = allcols[5].ToString();
            order.billaddr2__c = allcols[6].ToString();
            order.billcity__c = allcols[7].ToString();
            order.billstate__c = allcols[8].ToString();
            order.billzip__c = allcols[9].ToString();
            order.phone__c = allcols[10].ToString();
            order.fax__c = allcols[11].ToString();
            order.email__c = allcols[12].ToString();
            order.contact__c = allcols[13].ToString();
            order.lastname__c = allcols[15].ToString();
            order.Rep__c = allcols[16].ToString();
            order.sidemark__c = allcols[17].ToString();
            order.account__c = allcols[18].ToString();
            order.item__c = allcols[19].ToString();
            order.kmatid__c = allcols[20].ToString();
            order.qty__c = Convert.ToDouble(allcols[21]);
            order.Description__c = allcols[22].ToString();
            order.price__c = Convert.ToDouble(allcols[23]);
            order.installation__c = allcols[24].ToString();
            order.freight__c = allcols[25].ToString();
            order.discount__c = Convert.ToDouble(allcols[26]);
            order.salestax__c = Convert.ToDouble(allcols[27]);
            order.taxcode__c = allcols[28].ToString();
            cord[i - 1] = order;
        }
        try
        {
            SaveResult[] saveResults = binding.create(cord);
        }
        catch (Exception ce)
        {
            Response.Write("Bulk order update error: " + ce.Message.ToString());
            Response.End();
        }
        if (str != null) str.Close();
    }
}
catch (Exception ex)
{
    // Outer catch (the original snippet ended without one)
    Response.Write("Order import error: " + ex.Message);
}

Unzip from a SQL Server text column to an image column

I have images of various formats (.png, .jpg, .bmp, etc.) stored as compressed text in a text column in a SQL Server 2005 table. I need to read the row, unzip the image and store it in an image column in another table.
I am using the SharpZip library, and all of the examples deal with file sources and destinations. I can't find anything that covers unzipping from a variable to another variable. A code snippet illustrating this or a link to a relevant resource would be much appreciated.
EDIT: A bit more information - the data is stored in a TEXT column. It appears as follows (text column abbreviated for display):
ImageID ImageData
1 FORMAT-ZIPV3 UEsDBBQAAAAIAOV6wzxdTnDvshs...
2 FORMAT-ZIPV3 UEsDBBQAAAAIAAF2yjxGncjOLgA...
3 FORMAT-ZIPV3 UEsDBBQAAAAIAKd6yjyjnQNr6gg...
4 FORMAT-ZIPV3 UEsDBBQAAAAIALdNyzyrPC8EMJw...
5 FORMAT-ZIPV3 UEsDBBQAAAAIAA1rOD1nZY1t0f0...
6 FORMAT-ZIPV3 UEsDBBQAAAAIANZplj2seyJ+VmM...
7 FORMAT-ZIPV3 UEsDBBQAAAAIAC5vhD27LPbPcv8...
8 FORMAT-ZIPV3 UEsDBBQAAAAIAK1qKz5DJNH3xMg...
9 FORMAT-ZIPV3 UEsDBBQAAAAIAHVkEztC3th/9hs...
10 FORMAT-ZIPV3 UEsDBBQAAAAIAEtXKz7DXHUdvow...
What I know for certain is that the images were compressed at some point in the process using SharpZip before being inserted into the table. It appears that the format information was added to the beginning of the data prior to inserting.
Looking at this data, would anyone have any insight on how this image data has been manipulated? Again, I need to get the uncompressed image data into a column of a data type conducive to reading for display on a web page.
EDIT: Ok, I'm stumped. Executing the following code produces the error "Failed to convert parameter value from a Int32 to a Byte[]". It appears to be placing the length of the byte array into the parameter's value...
commandUncompressed.Connection = connectionUncompressed;
commandUncompressed.Parameters.Add("@Image_k", SqlDbType.VarChar, 10);
commandUncompressed.Parameters.Add("@ImageContents", SqlDbType.Image);
commandUncompressed.CommandText = sqlSaveImage;
connectionUncompressed.Open();
reader = command.ExecuteReader();
if (reader.HasRows)
{
    while (reader.Read())
    {
        Console.WriteLine(reader["Image_k"].ToString()); // Merely for testing
        String format = reader["ImageContents_Compressed"].ToString().Substring(0, 13); // include the trailing space
        var offset = 13; // "FORMAT-ZIPV3 ".Length;
        var s = reader["ImageContents_Compressed"].ToString().Substring(offset);
        var bytes = Convert.FromBase64String(s);
        if (format == "FORMAT-ZIPV2 ")
        {
            bytes = ConvertStringToBytes(s); // Not a Base64-encoded string? External conversion function utilized.
        }
        using (var zis = new ZipInputStream(new MemoryStream(bytes)))
        {
            ZipEntry zipEntry = zis.GetNextEntry(); // Doesn't seem to work unless an entry has been referenced
            byte[] buffer = new byte[zis.Length];
            commandUncompressed.Parameters["@Image_k"].Value = reader["Image_k"].ToString();
            commandUncompressed.Parameters["@ImageContents"].Value = zis.Read(buffer, 0, buffer.Length);
            commandUncompressed.ExecuteNonQuery();
        }
    }
}
It appears to be reading the data from the source text column just fine. I just cannot figure out how to get that into the image type parameter. The value for buffer variable shows the length of the byte array, rather than the actual bytes. Maybe that's what the value property typically shows for byte arrays? I'm so close and yet so far away. :/
EDIT: Ok, I'm a knucklehead. I made the following correction, and it works!
zis.Read(buffer, 0, buffer.Length);
commandUncompressed.Parameters["@ImageContents"].Value = buffer;
At this point I am only able to process FORMAT-ZIPV3 data, as I haven't figured out how to decode the FORMAT-ZIPV2 strings yet. Following is a sampling of the V2 data. If anyone is able to determine the encoding, let me know. Would it be different if zipped using BZIP instead of the ZIP format?
ImageID ImageData
1 FORMAT-ZIPV2 504B03041400020008005157422A2E25FDBAF26701008D6901000E...
2 FORMAT-ZIPV2 504B03041400020008009159422A7FC94BA2B2540500D35705000E...
3 FORMAT-ZIPV2 504B0304140002000800685A422A0CAA51F4473A0600B97206000E...
4 FORMAT-ZIPV2 504B03041400020008001D5D422A770BD3ED201902002C4A02000E...
5 FORMAT-ZIPV2 504B0304140002000800325E422A4B6C2FB4045001001C6E01000E...
6 FORMAT-ZIPV2 504B03041400020008006F72422A5F793AC1A1F00200ECF302000E...
7 FORMAT-ZIPV2 504B0304140002000800D572422A1B348A731DE5000085EB00000E...
8 FORMAT-ZIPV2 504B03041400020008003D73422A8AEBB7F855640300DD1B04000E...
9 FORMAT-ZIPV2 504B03041400020008006368D528C5D0A6BA794900004A2502000E...
10 FORMAT-ZIPV2 504B03041400020008008E5B6C2A2D9E9C33D7AF05005CEC05000E...
In response to a similar question, someone on sqlmonster.com provided a nifty VarBinaryStream class. It works with a column type of varbinary(max).
If your data is stored in a varbinary(max), and is in zip format, you could use that class to instantiate a VarBinaryStream, then instantiate a ZipInputStream around that, and ba-da-boom, you're there. Just read from the ZipInputStream.
In C# it might look like this:
using (var imageSrc = new VarBinarySource(connection,
                                          "Table.Name",
                                          "Column",
                                          "KeyColName",
                                          1))
{
    using (var s = new VarBinaryStream(imageSrc))
    {
        using (var zis = new ZipInputStream(s))
        {
            ....
        }
    }
}
If the images are small, then you probably wouldn't want all this streaming stuff. If the column is a binary(n) or a varbinary(n) where n is less than 8000, just use the SqlBinary type and read in all the data into memory, then instantiate a MemoryStream around that. Simpler. In VB.NET it looks something like this:
Dim bytes As Byte()
bytes = dr.GetSqlBinary(columnNumber).Value
Using ms As New MemoryStream(bytes)
    Using zis As New ZipInputStream(ms)
        ...
    End Using
End Using
Finally, I'm going to question the wisdom of applying zip compression to .jpg images, and similar. The jpg format is already compressed; compressing it again before putting the data into SQL Server won't cause the data to become appreciably smaller. It only increases processing time. If possible, I'd suggest you reconsider your design for storage of compressed images.
Ok, with the update you provided containing the data format, you can draw some conclusions.
The data is an actual string. Suspecting that it was a Base64-encoded string, I did a small test and used Convert.ToBase64String() on a byte stream that contains a zip file. It looks like this: UEsDBBQAAAAIAJJyYyk3M56F+QIAA...
Aha! You have a Base64-encoded (string) version of the byte data for a bona fide zip file. To decode it, strip the prefix, then use Convert.FromBase64String() to get the byte array, wrap it in a MemoryStream, and read it with ZipInputStream.
Something like this:
var offset = "FORMAT-ZIPV3 ".Length; // Length is a property, not a method
var s = sqlReader["CompressedImage"].ToString().Substring(offset);
var bytes = Convert.FromBase64String(s);
using (var zis = new ZipInputStream(new MemoryStream(bytes)))
{
    ...
    zis.Read(...);
    ...
}
If the data is really long, you're going to want to stream it out of that table rather than read it into a big string and convert it. I don't know how large text columns can be, but supposing one could be 500 MB, you don't want a 500 MB string, and you don't want to do a conversion of a 500 MB string with Convert.FromBase64String(). In that case you need to stream the decoding, for example with the FromBase64Transform class in the System.Security.Cryptography namespace.
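A minimal sketch of that streaming approach, assuming the encoded text is already exposed as a Stream (for instance via the VarBinaryStream class mentioned earlier) and that the FORMAT-ZIPV3 prefix has already been consumed; GetEncodedStreamFromDatabase is a hypothetical helper:

using System.IO;
using System.Security.Cryptography;
using ICSharpCode.SharpZipLib.Zip;

using (Stream base64Stream = GetEncodedStreamFromDatabase()) // hypothetical helper
using (var decoded = new CryptoStream(base64Stream,
                                      new FromBase64Transform(),
                                      CryptoStreamMode.Read))
using (var zis = new ZipInputStream(decoded))
{
    ZipEntry entry = zis.GetNextEntry(); // position on the first entry
    var buffer = new byte[8192];
    int n;
    while ((n = zis.Read(buffer, 0, buffer.Length)) > 0)
    {
        // write each decompressed chunk wherever it needs to go
    }
}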
Editorial comment. It is sort of backwards to zip-compress image data. The images are probably compressed already. But to compound that backwardsness by then doing a base64 encode, thereby expanding the data... ??? That is triple backwards. That makes noooooo sense at all. I understand that's how your vendor supplied it.
Ok, with your further update, using this as the format:
ImageID ImageData
1 FORMAT-ZIPV2 504B03041400020008005157422A2E25FDBAF26701008D6901000E...
2 FORMAT-ZIPV2 504B03041400020008009159422A7FC94BA2B2540500D35705000E...
That data is still zip-file data, but it is encoded as simple hex digits. You need to convert that to a byte array. Here's some code to do it:
public static class ConvertEx
{
    static readonly String prefix = "FORMAT-ZIPV2 ";

    public static string ToHexString(byte[] b)
    {
        System.Text.StringBuilder sb1 = new System.Text.StringBuilder();
        int i = 0;
        for (i = 0; i < b.Length; i++)
        {
            sb1.Append(System.String.Format("{0:X2}", b[i]));
        }
        return sb1.ToString().ToLower();
    }

    public static byte[] ToByteArray(string s)
    {
        if (s.StartsWith(prefix))
        {
            System.Console.WriteLine("removing prefix");
            s = s.Substring(prefix.Length);
        }
        s = s.Trim(); // whitespace
        System.Console.WriteLine("length: {0}", s.Length);
        var r = new byte[s.Length / 2];
        for (int i = 0; i < s.Length; i += 2)
        {
            r[i / 2] = (byte)Convert.ToUInt32(s.Substring(i, 2), 16);
        }
        return r;
    }
}
You can use that this way:
string s = GetStringContentFromDatabase();
var decoded = ConvertEx.ToByteArray(s);
using (var ms = new MemoryStream(decoded))
{
    // use DotNetZip to read the zip file
    // SharpZipLib is something similar...
    using (var zip = ZipFile.Read(ms))
    {
        // print out the list of entries in the zipfile
        foreach (var e in zip)
        {
            System.Console.WriteLine("{0}", e.FileName);
        }
    }
}
The examples on the SharpZip Wiki use Stream objects - while the sample does use a File, you could easily use a MemoryStream object here and the sample would work the same.
