Inserting many converted leads through Apex Data Loader (Salesforce)

I have some leads in my Salesforce org that have already been converted to Accounts and Opportunities, and now I need to insert the same converted leads into another Salesforce org. How can this be done for bulk records?
I have the required permissions for converting leads, i.e. Edit and Create on Accounts, Opportunities and Contacts, along with the 'Convert Leads' permission.
I am looking for the detailed steps to accomplish this.
Thanks
Chirag

I got this sorted out.
In order to insert converted leads into Salesforce, you have to raise a case with Salesforce so that these fields are made editable: ConvertedAccountId, ConvertedContactId, ConvertedOpportunityId and ConvertedDate.
These are audit fields; once they are enabled you can load/insert converted leads into your environment.
Points to consider:
First, load the Account, Contact and Opportunity records that need to be associated with the converted leads.
Next, prepare a CSV file with the lead information, take the IDs from the previous step, and map them to the converted-lead fields mentioned above.
The IsConverted field should be set to true.
The ConvertedDate must be after the lead's CreatedDate, otherwise it will give an error [this is common sense ;)].
Also, these audit fields are automatically disabled after two weeks, so try to finish your work within that window.
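For illustration only, here is a minimal Apex sketch of what a single record looks like once the audit fields are editable; the three record IDs and the Status value are hypothetical placeholders, and in practice you would supply the real values per row in your Data Loader CSV.
// Minimal sketch, assuming the audit fields have been made editable by Salesforce Support.
// The Ids and the Status value below are hypothetical placeholders.
Lead convertedLead = new Lead(
    LastName               = 'Sample',
    Company                = 'Sample Co',
    Status                 = 'Closed - Converted',
    IsConverted            = true,
    ConvertedAccountId     = '001000000000001AAA',
    ConvertedContactId     = '003000000000001AAA',
    ConvertedOpportunityId = '006000000000001AAA',
    ConvertedDate          = Date.today()  // must be after the lead's CreatedDate
);
insert convertedLead;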
Thanks & Regards
Chirag

Export the data from Salesforce Org1, or use the Salesforce to Salesforce sharing feature to send data from one org to the other.
Write an Apex job to mass convert the leads to accounts (a sketch follows below).
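If the leads are loaded unconverted and then converted in the target org, a rough sketch of such a job (not the poster's actual code) could use Database.convertLead, for example:
// Sketch: mass-convert unconverted leads; the converted status label is org-specific.
LeadStatus convertedStatus = [SELECT MasterLabel FROM LeadStatus WHERE IsConverted = true LIMIT 1];
List<Database.LeadConvert> conversions = new List<Database.LeadConvert>();
for (Lead l : [SELECT Id FROM Lead WHERE IsConverted = false LIMIT 100]) {
    Database.LeadConvert lc = new Database.LeadConvert();
    lc.setLeadId(l.Id);
    lc.setConvertedStatus(convertedStatus.MasterLabel);
    conversions.add(lc);
}
// convertLead accepts up to 100 conversions per call; run from Batch Apex for larger volumes.
for (Database.LeadConvertResult res : Database.convertLead(conversions)) {
    System.debug(res.isSuccess() ? 'Converted lead ' + res.getLeadId() : String.valueOf(res.getErrors()));
}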

Related

Get audit history records of any entity record as per CRM view

I want to display all audit history data in the MS CRM format.
I have imported all records from the AuditBase table from CRM into a table on another database server.
I want to retrieve these table records with a SQL query in the Dynamics CRM Audit History format.
Here is what I have so far:
select
    AB.CreatedOn as [Created On],
    SUB.FullName as [Changed By],
    SM.Value as [Event],
    AB.AttributeMask as [Changed Field],
    AB.ChangeData as [Old Value],
    '' as [New Value]
from AuditBase AB
inner join StringMap SM on SM.AttributeValue = AB.Action and SM.AttributeName = 'action'
inner join SystemUserBase SUB on SUB.SystemUserId = AB.UserId
--inner join MetadataSchema.Attribute ar on AB.AttributeMask = ar.ColumnNumber
--inner join MetadataSchema.Entity en on ar.EntityId = en.EntityId and en.ObjectTypeCode = AB.ObjectTypeCode
--inner join Contact C on C.ContactId = AB.ObjectId
where AB.ObjectId = '00000000-0000-0000-0000-000000000000'
order by AB.CreatedOn desc
My problem is that AttributeMask is a comma-separated value that I need to compare with the ColumnNumber field of the MetadataSchema.Attribute table. And how do I get the new value for that entity?
I have already checked this link: SQL query to get data from audit history for opportunity entity, but it does not give me the [New Value].
NOTE: I cannot use "RetrieveRecordChangeHistoryResponse", because I need to show this data in an external web page from a SQL table (not the CRM database).
Well, basically Dynamics CRM does not create this audit view (the way you see it in CRM) using a SQL query, so if you succeed in doing it, Microsoft will probably buy it from you, as it would be much faster than the way it's currently handled :)
But really, the way it currently works, SQL is used only for obtaining all the relevant audit records (without any matching against attribute metadata or anything else); all the parsing and matching with metadata is then done in the .NET application. The logic is quite complex and there are so many different cases to handle that recreating it in SQL would require not just a simple "select" query but a really complex procedure (and even that might not be enough, because not everything in CRM is kept in the database; some things are simply compiled into the application libraries), plus weeks or maybe even months of work for one person. Of course, that's my opinion; maybe some T-SQL guru will prove me wrong.
So I would do it differently: use RetrieveRecordChangeHistoryRequest (which was already mentioned in some answers) to get all the audit details (already parsed and ready to use) from some kind of .NET application (probably running periodically, or maybe triggered by a plugin in CRM, etc.) and put them into a database in a user-friendly format. You can then consume this database with whatever external application you want.
Also I don't understand your comment:
I can not use "RetrieveRecordChangeHistoryResponse", because i need to
show these data in external webpage from sql table(Not CRM database)
What kind of application cannot call an external service (you can create a custom service; you don't have to use the CRM service) to get some data, but can access an external database? You should not read from the DB directly; a better approach would be to prepare a web service that returns the audit you want (using the CRM SDK under the hood) and to call this service from the external application. Unless, of course, your external app is only capable of reading databases, not calling custom web services...
It is not possible to reconstruct a complete audit history from the AuditBase tables alone. For the current values you still need the tables that are being audited.
The queries you would need to construct are complex, and writing them can be avoided if RetrieveRecordChangeHistoryRequest turns out to be a suitable option after all.
(See also How to get audit record details using FetchXML on SO.)
NOTE
This answer was submitted before the original question was extended stating that the RetrieveRecordChangeHistoryRequest cannot be used.
As I said in the comments, the audit table will have the old value and the new value, but not the current value. The current value is only pushed in as a new value when the next update happens.
In your OP query, AB.AttributeMask will return comma (",") separated values and AB.ChangeData will return tilde ("~") separated values.
I assume you are fine with "~"-separated values in the Old Value column and want to show the current field values in the New Value column. This is not going to work when multiple fields are enabled for audit. You have to split the AttributeMask value into CRM fields via the Attribute view, using ColumnNumber, to get the required result.
I would recommend the reference blog below to start with. Once you get the expected result, you can pull the current field value with an extra query, either in SQL or using C# in the front end. But you should concatenate the values with "~" again to maintain the format.
https://marcuscrast.wordpress.com/2012/01/14/dynamics-crm-2011-audit-report-in-ssrs/
Update:
From the above blog, you can tweak the stored procedure query with your fields, then convert the last select statement into a 'select into' to create a new table for your storage.
Modify the stored procedure to fetch the delta based on the last run. Configure a SQL job and schedule it to run every day or so to populate the table.
Then select and display the data the way you want. I did the same in Power BI in under 3 days.
Pros/cons: obviously this requirement is for reporting purposes. Generally, reporting requirements are met by mirroring the database through replication or other means, without interrupting production users and the async server by injecting plugins or ad hoc on-demand service calls. Moreover, you have access to the database and not to CRM Online. Better not to reinvent the wheel and to take the available solution forward. This is my humble opinion, based on a Microsoft internal project implementation.

DB2 row level access control: how to pass a user Id

In our web application we want to use DB2 row-level access control to control who can view what. Each table would contain a column named userId containing the user ID. We want logged-in users to be able to see only rows whose userId column matches their ID. I have seen DB2 permission examples using the DB2 session user; for example, taking DB2's banking example:
CREATE PERMISSION EXAMPLEBANKING.IN_TELLER_ROW_ACCESS
ON EXAMPLEBANKING.CUSTOMER FOR ROWS WHERE BRANCH in (
SELECT HOME_BRANCH FROM EXAMPLEBANKING.INTERNAL_INFO WHERE EMP_ID = SESSION_USER
)
ENFORCED FOR ALL ACCESS
ENABLE;
Our tables get updated dynamically, so we don't know which rows get added or deleted, and hence we don't know all the user IDs in the table.
At any given time, different users may log on to the web application to view information retrieved from the tables. The permission declaration above only takes SESSION_USER as input; can I change it to something like a Java method parameter, so that an arbitrary ID can be passed to the permission? If not, how do I handle different logged-in users at arbitrary times? Or do I just keep changing SESSION_USER dynamically as each new user logs in (using "db2 set"?)? If so, is this the best practice for this kind of use case?
Thanks in advance.
Since the user ID in question is application-provided and does not originate from the database, using SESSION_USER, which equals the DB2 authorization ID, would not be appropriate. Instead you might use the CLIENT_USERID variable, as described here.
This may become a little tricky if you use connection pooling in your application, as the variable must be set each time a connection is obtained from the pool and reset before it is returned to the pool.
Check out Trusted Contexts; this is exactly why they exist. The linked article is fairly old (you can use trusted contexts with PHP, Ruby, etc. now).

SQL Report Service Subscription with dynamic report paramater's values

Background
I have a report with 3 parameters: AccountId, FromDate and ToDate. The report is an invoice layout. The customer wants to view all members; we have 300 members, so the system will generate 300 reports in PDF or Excel format and send them to the customer.
Question
How do I set the member ID for this in a subscription? I cannot create 300 subscriptions manually, one by one :|
If anything is unclear, please comment below and I will correct it ASAP.
Updated:
The data-driven subscription that Manoj refers to requires the Enterprise or Developer edition of SQL Server Reporting Services.
If I don't have that, is there any workaround?
If you are stuck with the Standard edition of SQL Server, you'll need to create an SSIS package to generate the reports. I used the article below to set up a package that now creates and emails our invoices, order confirmations, and shipping acknowledgments. I like this article over the others I found because it's easy to add more reports to it as you go along without having to create a new package each time.
If you plan to use this for more than one report, I would change the parameter to be a PK so that you know you're always going to pass in one integer regardless of which report you're calling. Another change I made was to create one table for the report-generation piece and one for the email piece. In my case, I only want to send one email that may have multiple attachments, so that was the best way to do it. In the stored proc that builds this data, make sure you have some checks for whether the email address is valid:
update TABLE
set SendStatus = 'Invalid Email Address'
where email NOT LIKE '%_@__%.__%' -- valid email format
or patindex ('%[ &'',":;!=\/()<>]%', email) > 0 -- invalid characters
or patindex ('[@.-_]%', email) > 0 -- valid, but cannot be the starting character
or patindex ('%[@.-_]', email) > 0 -- valid, but cannot be the ending character
or email not like '%@%.%' -- must contain at least one @ and one .
or email like '%..%' -- cannot have two periods in a row
or email like '%@%@%' -- cannot have two @ anywhere
or email like '%.@%' or email like '%@.%' -- can't have @ and . next to each other
When I set this up, I had a lot of issues getting the default credentials to work. The SSRS service would crash every time, while the rest of the SQL services kept working. I ended up having our network guy create a new account with a static password just for this. If you do that, change the default credential line to this one:
rs.Credentials = new System.Net.NetworkCredential("acct", "p@ssword", "domain");
If you run into errors or issues, the SSRS logs and Google are your best friends. Or leave a comment below and I'll help.
Here's that article: http://technet.microsoft.com/en-us/library/ff793463(v=sql.105).aspx
Use this link for reference:
http://beyondrelational.com/modules/2/blogs/101/posts/13460/ssrs-60-steps-to-implement-a-data-driven-subscription.aspx
It walks through the steps to implement a data-driven subscription; create your SQL statement as shown there.
This will solve your problem.

Set up a relation between two existing Salesforce objects

I have a custom object in Salesforce for which I need to set up a master-detail relationship with Account: Account is the master and CompHist is the detail. The problem I am running into is that I need the relation to work off of custom fields within the objects. Example:
1.) Account has a custom field called CustomerId.
2.) CompHist also has a custom field called CustomerId.
3.) I need these to be linked together by the CustomerId field for report generation.
About 2,000 records are inserted into CompHist around the 8th of each month. This is done from a .NET application that kicks off at the scheduled time, collects info from our databases, and then uploads that data to Salesforce via the SOAP API.
Maybe I'm misunderstanding how Salesforce relationships work, as I am fairly new (a couple of months) to Salesforce development.
Thanks,
Randy
There is a way to get this to work without triggers that link the records, and without pre-querying Salesforce from .NET to learn the Account IDs before you push the CompHist records.
Setup
On Account: set the "External ID" checkbox on your CustomerId field. I'd recommend setting "Unique" too.
On CompHist: you'll need to decide whether it's acceptable to move records between Accounts, or whether, once the relation to an Account is set, it should stay that way forever. When you've made that decision, tick or untick "reparentable master-detail" in the definition of your lookup / master-detail to Account.
And if you have some ID on these detail records, something like a "line item number", consider making an external ID for them too. It might save your bacon some time in the future when an end user questions the report, or when you have to do some kind of "flush" and push all lines from .NET again (it will help you figure out what to insert and what to update).
At this point it's useful to think about how you are going to fill in the missing data (all the nulls) in the external ID field.
Actually establishing the relationship
If you have the external IDs set, it's pretty easy to tell Salesforce to figure out the linking for you. The operation is called upsert (a mix between update and insert) and it can be used in two flavours.
"Basic" upsert is for create-or-update: it means "dear Salesforce, please save this CompHist record with MyId=1234. I don't know what its Id is in your database and frankly I don't care, go figure it out, will ya?"
If there was no such record - 1 will be created.
If there was exactly 1 match - it will be updated.
If more than one match was found, SF won't know which one to update and will throw an error back at you (that's why marking the field as "unique" is a good idea; there's a chance you'll spot errors sooner).
"Advanced" upsert is for maintaining foreign keys and establishing lookups: "Dear SF, please hook this CompHist up to the Account which is marked as "ABZ123" in my DB. Did I mention I don't care about your Ids and I can't be bothered to query your database before uploading my stuff?"
Again, an exact match works as expected.
0 or 2+ Accounts with the same external ID value = error.
Code plz
I'd recommend you play with the Data Loader or a similar tool first to get a grasp of what exactly happens, how to map fields, and how not to get confused (these two flavours of upsert can be used at the same time). Once you manage to push the changes the way you want, you can modify your integration a bit.
SOAP API upsert: http://www.salesforce.com/us/developer/docs/api/Content/sforce_api_calls_upsert.htm (C# example at the bottom)
REST API: http://www.salesforce.com/us/developer/docs/api_rest/Content/dome_upsert.htm
If you'd prefer a Salesforce Apex example: Can I insert deserialized JSON SObjects from another Salesforce org into my org? There is also a minimal sketch below.
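As a rough, hedged sketch of the "advanced" flavour in Apex (the CompHist__c object, Account__r relationship and CustomerId__c fields are hypothetical names matching the question's example):
// Hypothetical names: CompHist__c detail object, CustomerId__c external ID on both objects,
// Account__c / Account__r as the relationship field to Account.
CompHist__c line = new CompHist__c(
    CustomerId__c = 'COMP-0001',                            // this record's own external ID
    Account__r    = new Account(CustomerId__c = 'ABZ123')   // link to the parent by its external ID
);
// Upsert by the detail's external ID; Salesforce resolves the Account reference for us.
Database.UpsertResult result = Database.upsert(line, CompHist__c.CustomerId__c, false);
System.debug(result.isSuccess());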

Retrieving deleted records from Apex data loader?

Does anyone have any idea how to retrieve deleted records from the Apex Data Loader, or otherwise from Salesforce, other than via the web service?
Check the documentation: https://na7.salesforce.com/help/doc/en/salesforce_data_loader.pdf
If you are using the GUI version v20 or above, you'll have the Export All button.
In the command-line version, the process-config.xml file should have the process.operation attribute value set to "extract_all" (the documentation states "Extract All", but that doesn't work).
Using either of the above options will extract soft-deleted records and will allow you to filter on IsDeleted = true or false. (You can include this filter regardless, but without using the above options, IsDeleted = true will always return zero records.)
Hope that helps.
P.S. In Apex, it's slightly different. Your SOQL query will be [SELECT Id FROM Account WHERE IsDeleted = true ALL ROWS]. The 'ALL ROWS' appendage is the Apex equivalent of 'extract all'.
In Data Loader, use the Export All button, not the Export button.
This gives you access to deleted and archived records.
You can't. The only way to get deleted records through the API is to use queryAll, and DataLoader doesn't use queryAll ever.
(Sorry for the resurrection here.)
Roll them back with a few lines of Apex code in the System Log. For instance:
Account[] a = [select id from Account where isDeleted=true ALL ROWS];
undelete a;
system.debug(a);
This should work as long as you didn't use emptyRecycleBin() (which will still return query results, but won't allow undelete, as the records would then be marked for physical deletion). Take a few of the IDs from the USER_DEBUG output for a to confirm that it worked.
Try extract, extract_all, hard_delete.
I hope it's not too late.
There are three ways to do it.
Recycle Bin. In the Recycle Bin, change the view to "All Recycle Bin". Because deletes are soft deletes, you may be able to recover the record there. If you can't find your record in the Recycle Bin, try the next option.
Workbench. In Workbench, choose SOQL Query, select the required object, and create a query like this example:
SELECT Id, Name, AccountId, IsDeleted, CreatedDate, StageName
FROM Opportunity WHERE IsDeleted = true
This does not recover the record itself, but it does give you the information from the deleted Opportunity record.
Data Loader. This works like Workbench: you can retrieve the record's information. Select the Export All option, select the required fields, and add a filter such as IsDeleted = true.
