How to create a Workflow Field Update using the Metadata Tooling API
I am creating a MetadataService class, and through that object I am creating a workflow field update, but it is not working.
MetadataService.MetadataPort service = new MetadataService.MetadataPort();
service.SessionHeader = new MetadataService.SessionHeader_element();
// Session id for the SOAP header (the apex-mdapi samples simply use UserInfo.getSessionId() here)
service.SessionHeader.sessionId = UserInfo.getOrganizationId().substring(0, 15) + ' ' + UserInfo.getSessionId().substring(15);

// Workflow Field Update
MetadataService.WorkflowFieldUpdate workflowFieldUpdate = new MetadataService.WorkflowFieldUpdate();
workflowFieldUpdate.fullName = 'TEST_Active_Permission';
workflowFieldUpdate.description = 'Activates a permission.';
workflowFieldUpdate.field = 'Expense__c.Status__c';
workflowFieldUpdate.literalValue = '1';
workflowFieldUpdate.name = 'TEST Active Permission';
workflowFieldUpdate.notifyAssignee = false;
workflowFieldUpdate.operation = 'Literal';
workflowFieldUpdate.protected_x = false;
workflowFieldUpdate.reevaluateOnChange = true;
workflowFieldUpdate.targetObject = 'Expense__c';

// A WorkflowFieldUpdate is a WorkflowAction, which in turn is Metadata
MetadataService.WorkflowAction wfp = workflowFieldUpdate;
MetadataService.Metadata[] theMetadata = new MetadataService.Metadata[]{};
theMetadata.add(wfp);

MetadataService.SaveResult[] results = service.createMetadata(theMetadata);
System.debug('results: ' + results);
That's not the Tooling API; that's the old-school Metadata API. Somebody took the Metadata API WSDL file and imported it back into Salesforce. What error are you getting?
Keep in mind that since the Winter '23 release (~September 2022) you can't create new workflow rules; the button is disabled in the UI too. Field updates you probably still can create, but why cling to retired automation?
https://admin.salesforce.com/blog/2021/go-with-the-flow-whats-happening-with-workflow-rules-and-process-builder
Note that in the Metadata API documentation there's no top-level entry for WorkflowFieldUpdate, so it's possible you have to create a Workflow and wrap your field update in it: https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_workflow.htm The Tooling API has a separate entry (https://developer.salesforce.com/docs/atlas.en-us.api_tooling.meta/api_tooling/tooling_api_objects_workflowfieldupdate.htm) but you'd need to ditch this hack and use JSON.
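If you do switch to the actual Tooling API, the create is a plain REST POST with a JSON body rather than imported WSDL classes. A rough sketch in Node.js, assuming you already have an instance URL and an access token (names copied from the question; I haven't verified every Metadata sub-field, so check the Tooling API reference linked above):

```javascript
// Rough sketch: create a WorkflowFieldUpdate through the Tooling API REST endpoint.
// INSTANCE_URL and ACCESS_TOKEN are placeholders -- obtain them via OAuth.
// Requires Node 18+ for the global fetch.
const INSTANCE_URL = 'https://yourInstance.my.salesforce.com';
const ACCESS_TOKEN = '<access token>';

async function createFieldUpdate() {
    const res = await fetch(INSTANCE_URL + '/services/data/v56.0/tooling/sobjects/WorkflowFieldUpdate/', {
        method: 'POST',
        headers: {
            'Authorization': 'Bearer ' + ACCESS_TOKEN,
            'Content-Type': 'application/json'
        },
        body: JSON.stringify({
            FullName: 'Expense__c.TEST_Active_Permission', // object prefix takes the place of targetObject
            Metadata: {
                description: 'Activates a permission.',
                field: 'Status__c', // relative to the object in FullName (assumption -- verify against the docs)
                literalValue: '1',
                name: 'TEST Active Permission',
                notifyAssignee: false,
                operation: 'Literal',
                protected: false,
                reevaluateOnChange: true
            }
        })
    });
    console.log(res.status, await res.json());
}

createFieldUpdate();
```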
I run an Android app which locates OBJECTS on a map. OBJECTS have attributes like ID, Name, Owner, Type and Place_ID, and are linked to PLACES, which have attributes like ID, Latitude, Longitude, Opening Hour, Closing Hour, ... The data is stored in a MongoDB on Back4App and I want to keep it that way. I have one class for OBJECTS and one class for PLACES. The relation between OBJECTS and PLACES is not "a MongoDB relation"; it is just a common String field in the OBJECTS and PLACES classes.
In order to allow offline access to the data and to minimize DB server requests, the app synchronizes a local SQLite database on the device with the online MongoDB database. In the Android app, queries are run against the SQLite DB.
I'm trying to make a website which does the same job as the app: displaying filtered data from the MongoDB.
I started with a simple HTML and JavaScript website using the Parse SDK, but I'm facing a few difficulties.
A simple example query is listing all the OBJECTS in a 50 km radius, i.e. I need the OBJECTS and the PLACE where each is located. Where I could get this easily with a SELECT ... JOIN in SQLite, I cannot get this information through a simple Parse query, because I need the OBJECTS too. And I cannot run two asynchronous queries in a for loop.
What website architecture and/or languages would you recommend for this type of website?
How would you recommend proceeding?
Thanks in advance for your help.
EDIT: ZeekHuge opened my eyes to the bad design of not using pointers. After implementing pointers in my MongoDB, here are the lines of code which did it for me:
Parse.initialize("", "");
Parse.serverURL = '';

// Reference point: the Eiffel Tower
var eiffel = new Parse.GeoPoint(48.858093, 2.294694);
var myScores = '';

var Enseigne = Parse.Object.extend("ENSEIGNE");
var Flipper = Parse.Object.extend("FLIPPER");

// Inner query: ENSEIGNE rows within 500 km of the reference point
var innerquery = new Parse.Query(Enseigne);
innerquery.withinKilometers("ENS_GEO", eiffel, 500);

// Outer query: active FLIPPER rows whose ENSEIGNE pointer matches the inner query;
// include() fetches the pointed-to objects in the same request
var query = new Parse.Query(Flipper);
query.equalTo("FLIP_ACTIF", true);
query.include("FLIP_ENSPOINT");
query.include("FLIP_MODPOINT");
query.matchesQuery("FLIP_ENSPOINT", innerquery);

query.find({
    success: function(results) {
        for (var i = 0; i < results.length; i++) {
            var object = results[i];
            myScores += '<tr><td>' + object.get('FLIP_MODPOINT').get('MOFL_NOM')
                + '</td><td>' + object.get('FLIP_ENSPOINT').get('ENS_NOM')
                + '</td><td>' + object.get('FLIP_ENSPOINT').get('ENS_GEO').latitude
                + '</td><td>' + object.get('FLIP_ENSPOINT').get('ENS_GEO').longitude
                + '</td></tr>';
        }
        // Append the accumulated rows to the results table
        (function($) {
            $('#results-table').append(myScores);
        })(jQuery);
    },
    error: function(error) {
        alert("Error: " + error.code + " " + error.message);
    }
});
Solved by replacing the database keys with pointers and using the inner query and include functions. See the example mentioned in the question.
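In case it helps others, a minimal sketch of what writing such a pointer looks like (class and field names taken from the question; createWithoutData builds a local reference to an existing row without fetching it):

```javascript
// Minimal sketch: store a pointer to an existing ENSEIGNE row instead of a string key.
var Enseigne = Parse.Object.extend("ENSEIGNE");
var Flipper = Parse.Object.extend("FLIPPER");

// "xyz123" stands in for a real objectId of an existing ENSEIGNE
var place = Enseigne.createWithoutData("xyz123");

var flipper = new Flipper();
flipper.set("FLIP_ACTIF", true);
flipper.set("FLIP_ENSPOINT", place); // saved as a pointer, which include()/matchesQuery() can follow
flipper.save();
```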
I am writing an Electron application and I want to display some data from a local sqlite3 database file. I am using React as my front-end framework and Redux to update the table data. However, I am having trouble figuring out the best way to query the .db file and update Redux with the new data. Can someone give me some insight on the best way to go about it?
I was able to load a .db file using the node module sqlite3, with a JavaScript function like this:
var sqlite3 = require('sqlite3').verbose();
let dbSrc = 'processlist.db';

var fetchDBData = (tablename) => {
    var db = new sqlite3.Database(dbSrc);
    var queries = [];
    // Note: db.each is asynchronous -- the row callback fires later,
    // so `queries` is still empty when the function returns below.
    db.each("SELECT * FROM " + tablename, function(err, row) {
        queries.push(row);
    });
    db.close();
    return queries;
};
Since I am using React and Redux for my front end, I was able to invoke this function by calling
window.fetchDBData(tablename);
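One caveat with the snippet above: node-sqlite3's calls are asynchronous, so the function returns before any rows arrive. A minimal sketch of one way around that (not the only possible architecture) is to wrap the query in a Promise and dispatch the rows to Redux once they are in; `store` and the `tableDataLoaded` action creator below are hypothetical placeholders:

```javascript
var sqlite3 = require('sqlite3').verbose();

function fetchDBData(tablename) {
    return new Promise(function(resolve, reject) {
        var db = new sqlite3.Database('processlist.db');
        // db.all fires its callback once, with every row, after the query completes.
        // tablename is concatenated into the SQL, so validate it against a whitelist.
        db.all('SELECT * FROM ' + tablename, function(err, rows) {
            db.close();
            if (err) { reject(err); } else { resolve(rows); }
        });
    });
}

// Usage with the hypothetical Redux pieces:
fetchDBData('processes').then(function(rows) {
    store.dispatch(tableDataLoaded(rows));
});
```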
So I have this web app using AngularJS and Node.js. I don't want to just use localhost to demo my project, because it doesn't look cool at all when I type "node server.js" and then go to localhost...
Since I intend to use Firebase for the data, I noticed that Firebase provides hosting. I tried it, but it seems to only host index.html and not go through server.js. I have customized files for the server to use/update. So, how can I tell Firebase Hosting to use my server and related files when hosting?
Is it possible to tell Firebase: hey, run "node server.js" to host my index.html?
I'm guessing by the way you've worded the question that you want to see this site from "the internet".
There are two routes you could go here.
a) Serve your index through Firebase Hosting. Firebase only hosts static assets, so if your Angular app is currently served through Node you will need to change your architecture to be more SPA-ish: an index bootstrap that interacts with the backend purely through APIs.
You would host the API server on something more appropriate, like Nodejitsu.
b) Serve the whole thing through something like Nodejitsu (a hosting platform), or your very own VM managed by a different kind of hosting company, like BuyVM.net.
Another idea: if your Node.js app is independent of the AngularJS app (though they use shared data and perform operations on that data model), you could separate the two and connect them only via Firebase.
Firebase Hosting -> index.html and the necessary AngularJS files.
Locally (your PC) -> server.js, which just connects to Firebase and triggers on changed data.
I have done this for a few projects and it's a handy way to access the outside world (internet) while maintaining some semblance of security by not opening ports blindly.
I was able to do this to control a Chromecast at my house while at a friend's house.
Here's an example from my most recent project (I'm trying to make a DVR).
https://github.com/onaclov2000/webdvr/blob/master/app.js
var FB_URL = '';
var Firebase = require('firebase');
var os = require('os');

var myRootRef = new Firebase(FB_URL);

// Collect this machine's external IPv4 addresses
var interfaces = os.networkInterfaces();
var addresses = [];
for (var k in interfaces) {
    for (var k2 in interfaces[k]) {
        var address = interfaces[k][k2];
        if (address.family == 'IPv4' && !address.internal) {
            addresses.push(address.address);
        }
    }
}

// Push my IP to firebase
// Perhaps a common "devices" location would be handy
var ipRef = myRootRef.push({
    "type": "local",
    "ip": addresses[0]
});

myRootRef.on('child_changed', function(childSnapshot, prevChildName) {
    // code to handle child data changes
    var data = childSnapshot.val();
    var localref = childSnapshot.ref();
    if (data["commanded"] == "new") {
        console.log("New Schedule Added");
        var schedule = require('node-schedule');
        var date = new Date(data["year"], data["month"], data["day"], data["hh"], data["mm"], 0);
        console.log(date);
        var j = schedule.scheduleJob(date, function(channel, program, length) {
            console.log("Recording Channel " + channel + " and program " + program + " for " + length + "ms");
        }.bind(null, data["channel"], data["program"], data["length"]));
        localref.update({ "commanded": "waiting" });
    }
});
When I change my "commanded" data at the FB_URL to "new" (which can be done from AngularJS very simply, using an ng-click handler for example), it schedules a recording for a particular date and time (not all of it is actually functional at the moment).
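For illustration, the AngularJS side can be tiny; a sketch against the same legacy Firebase SDK, where FB_URL, the child key and the schedule values are all placeholders:

```javascript
// Hypothetical controller method wired up with ng-click="scheduleRecording(entry)".
// Setting commanded to "new" is what the server's child_changed handler reacts to.
$scope.scheduleRecording = function(entry) {
    var ref = new Firebase(FB_URL).child(entry.key); // entry.key: id of an existing child
    ref.update({
        commanded: "new",
        year: 2015, month: 0, day: 24, hh: 20, mm: 30, // JS Date months are 0-based
        channel: entry.channel,
        program: entry.program,
        length: entry.lengthMs
    });
};
```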
I might be late, but since three years have passed, there is now a solution available from Firebase in the form of Cloud Functions.
It's not straightforward, but it looks promising if you can refactor your code a bit.
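Roughly, you move your Express/Node handlers into an HTTPS function and let Hosting rewrite requests to it. A minimal sketch (route and export names are placeholders):

```javascript
// functions/index.js -- an HTTPS Cloud Function that Hosting can route to
const functions = require('firebase-functions');
const express = require('express');

const app = express();
app.get('/api/hello', (req, res) => {
    res.send('served by a Cloud Function instead of node server.js');
});

exports.app = functions.https.onRequest(app);
```

Hosting then forwards matching paths to the function via a rewrite in firebase.json, e.g. {"source": "/api/**", "function": "app"} under hosting.rewrites.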
I've been trying to upload a user profile picture during user registration using Services 3, and so far I haven't had any luck. I tested passing a hard-coded fid in the "picture" field, and also tried passing the fields "filemime", "filename", etc., and that didn't work either.
Any idea about what fields I need to populate?
I guess it must be related to this post: Using Drupal Services and DIOS SDK for setting user picture in iOS app,
but it doesn't have an answer either.
I use the following code:
// The picture is first uploaded from iOS using the file service,
// and its fid is passed in the $data object.
$file_contents_object = file_load($data->picture_fid);

$file_contents_array['fid'] = $file_contents_object->fid;
$file_contents_array['uid'] = $file_contents_object->uid;
$file_contents_array['filename'] = $file_contents_object->filename;
$file_contents_array['uri'] = $file_contents_object->uri;
$file_contents_array['filemime'] = $file_contents_object->filemime;
$file_contents_array['filesize'] = $file_contents_object->filesize;
$file_contents_array['status'] = $file_contents_object->status;
$file_contents_array['timestamp'] = $file_contents_object->timestamp;
$file_contents_array['rdf_mapping'] = $file_contents_object->rdf_mapping;

// Use your own cck field here; the account still needs to be saved
// afterwards (user_save() in Drupal 7).
$user_obj->field_user_profile_picture['und'][0] = $file_contents_array;
Hope this will help.
I want to retrieve a list of metadata components, like ApexClass, using the Salesforce Metadata API.
I'm getting the list of all the Apex classes (2,246 in total) in the org using the following code, and it's taking too much time to retrieve the file names:
ListMetadataQuery query = new ListMetadataQuery();
query.type = "ApexClass";
double asOfVersion = 23.0;

// Assume that the SOAP binding has already been established.
FileProperties[] lmr = metadataService.listMetadata(
    new ListMetadataQuery[] { query }, asOfVersion);

if (lmr != null)
{
    foreach (FileProperties n in lmr)
    {
        string filename = n.fileName;
    }
}
My requirement is to get only the metadata components (Apex classes) developed by my organization, so that I get just the components relevant to me and can possibly save time by not fetching all the classes.
How can I achieve this?
Reply as soon as possible.
Thanks in advance.
I've not used the Metadata API directly, but I'd suggest either trying to filter on the created-by field, or using a prefixed name on your classes so you can filter on that.
Not sure if server-side filters are possible, though! As for speed, my experience of using the Metadata API via Eclipse is that it's always pretty slow and there's not much you can do about it!
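One concrete option: listMetadata has no server-side filter, but each FileProperties record it returns carries createdByName, namespacePrefix and manageableState, so you can filter client-side after a single call. A sketch using the jsforce Node library as the client (my substitution for illustration; the same property filter works on the C# FileProperties[] above):

```javascript
// Sketch: list Apex classes and keep only the ones authored in this org.
// Connection details are placeholders.
var jsforce = require('jsforce');
var conn = new jsforce.Connection({ loginUrl: 'https://login.salesforce.com' });

conn.login('user@example.com', 'password+securityToken', function(err) {
    if (err) { return console.error(err); }
    conn.metadata.list([{ type: 'ApexClass' }], '33.0', function(err, metadata) {
        if (err) { return console.error(err); }
        // Unmanaged classes with no namespace are the ones your org wrote.
        var mine = [].concat(metadata).filter(function(m) {
            return !m.namespacePrefix && m.manageableState === 'unmanaged';
        });
        mine.forEach(function(m) { console.log(m.fullName); });
    });
});
```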