fun updateOne(storyModel: StoryModel, id: Int) {
    val db = this.writableDatabase
    val cv = ContentValues()
    cv.put("title", storyModel.title)
    cv.put("date", storyModel.date)
    cv.put("story", storyModel.story)
    db.update("STORY_TABLE", cv, "id = ?", arrayOf(id.toString()))
}
I tried to update my data in the app I made, but when I closed the app and restarted it, the updates were not displayed on the screen.
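One quick diagnostic, assuming the same table and column names as above: SQLiteDatabase.update() returns the number of rows it changed, so logging that value shows whether the WHERE clause actually matched anything. A sketch:

```kotlin
fun updateOne(storyModel: StoryModel, id: Int): Boolean {
    val db = this.writableDatabase
    val cv = ContentValues().apply {
        put("title", storyModel.title)
        put("date", storyModel.date)
        put("story", storyModel.story)
    }
    // update() returns the number of rows affected;
    // 0 means the WHERE clause matched nothing (e.g. a wrong id).
    val rowsAffected = db.update("STORY_TABLE", cv, "id = ?", arrayOf(id.toString()))
    return rowsAffected > 0
}
```

If this returns false, no row with that id exists, so the update silently does nothing; if it returns true but the screen still shows stale data after a restart, the UI is probably being populated from somewhere other than this table.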
How to create a Workflow Field Update using the Metadata Tooling API
I am creating a MetadataService class and, through that object, creating a workflow field update, but it is not working.
MetadataService.MetadataPort service = new MetadataService.MetadataPort();
service.SessionHeader = new MetadataService.SessionHeader_element();
service.SessionHeader.sessionId = UserInfo.getOrganizationId().substring(0, 15) + ' ' + UserInfo.getSessionId().substring(15);
MetadataService.WorkflowFieldUpdate workflowFieldUpdate = new MetadataService.WorkflowFieldUpdate();
// Workflow Field Update
workflowFieldUpdate.fullName = 'TEST_Active_Permission';
workflowFieldUpdate.description = 'Activates a permission.';
workflowFieldUpdate.field = 'Expense__c.Status__c';
workflowFieldUpdate.literalValue = '1';
workflowFieldUpdate.name = 'TEST Active Permission';
workflowFieldUpdate.notifyAssignee = false;
workflowFieldUpdate.operation = 'Literal';
workflowFieldUpdate.protected_x = false;
workflowFieldUpdate.reevaluateOnChange = true;
workflowFieldUpdate.targetObject = 'Expense__c';
MetadataService.WorkflowAction wfp = workflowFieldUpdate;
MetadataService.Metadata[] theMetadata = new MetadataService.Metadata[]{};
theMetadata.add(wfp);
MetadataService.SaveResult[] results = service.createMetadata(theMetadata);
System.debug('results: ' + results);
That's not the Tooling API; that's the old-school Metadata API. Somebody took the Metadata API WSDL file and imported it back into Salesforce. What error are you getting?
Keep in mind that since the Winter '23 release (~September 2022) you can't create new workflow rules; the button is disabled in the UI too. Field updates you probably still can create, but why cling to retired automation?
https://admin.salesforce.com/blog/2021/go-with-the-flow-whats-happening-with-workflow-rules-and-process-builder
Note that in the Metadata API documentation there's no top-level entry for WorkflowFieldUpdate. It's possible you have to create a Workflow and wrap your field update in it: https://developer.salesforce.com/docs/atlas.en-us.api_meta.meta/api_meta/meta_workflow.htm. The Tooling API has a separate entry (https://developer.salesforce.com/docs/atlas.en-us.api_tooling.meta/api_tooling/tooling_api_objects_workflowfieldupdate.htm), but you'd need to ditch this hack and use JSON.
I am writing an Electron application and I want to display some data from a local sqlite3 database file. I am using React as my front-end framework and Redux to update the table data. However, I am having trouble figuring out the best way to query the .db file and update Redux with the new data. Can someone give me some insight into the best way to go about it?
I was able to load a .db file using the node module sqlite3 and included a javascript function as such:
var sqlite3 = require('sqlite3').verbose();
let dbSrc = 'processlist.db';
var fetchDBData = (tablename) => {
    var db = new sqlite3.Database(dbSrc);
    var queries = [];
    db.each("SELECT * FROM " + tablename, function(err, row) {
        queries.push(row);
    });
    db.close();
    return queries;
};
Since I am using React and Redux for my front end, I was able to invoke this function by calling
window.fetchDBData(tablename);
I can send a parameter from AS3 to ASP, and I can get a value from the database. But unfortunately I can't combine the two. Is it possible to send an ID parameter from AS3 to ASP, run a SQL query against the database there, and return the result to AS3? Users log in with their ID number, and they should see only their own data in the AS3 application. My sample code is given below:
I can send values with this code:
var getParams:URLRequest = new URLRequest("http://www***********/data.asp");
getParams.method = URLRequestMethod.POST;
var paras:URLVariables = new URLVariables();
paras.parameter1 = ""+userID;
getParams.data = paras;
var loadPars:URLLoader = new URLLoader(getParams);
loadPars.addEventListener(Event.COMPLETE, loadCompleted);
loadPars.dataFormat = URLLoaderDataFormat.VARIABLES;
loadPars.load(getParams);
function loadCompleted(event:Event):void
{
    trace("sent");
}
I can get values from the database with this code:
var urlLoader:URLLoader =new URLLoader();
urlLoader.load(new URLRequest("http://www***********/data.asp"));
urlLoader.dataFormat = URLLoaderDataFormat.VARIABLES;
urlLoader.addEventListener(Event.COMPLETE, onXMLLoad);
function onXMLLoad(event:Event):void
{
    var loader:URLLoader = URLLoader(event.target);
    var scrptVars:URLVariables = new URLVariables(loader.data + "");
    returnParameter = scrptVars.LINK0;
    high.HighScore.text = returnParameter + "";
}
What is the logic of combining them?
Sorry for my English level :)
To combine the second one into the first, you just need to read the URLLoader's data property (which is the response from the server) in the loadCompleted method, the same as you're already doing in the onXMLLoad method:
function loadCompleted(event:Event):void
{
    trace("sent and received", loadPars.data);
    high.HighScore.text = loadPars.data.LINK0;
}
The COMPLETE event for a URLLoader fires once the request has received a response. If your server adds data to that response, it can be found in the data property of the URLLoader.
So to summarize, sending and receiving can be done in one operation with one URLLoader. The data you send to the server is found in the URLRequest object passed to the URLLoader; the data that comes back from that request is found in the data property of the URLLoader object (but only after the COMPLETE event fires).
In an akka-http service, how does one cache some information per client session? This is not quite obvious from the docs. I would, for example, like to create an actor for each connection.
Where should I create the actor, and how do I get reference to it from inside my stages?
My service is bound something like this:
val serverSource: Source[Http.IncomingConnection, Future[Http.ServerBinding]] =
  Http().bind(interface = bindAddress, port = bindPort)

val bindingFuture: Future[Http.ServerBinding] =
  serverSource
    .to(Sink.foreach { connection =>
      connection.handleWithSyncHandler(requestHandler)
      // seems like I should set up some session state storage here,
      // such as my actor
    })
    .run()
...
and later on:
val packetProcessor: Flow[A, B, Unit] = Flow[A]
  .map {
    case Something =>
      // can I use the actor here, or access my session state?
  }
I suspect I'm probably misinterpreting the whole paradigm in trying to make this fit. I can't tell if there is anything built in or how much I need to implement manually.
I have found Agent to be a very convenient mechanism for concurrent caching.
Say, for example, you want to keep a running Set of all the remote addresses that you have been connected to. You can set up an Agent to store the values and a Flow to write to the cache:
import scala.concurrent.ExecutionContext.Implicits.global
import akka.agent.Agent
import scala.collection.immutable

val addressCache = Agent(immutable.Set.empty[java.net.InetSocketAddress])

import akka.stream.scaladsl.Flow

val cacheAddressFlow = Flow[IncomingConnection] map { conn =>
  addressCache send (_ + conn.remoteAddress) // updates the cache
  conn // forwards the connection to the rest of the stream
}
This Flow can then be made part of your Stream:
val bindingFuture: Future[Http.ServerBinding] =
  serverSource.via(cacheAddressFlow)
    .to(Sink.foreach { connection =>
      connection.handleWithSyncHandler(requestHandler)
    })
    .run()
You can then "query" the cache completely outside of the binding logic:
def somewhereElseInTheCode = {
  val currentAddressSet = addressCache.get
  println(s"address count so far: ${currentAddressSet.size}")
}
If your goal is to send all IncomingConnection values to an Actor for processing then this can be accomplished with Sink.actorRef:
object ConnectionStreamTerminated

class ConnectionActor extends Actor {
  override def receive = {
    case conn: IncomingConnection => ???
    case ConnectionStreamTerminated => ???
  }
}

val actorRef = actorSystem actorOf Props[ConnectionActor]

val actorSink =
  Sink.actorRef[IncomingConnection](actorRef, ConnectionStreamTerminated)

val bindingFuture: Future[Http.ServerBinding] =
  serverSource.runWith(actorSink)
Since the suggested Agents have been deprecated, I would suggest using akka-http-session instead. It makes sure session data is secure and cannot be tampered with.
I have an HTML page that produces data (a table row). I want to store rows from all clients in an online table which can be accessed/downloaded (preferably by the owner alone, so anonymous clients can only add rows).
Possible solutions and encountered problems:
- Google spreadsheet + Google Apps Script: how to do a cross-origin POST request?
- Google Fusion Tables: how to add rows from anonymous clients? Is that possible?
- Google App Engine: possible, but seems too time-consuming for this simple task.
I've found an answer on how to do a cross-domain POST, so I've managed to do what I want with Apps Script + a spreadsheet:
Apps script:
function doPost(request) {
  var ss = SpreadsheetApp.openById(id);
  var sheet = ss.getSheets()[0];
  if (request.parameters.row != null) {
    // Utilities.jsonParse is deprecated; JSON.parse is the current API
    sheet.appendRow(JSON.parse(request.parameters.row));
  }
}
Client javascript (from https://stackoverflow.com/a/6169703/378594):
function crossDomainPost(paramsDict, url) {
  // Add the iframe with a unique name
  var iframe = document.createElement("iframe");
  var uniqueString = "SOME_UNIQUE_STRING";
  document.body.appendChild(iframe);
  iframe.style.display = "none";
  iframe.contentWindow.name = uniqueString;

  // Construct a form with hidden inputs, targeting the iframe
  var form = document.createElement("form");
  form.target = uniqueString;
  form.action = url;
  form.method = "POST";

  // Repeat for each parameter
  for (var i in paramsDict) {
    var input = document.createElement("input");
    input.type = "hidden";
    input.name = i;
    input.value = paramsDict[i];
    form.appendChild(input);
  }

  document.body.appendChild(form);
  form.submit();
}
crossDomainPost({'row': JSON.stringify([123, 123, 1211])}, serverURL);
Note that a similar scheme should work for Google Fusion Tables.