I have created a Flutter Windows application and am now trying to save local data. I used the shared_preferences plugin, but it does not support Windows applications, so I found another package to save local data: the sqflite_common_ffi plugin.
But I don't know how to use it. I followed its example and it worked, but now I want to separate all of its functionality: I want to create the database in one method, insert values in another method, get values in another, and delete them in another.
CODE
import 'package:sqflite_common/sqlite_api.dart';
import 'package:sqflite_common_ffi/sqflite_ffi.dart';
class Storages {
  var databaseFactory = databaseFactoryFfi;
  Database db;

  createDatabase() async {
    // Init ffi loader if needed.
    sqfliteFfiInit();
    db = await databaseFactory.openDatabase(inMemoryDatabasePath);
    await db.execute('''
      CREATE TABLE Credential(
        id INTEGER PRIMARY KEY,
        token TEXT,
        password TEXT
      )
    ''');
  }

  saveToken(String token) async {
    await db.insert('Credential', <String, dynamic>{'token': token});
  }

  savePassword(String password) async {
    await db.insert('Credential', <String, dynamic>{'password': password});
  }

  getToken() async {
    var result = await db.query('Credential');
    print('Credential token database;$result');
    // await db.close();
  }

  getPassword() async {
    var result = await db.query('Credential');
    print('Credential password database;$result');
  }
}
Then I first initialized the database and saved the password to it from the login screen. But when I restart my application, I call the getPassword method from initState of my splash screen:
Storages storage = new Storages();

@override
void initState() {
  super.initState();
  storage.getPassword();
}
and it throws this exception in my terminal:
Unhandled Exception: NoSuchMethodError: The method 'query' was called on null.
Receiver: null
Tried calling: query("Credential")
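The error suggests that db was never assigned in the Storages instance created on the splash screen, since createDatabase() is not awaited there. Below is a minimal, hypothetical sketch (not the original code; names like _open are illustrative) of one way to guard against that by lazily opening the database before every query. Note also that an in-memory database does not survive an app restart, so a real file path would be needed for persistence.

// Hypothetical sketch: lazily open the database so query methods never run
// against an uninitialized `db`.
import 'package:sqflite_common/sqlite_api.dart';
import 'package:sqflite_common_ffi/sqflite_ffi.dart';

class Storages {
  Database? _db;

  // Opens (and creates) the database on first use; later calls reuse it.
  Future<Database> _open() async {
    if (_db != null) return _db!;
    sqfliteFfiInit();
    final db = await databaseFactoryFfi.openDatabase(inMemoryDatabasePath);
    await db.execute('''
      CREATE TABLE IF NOT EXISTS Credential(
        id INTEGER PRIMARY KEY,
        token TEXT,
        password TEXT
      )
    ''');
    _db = db;
    return db;
  }

  Future<List<Map<String, Object?>>> getPassword() async {
    final db = await _open();
    return db.query('Credential', columns: ['password']);
  }
}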
Is there any way we can send email alerts if a stored procedure fails in Snowflake?
When I checked the Snowflake documentation, there was no mention of an email utility in Snowflake.
You can send email directly from Snowflake, optionally sending data from a table/view as an attachment. This is done using a Snowflake external function which in turn calls an AWS Lambda function via AWS API Gateway.
The first step is to set up the AWS Gateway. You may follow the instructions below:
Creating a Customizable External Function on AWS
If you got the sample function working from Snowflake, you have successfully set up the foundation for adding email functionality. Next is to set up an S3 bucket to hold the data files that need to be sent as email attachments.
Create an AWS S3 bucket with the name 'snowapi'. We need not expose this bucket to the internet, so keep 'Block all public access' set to ON.
Now you need to give Snowflake access to this bucket. Create an IAM user 'snowflake'. Add Permissions -> Attach existing policy: AmazonS3FullAccess. Go to the 'Security Credentials' tab and 'Create access key'. Use the Access Key ID and Secret Access Key in the commands below to unload data into the S3 bucket.
CREATE OR REPLACE STAGE UTIL.AWS_S3_STAGE URL='s3://snowapi/'
    CREDENTIALS=(AWS_KEY_ID='ABCD123456789123456789'
                 AWS_SECRET_KEY='ABCD12345678901234567890123456789');

COPY INTO @UTIL.AWS_S3_STAGE/outbound/SampleData.csv
    FROM <table_or_view>
    FILE_FORMAT = (FORMAT_NAME = '<your_csv_file_format>')
    OVERWRITE = TRUE
    SINGLE = TRUE;
The next step is to create a new Lambda function using the Node.js code below. Note that this uses the SendGrid API. SendGrid has a forever-free tier with 100 emails per day. I installed this library locally and uploaded the zip file to AWS to create the Lambda function.
//Lambda Function name: email
const sgMail = require('@sendgrid/mail');
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

exports.handler = async (event, context, callback) => {
    sgMail.setApiKey(process.env.SENDGRID_KEY);
    const paramArray = JSON.parse(event.body).data[0];
    //paramArray[0] has the row number from Snowflake
    var message = {
        to: paramArray[1].replace(/\s/g, '').split(','),
        from: paramArray[2].replace(/\s/g, ''),
        subject: paramArray[3],
        html: paramArray[4]
    };

    // Attach file
    if (paramArray.length > 5) {
        var fileName = paramArray[5].substring(paramArray[5].lastIndexOf("/") + 1);
        var filePath = paramArray[5].substring(0, paramArray[5].lastIndexOf("/"));
        try {
            const params = {Bucket: process.env.BUCKET_NAME + filePath, Key: fileName};
            const data = await s3.getObject(params).promise();
            var fileContent = data.Body.toString('base64');
        } catch (e) {
            throw new Error(`Could not retrieve file from S3: ${e.message}`);
        }
        message.attachments = [{
            content: fileContent,
            filename: fileName,
            type: "application/text",
            disposition: "attachment"
        }];
    }

    try {
        await sgMail.send(message);
        return {
            'statusCode': 200,
            'headers': { 'Content-Type': 'application/json' },
            'body': "{'data': [[0, 'Email Sent to " + paramArray[1] + "']]}"
        };
    } catch (e) {
        return {
            'statusCode': 202,
            'headers': { 'Content-Type': 'application/json' },
            'body': "{'data': [[0, 'Error - " + e.message + "']]}"
        };
    }
};
Set the below two environment variables for the Lambda function:
SENDGRID_KEY: <sendgrid_api_key>
BUCKET_NAME: snowapi
Create a Snowflake External Function:
create or replace external function util.aws_email
(mailTo varchar,mailFrom varchar,subject varchar,htmlBody varchar,fileName varchar)
returns variant
api_integration = aws_api_integration
as 'https://xxxxxxxxxx.execute-api.us-east-1.amazonaws.com/PROD/email';
Create a wrapper Procedure for the above external function:
create or replace procedure util.sendemail
(MAILTO varchar,MAILFROM varchar,SUBJECT varchar,HTMLBODY varchar,FILENAME varchar)
returns string
language javascript
EXECUTE AS OWNER
as
$$
  // Call the AWS Lambda function.
  var qry = "select util.aws_email(:1,:2,:3,:4,:5)";
  // null should be in lowercase.
  var stmt = snowflake.createStatement({
    sqlText: qry,
    binds: [MAILTO,
            MAILFROM || 'no-reply@yourdomain.com',
            SUBJECT || 'Email sent from Snowflake',
            HTMLBODY || '<p>Hi there,</p> <p>Good luck!</p>',
            FILENAME || null]
  });
  var rs;
  try {
    rs = stmt.execute();
    rs.next();
    return rs.getColumnValue(1);
  } catch (err) {
    throw "ERROR: " + err.message.replace(/\n/g, " ");
  }
$$;
All set! The end result is a clean call that sends an email, like the one below.
Call SENDEMAIL('to_email@dummy.com, to_another_email@dummy.com',
               'from@yourdomain.com',
               'Test Subject',
               'Sample Body',
               null);
Good Luck!!
I believe there is no built-in email utility in Snowflake, but you can run your Snowflake stored procedure from Python and check the stored procedure's status; based on that status you can trigger an email from Python.
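As a rough, hypothetical sketch of that approach (the procedure name, connection parameters, addresses, and SMTP host are all placeholders, not from the answer above), you could call the procedure with the snowflake-connector-python package and send an SMTP email when the call raises an error:

# Hypothetical sketch: call a Snowflake stored procedure from Python and
# email an alert if it fails. Connection details and names are placeholders.
import smtplib
from email.message import EmailMessage

import snowflake.connector

def send_alert(subject: str, body: str) -> None:
    msg = EmailMessage()
    msg["Subject"] = subject
    msg["From"] = "alerts@example.com"
    msg["To"] = "oncall@example.com"
    msg.set_content(body)
    with smtplib.SMTP("smtp.example.com") as smtp:
        smtp.send_message(msg)

def run_procedure() -> None:
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        warehouse="my_wh", database="my_db", schema="my_schema",
    )
    try:
        conn.cursor().execute("CALL MY_STORED_PROCEDURE()")
    except snowflake.connector.errors.Error as exc:
        send_alert("Snowflake procedure failed", str(exc))
        raise
    finally:
        conn.close()

if __name__ == "__main__":
    run_procedure()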
Sending Email Notifications:
This feature uses the notification integration object, which is a Snowflake object that provides an interface between Snowflake and third-party services (e.g. cloud message queues, email, etc.). A single account can define a maximum of ten email integrations and enable one or more simultaneously.
To create an email notification integration, use the CREATE NOTIFICATION INTEGRATION command with TYPE=EMAIL:
CREATE [ OR REPLACE ] NOTIFICATION INTEGRATION [IF NOT EXISTS]
<integration_name>
TYPE=EMAIL
ENABLED={TRUE|FALSE}
ALLOWED_RECIPIENTS=('<email_address_1>' [, ... '<email_address_N>'])
[ COMMENT = '<string_literal>' ]
;
After creating the email notification integration, you can call SYSTEM$SEND_EMAIL() to send an email notification, as follows:
CALL SYSTEM$SEND_EMAIL(
'<integration_name>',
'<email_address_1> [, ... <email_address_N>]',
'<email_subject>',
'<email_content>'
);
...
For example:
CALL SYSTEM$SEND_EMAIL(
'my_email_int',
'person1@example.com, person2@example.com',
'Email Alert: Task A has finished.',
'Task A has successfully finished.\nStart Time: 10:10:32\nEnd Time: 12:15:45\nTotal Records Processed: 115678'
);
We use the snowsql command from bash scripts, with the "-o exit_on_error=true" option on the command line, and check the return code at the end of the step. If the Snowflake commands have failed, the exit-on-error setting means Snowflake will stop at the point of the error and return control to the calling program.
If the return code is zero, we move on to the next step.
If it is non-zero, we call an error handler which sends an email and then quits the job.
We're on Amazon Linux for our orchestration, and we use mutt as the email application.
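A stripped-down, hypothetical sketch of that pattern (the connection name, script file, and address are placeholders) could look like this:

#!/bin/bash
# Hypothetical sketch: run a Snowflake script with snowsql, stop at the first
# error, and email an alert with mutt if the step fails.
snowsql -c my_connection -o exit_on_error=true -f load_step.sql
rc=$?

if [ "$rc" -ne 0 ]; then
    echo "Snowflake step failed with exit code $rc" \
        | mutt -s "Snowflake job failed" oncall@example.com
    exit "$rc"
fi

echo "Step succeeded, moving on to the next step."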
Hi there guys, I have an interesting problem here and I would be really glad if any of you is able to help me with it.
Here's my app flow:
Register with email, password and some other details: I use Firebase to authenticate the user and create an account via email and password, and at the same time I write the user's custom data to the database.
Log the user in.
That's it, that's all my basic logic, and as you can see I'm not doing any reading from the DB, as far as I know.
Now... the problem is that, for some weird reason, when I register a user and go to the Firebase console to see the usage of my DB, I see something like this: for the one user that was created I have 1 write (which is fine, as I expected) but also 13-20 READS FROM THE DB.
So my question is: WHY on earth do I have reads on Firestore when I'm only doing auth and writes?
Here is the DB code I'm using right now.
class DatabaseFirebase implements BaseDataBase {
  final FirebaseAuth _firebaseAuth = FirebaseAuth.instance;
  final FirebaseStorage _storage = FirebaseStorage.instance;
  FirebaseUser _firebaseUser;
  Firestore _firestore = Firestore.instance;

  @override
  Future<String> login(String email, String password) async {
    _firebaseUser = await _firebaseAuth.signInWithEmailAndPassword(
        email: email, password: password);
    return _firebaseUser.uid;
  }

  @override
  Future<String> register(String email, String password) async {
    _firebaseUser = await _firebaseAuth.createUserWithEmailAndPassword(
        email: email, password: password);
    return _firebaseUser.uid;
  }

  @override
  Future<UserData> getCurrentUser() async {
    if (_firebaseUser == null)
      _firebaseUser = await _firebaseAuth.currentUser();
    UserData user = UserData();
    user.email = _firebaseUser?.email;
    user.name = _firebaseUser?.displayName;
    return user;
  }

  @override
  Future<void> logout() async {
    _firebaseAuth.signOut();
  }

  @override
  Future<void> onAuthStateChanged(void Function(FirebaseUser) callback) async {
    _firebaseAuth.onAuthStateChanged.listen(callback);
  }

  @override
  Future<void> writeUser(UserData user) async {
    _firestore.collection("Users").add(user.toMap()).catchError((error) {
      print(error);
    });
  }
}
If some of you know, could you explain where/how I need to search in order to find this bug? Because as you can see, I'm not using any reads whatsoever.
It's impossible to know for sure given that we don't understand all possible routes of access into your database, but you should be aware that use of the Firebase console will incur reads. If you leave the console open on a collection/document with busy write activity, the console will automatically read the changes that update the console's display. This is very often the source of unexpected reads.
Without full reproduction steps of exactly all the steps you're taking, there's no way to know for sure.
Firebase currently does not provide tools to track the origin of document reads. If you need to measure specific reads from your app, you will have to track that yourself somehow.
This is the data that is being sent by Alexa to my skill's backend. In the backend code I would like to test whether the session is new or not, and based on this information I would like to produce a different output. I try to access session.new, but I don't know how, and I could not find anything about it online so far.
const { attributesManager } = handlerInput;
const requestAttributes = attributesManager.getRequestAttributes();
requestAttributes.session.new
//this leads to the error "cannot read property new of undefined"
const { attributesManager } = handlerInput;
const sessionAttributes = attributesManager.getSessionAttributes();
sessionAttributes.session.new
//this leads to the same error
I finally found out that this is not possible using the AttributesManager, because it only gives access to request, session, and persistent attributes, and session.new is not one of those. If you want to check whether the Alexa session is new, you have to use the so-called RequestEnvelopeUtils.
Using those, you can check whether the session is new with the following statement:
(Alexa.isNewSession(handlerInput.requestEnvelope))
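For context, a minimal hypothetical handler sketch using ask-sdk-core (the handler name and speech text are illustrative) could branch on it like this:

// Hypothetical sketch: vary the response depending on whether the session is new.
const Alexa = require('ask-sdk-core');

const LaunchRequestHandler = {
  canHandle(handlerInput) {
    return Alexa.getRequestType(handlerInput.requestEnvelope) === 'LaunchRequest';
  },
  handle(handlerInput) {
    const speech = Alexa.isNewSession(handlerInput.requestEnvelope)
      ? 'Welcome! This looks like a brand new session.'
      : 'Welcome back, we are still in the same session.';
    return handlerInput.responseBuilder.speak(speech).getResponse();
  },
};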
How would you go about updating NetSuite through Salesforce?
I know that you would use NetSuite's RESTlets and Salesforce Apex code to connect the two, but how would you actually go about doing this, in a step-by-step process?
To send data from Salesforce to NetSuite (specifically customer/account data) you will need to do some preliminary setup in both systems.
In NetSuite:
Create a RESTlet script that has, at the bare minimum, a get and a post function.
For instance, I would create a JavaScript file on my desktop that contains:
/**
 * @NApiVersion 2.x
 * @NScriptType restlet
 */
//Use: Update NS customer with data (context) that is passed from SF
define(['N/record'], function(record) //use the record module
{
    function postData(context)
    {
        //load the customer I'm gonna update
        var cust = record.load({type: context.recordtype, id: context.id});
        log.debug("postData", "loaded the customer with NSID: " + context.id);

        //set some body fields
        cust.setValue("companyname", context.name);
        cust.setValue("entityid", context.name + " (US LLC)");
        cust.setValue("custentity12", context.formerName);
        cust.setValue("phone", context.phone);
        cust.setValue("fax", context.fax);

        //remove all addresses
        while (cust.getLineCount('addressbook') != 0)
            cust.removeLine('addressbook', 0);

        //add default billing address
        cust.insertLine('addressbook', 0);
        cust.setSublistValue('addressbook', 'defaultbilling', 0, true);
        cust.setSublistValue('addressbook', 'label', 0, 'BILL_TO');
        var billingAddress = cust.getSublistSubrecord('addressbook', 'addressbookaddress', 0);
        billingAddress.setValue('country', context.billingCountry);
        billingAddress.setValue('addr1', context.billingStreet);
        billingAddress.setValue('city', context.billingCity);
        billingAddress.setValue('state', context.billingState);
        billingAddress.setValue('zip', context.billingZip);

        //add default shipping address
        cust.insertLine('addressbook', 0);
        cust.setSublistValue('addressbook', 'defaultshipping', 0, true);
        cust.setSublistValue('addressbook', 'label', 0, 'SHIP_TO');
        var shippingAddress = cust.getSublistSubrecord('addressbook', 'addressbookaddress', 0);
        shippingAddress.setValue('country', context.shippingCountry);
        shippingAddress.setValue('addr1', context.shippingStreet);
        shippingAddress.setValue('city', context.shippingCity);
        shippingAddress.setValue('state', context.shippingState);
        shippingAddress.setValue('zip', context.shippingZip);

        //save the record
        var NSID = cust.save();
        log.debug("postData", "saved the record with NSID: " + NSID);
        return NSID; //success, return the ID to SF
    }

    //get and post both required, otherwise it doesn't work
    return {
        get: function() { return "get works"; },
        post: postData //this is where the sauce happens
    };
});
After you've saved this file, go to NetSuite > Customization > Scripting > Scripts > New.
Select the new file that you've saved and create the script record. Your script record in NetSuite should have GET and POST checked under Scripts.
Next, click 'Deploy Script' and choose who will call this script, specifically the user that will log in to NetSuite from the Salesforce end.
On the deployment page you will need the External URL; it goes something like:
https://1234567.restlets.api.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1
Note: If any data that you are updating is critical to a process, I would highly recommend creating this in the sandbox first for testing before moving to production.
In the Salesforce sandbox:
Click YourName > Developer Console.
In the Developer Console, click File > New and create an Apex class:
global class NetSuiteWebServiceCallout
{
    @future (callout=true) //allow restlet callouts to run asynchronously
    public static void UpdateNSCustomer(String body)
    {
        Http http = new Http();
        HttpRequest request = new HttpRequest();
        request.setEndPoint('https://1234567.restlets.api.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1'); //external URL
        request.setMethod('POST');
        request.setHeader('Authorization', 'NLAuth nlauth_account=1234567, nlauth_email=login@login.com, nlauth_signature=password'); //login to NetSuite; this person must be included in the NS restlet deployment
        request.setHeader('Content-Type', 'application/json');
        request.setBody(body);
        HttpResponse response = http.send(request);
        System.debug(response);
        System.debug(response.getBody());
    }
}
You will have to set the endpoint here to the external URL, set the authorization nlauth_account=[your NetSuite account number], and set the header to the login email and password of a user who is on the deployment of the NS script; the body will be set in the trigger that calls this class.
Next, create the trigger that will call this class. I made this script run every time I update an account in Salesforce.
trigger UpdateNSCustomer on Account (after update)
{
    for (Account a : Trigger.new)
    {
        String data = ''; //what to send to NS
        data = data + '{"recordtype":"customer","id":"'+a.Netsuite_Internal_ID__c+'","name":"'+a.Name+'","accountCode":"'+a.AccountCode__c+'",';
        data = data + '"formerName":"'+a.Former_Company_Names__c+'","phone":"'+a.Phone+'","fax":"'+a.Fax+'","billingStreet":"'+a.Billing_Street__c+'",';
        data = data + '"billingCity":"'+a.Billing_City__c+'","billingState":"'+a.Billing_State_Province__c+'","billingZip":"'+a.Billing_Zip_Postal_Code__c+'",';
        data = data + '"billingCountry":"'+a.Billing_Country__c+'","shippingStreet":"'+a.Shipping_Street__c+'","shippingCity":"'+a.Shipping_City__c+'",';
        data = data + '"shippingState":"'+a.Shipping_State_Province__c+'","shippingZip":"'+a.Shipping_Zip_Postal_Code__c+'","shippingCountry":"'+a.Shipping_Country__c+'"}';
        data = data.replaceAll('null','').replaceAll('\n',',').replace('\r','');
        System.debug(data);

        NetSuiteWebServiceCallout.UpdateNSCustomer(data); //call the RESTlet
    }
}
In this script, data is the body that you are sending to NetSuite.
Additionally, you will have to create an authorized endpoint for NetSuite in your Remote Site Settings in Salesforce (sandbox and production). Go to Setup and Quick Find 'Remote Site Settings', which is under Security Controls.
Create a new remote site that has its remote site URL set to the first half of your external URL: https://1234567.restlets.api.netsuite.com.
From here, do some testing in the sandbox.
If all looks well, deploy the class and trigger to Salesforce production.
I am using Microsoft.Azure.ActiveDirectory.GraphClient version 2.1.1.0 to get the groups that my user belongs to.
The method call is like this:
ActiveDirectoryClient activeDirectoryClient = new ActiveDirectoryClient(
    new Uri(GraphUrl),
    async () => await GetAppTokenAsync());

IEnumerable<string> groups = GetGroupsAsync(activeDirectoryClient, "currentUserObjectId").Result;

private static async Task<IEnumerable<string>> GetGroupsAsync(ActiveDirectoryClient activeDirectoryClient, string currentUserObjectId)
{
    return await activeDirectoryClient.Users.GetByObjectId(currentUserObjectId).GetMemberGroupsAsync(true);
}

private static async Task<string> GetAppTokenAsync()
{
    var authContext = new Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext(ServiceRoot);
    var token = await authContext.AcquireTokenAsync(GraphUrl, new ClientCredential("clientId", "clientSecret"));
    return token.AccessToken;
}
However, the method hangs, even though in Fiddler I can see that the request has succeeded and contains the correct groups.
My question is a duplicate of Azure ActiveDirectory Graph API GraphClient not returning AD Groups. A workaround exists, but no explanation of why the method does not work.
If your ServiceRoot value is indeed the same for your instantiation of ActiveDirectoryClient and for your call to AuthenticationContext, that could be the source of your problem.
ActiveDirectoryClient should be instantiated with https://graph.windows.net/.
AuthenticationContext should be called with https://login.microsoftonline.com/.
Though that wouldn't manifest itself as the method hanging nor as a successful request, that was the only change I had to make to your code for it to work for me; otherwise it would return a Not Found error.
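For clarity, that separation would look roughly like the hedged sketch below; the tenant, client ID, and secret are placeholders:

// Hypothetical sketch (names/values are placeholders): the Graph client is built
// on the graph.windows.net endpoint, while tokens are requested from
// login.microsoftonline.com for the same tenant.
using System;
using System.Threading.Tasks;
using Microsoft.Azure.ActiveDirectory.GraphClient;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

static class GraphClientFactory
{
    private const string Tenant = "contoso.onmicrosoft.com";

    public static ActiveDirectoryClient Create()
    {
        return new ActiveDirectoryClient(
            new Uri("https://graph.windows.net/" + Tenant),
            async () => await GetAppTokenAsync());
    }

    private static async Task<string> GetAppTokenAsync()
    {
        var authContext = new AuthenticationContext("https://login.microsoftonline.com/" + Tenant);
        var result = await authContext.AcquireTokenAsync(
            "https://graph.windows.net/",
            new ClientCredential("clientId", "clientSecret"));
        return result.AccessToken;
    }
}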
I've had similar issues with the Graph API library when using the Result property; try changing your call to this:
IEnumerable<string> groups = await GetGroupsAsync(activeDirectoryClient, "currentUserObjectId");
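Blocking on .Result from code that has a synchronization context is a common cause of exactly this kind of hang, so awaiting the call (which requires the calling method to be async) usually resolves it. A small hypothetical sketch, reusing the GetGroupsAsync method from the question:

// Hypothetical sketch: await the call instead of blocking on .Result.
private static async Task PrintGroupsAsync(ActiveDirectoryClient activeDirectoryClient)
{
    IEnumerable<string> groups =
        await GetGroupsAsync(activeDirectoryClient, "currentUserObjectId");

    foreach (var groupId in groups)
    {
        Console.WriteLine(groupId);
    }
}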