I'm building a little script to import Facebook data, fetched directly from the Facebook API, into a Google Sheet. I've been able to successfully authenticate and pass the response of the API call to my Google Apps Script log. However, I don't really know how I should proceed to actually get this data into my sheet.
The following script allows me to log the data from the FB API:
function myFunction() {
  var response = UrlFetchApp.fetch("https://graph.facebook.com/v2.3/page_id/insights/page_impressions?access_token=mytoken");
  var json = response.getContentText();
  var rawdata = JSON.parse(json); // Utilities.jsonParse is deprecated, so one JSON.parse is enough
  Logger.log(rawdata);
Then I think what I should do is push the data I need into a new array with the push method:
  var data = [];
  data.push(rawdata.???); // don't really know the name I need to access the data I want
  SpreadsheetApp.getActiveSheet().appendRow(data);
}
My question:
Since I don't really know how to read the data fetched from the Facebook API, I don't really know how to push it into my data array. Here is the result of my log:
[16-11-21 10:01:33:622 PST] {data=[{period=day,
values=[{end_time=2016-11-17T08:00:00+0000, value=26023596},
{end_time=2016-11-18T08:00:00+0000,value=24447386},
{end_time=2016-11-19T08:00:00+0000, value=31057386}],
name=page_impressions, description=Daily: The number of impressions seen of any content associated with your Page. (Total Count),
id=page_id/insights/page_impressions/day,
title=Daily Total Impressions},
{period=week,
values=[{end_time=2016-11-17T08:00:00+0000, value=233007217},{end_time=2016-11-18T08:00:00+0000, value=200263630},
{end_time=2016-11-19T08:00:00+0000, value=194289364}],
name=page_impressions,
description=Weekly: The number of impressions seen of any content associated with your Page. (Total Count),
id=page_id/insights/page_impressions/week, title=Weekly Total Impressions},
{period=days_28, values=[{end_time=2016-11-17T08:00:00+0000, value=867302439},
{end_time=2016-11-18T08:00:00+0000, value=868201060}, {end_time=2016-11-19T08:00:00+0000, value=874965509}],
name=page_impressions, description=28 Days: The number of impressions seen of any content associated with your Page. (Total Count),
id=page_id/insights/page_impressions/days_28, title=28 Days Total Impressions}],
paging={next=https://graph.facebook.com/v2.3/page_id/insights/page_impressions?access_token=my_token,
previous=https://graph.facebook.com/v2.3/page_id/insights/page_impressions?access_token=my_token}}
If an example is needed to illustrate an answer, let's say I want the Daily Total Impressions for the three days Facebook gave me.
I hope my issue is clear enough. Not sure which tags I should use for this question; correct me if I'm wrong.
Thanks!
You need to decide what data to push into an array to then drop into a sheet. I would first drop your raw log output into a JSON prettifier so it's easy to see the structure. Then take a look at how to manipulate JSON using Apps Script.
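For your Daily Total Impressions example, a minimal sketch might look like this (assuming, as in the log above, that the "day" period is the first element of data; the page_id and token are placeholders):
function importDailyImpressions() {
  var response = UrlFetchApp.fetch("https://graph.facebook.com/v2.3/page_id/insights/page_impressions?access_token=mytoken");
  var rawdata = JSON.parse(response.getContentText());
  var daily = rawdata.data[0].values; // the "day" period entry, per the log above
  var sheet = SpreadsheetApp.getActiveSheet();
  daily.forEach(function(entry) {
    sheet.appendRow([entry.end_time, entry.value]); // one row per day
  });
}
Each appendRow call takes an array of cell values, so each day lands on its own row with the date in the first column and the impression count in the second.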
I am aiming to take a file a user attaches through a Lightning Component and create a document object containing the data.
So far I have overcome the request size limits by chunking the data being uploaded into 1MB chunks. When the Apex Aura method receives these chunks of data it will either create a new document (if it is the first chunk), or will retrieve the existing document and add the new chunk to the end.
Data is received Base64 encoded, and then decoded server-side.
As the document data is stored as a Blob, the original file contents will be read as a String, and then appended with the chunk received. The new contents are then converted back into a Blob to be stored within the ContentVersion object.
The problem I'm having is that strings in Apex have a maximum length of around 6,000,000 characters. Whenever the file size exceeds 6 MB, this limit is hit during the concatenation, and the file upload halts.
I have attempted to avoid this limit by converting the Blob to a String only when necessary for the concatenation (as suggested here: https://developer.salesforce.com/forums/?id=906F00000008w9hIAA) but this hasn't worked. I'm guessing it was patched, because it's still technically allocating a string larger than the limit.
The code's really simple when appending so far:
ContentVersion originalDocument = [SELECT Id, VersionData FROM ContentVersion WHERE Id = :<existing_file_id> LIMIT 1];
Blob originalData = originalDocument.VersionData;
Blob appendedData = EncodingUtil.base64Decode(<base_64_data_input>);
// Both blobs are turned into strings here, which is where the heap limit is hit
Blob newData = Blob.valueOf(originalData.toString() + appendedData.toString());
originalDocument.VersionData = newData;
You will have a hard time with it.
You could try offloading the concatenation to an asynchronous process (@future/Queueable/Schedulable/Batchable); they get 12 MB of heap instead of 6. Could buy you some time.
You could try cheating by embedding an iframe (Visualforce or a lightning:container tag? Or maybe a "canvas app") that would grab your file and do some manual JavaScript magic calling the normal REST API for document upload: https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/dome_sobject_insert_update_blob.htm (the last code snippet is about multiple documents). Maybe jsforce?
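A rough, untested sketch of that JavaScript route (the access token, instance URL, and file variables are assumptions you'd supply from your container; for very large files the REST API also offers a multipart route):
// Hypothetical sketch: upload straight to the Salesforce REST API from the
// browser, so no Apex heap is involved at all.
function uploadFile(instanceUrl, accessToken, fileName, base64Data) {
  return fetch(instanceUrl + '/services/data/v52.0/sobjects/ContentVersion', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer ' + accessToken,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      Title: fileName,
      PathOnClient: fileName,
      VersionData: base64Data // the REST API accepts base64 in the JSON body
    })
  }).then(function (res) { return res.json(); });
}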
Can you upload it somewhere else (SharePoint? Heroku?) and have that system call into SF to push the files (no Apex = no heap size limit)? Or even look up "Files Connect".
Can you send an email with attachments? Crude, but if you write a custom Email-to-Case handler class you'll have 36 MB of heap.
You wrote "we needed multiple files to be uploaded and the multi-file-upload component provided doesn't support all extensions". That may be caused by these:
In Experience Builder sites, the file size limits and types allowed follow the settings determined by site file moderation.
lightning-file-upload doesn't support uploading multiple files at once on Android devices.
If the Don't allow HTML uploads as attachments or document records security setting is enabled for your organization, the file uploader cannot be used to upload files with the following file extensions: .htm, .html, .htt, .htx, .mhtm, .mhtml, .shtm, .shtml, .acgi, .svg.
I'm using Postman to retrieve data from our project management platform (Teamwork), which provides collections.
I retrieve a first list of project IDs from the GET request, using the following code in the Tests tab of that first GET request:
var jsonData = JSON.parse(responseBody);
var list = jsonData.projects.length;
var a = [];
for (var i = 0; i < list; i++) {
    var counter = jsonData.projects[i];
    var IDs = counter.id; // "var" keeps IDs out of the global scope
    a.push(IDs);
}
postman.setEnvironmentVariable("id", a);
That creates a variable id which contains a list of IDs.
After that, I want to go through each of these IDs in the following request (replacing {id}):
{{Domain}}/projects/{id}/rates.json
Domain is set in the environment variables and is working.
What code do I need, and where do I put it (pre-request script? Tests?), so I can go through the list? That second GET request would give me the employee rates in each project (identified by those IDs).
Thanks for your help!
If you want to use the list of variables you extract from the first GET in URLs for subsequent calls, then I think you would need to use the pm.sendRequest option in the Tests tab of your first GET.
There is a really good example in this thread:
How to run one request from another using Pre-request Script in Postman
Note: the pre-request tab is executed before the API call is made and the Tests tab is executed after the API call is made.
Also, "postman." is the old API; you would benefit from using the newer "pm." API, for example:
pm.environment.set("variable_key", "variable_value");
More info on this can be found here:
https://learning.postman.com/docs/sending-requests/variables/
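Putting that together, a rough sketch of looping over the IDs inside the Tests tab of your first GET might look like this (untested; it assumes the {{Domain}} variable and the rates endpoint from the question, and the array a built above in the same script):
// After building the array "a" of project IDs in the Tests tab:
a.forEach(function (id) {
    pm.sendRequest(pm.environment.get("Domain") + "/projects/" + id + "/rates.json", function (err, res) {
        if (err) {
            console.log(err); // request-level failure
        } else {
            console.log("Rates for project " + id + ":", res.json());
        }
    });
});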
Actually I am new to React, and I am trying to create an event app in which a user can join an event.
Here is the code for joining an event:
export const JoinEvent = (id) => {
    return async dispatch => {
        // Download the event document to get the current attendees
        let data = await firebase.firestore().collection('Events').doc(id).get()
        let tmpArray = data.data()
        let currentUser = firebase.auth().currentUser
        let newArray = tmpArray.PeopleAttending
        // Merge the current user into the attendees map and write it all back
        await firebase.firestore().collection('Events').doc(id).update({
            PeopleAttending : {...newArray, [currentUser.uid]: {displayName : currentUser.displayName}}
        })
    }
}
Actually, I have created an action; basically, in JoinEvent the id of the particular event which was clicked is passed in.
My Firestore structure looks like this: each document in the Events collection holds a PeopleAttending map.
So basically I have to download the whole data, store it in a local array, add the new user, and then finally update the document.
Since I am downloading the whole data here, is there any way to simply add a new object without downloading the whole data?
Thank you!
You are doing it wrong. The Firestore limit is a maximum size of 1 MiB (1,048,576 bytes) per document, so sooner or later you're going to reach that limit if you keep adding data like this. It may seem like you're never going to reach it, but it's very unsafe to store data that way. You can check Firestore query using an object element as parameter for how to query objects in Firestore documents, but I suggest you don't do it that way.
The proper way to do it is to create a subcollection PeopleAttending on each document inside the Events collection and then use that collection to store the data.
You can also try a document set with merge or mergeFields, as documented here: https://googleapis.dev/nodejs/firestore/latest/DocumentReference.html#set and here: https://stackoverflow.com/a/46600599/1889685.
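A minimal sketch of the subcollection approach, reusing the JoinEvent action from the question (untested):
export const JoinEvent = (id) => {
    return async dispatch => {
        let currentUser = firebase.auth().currentUser
        // One small document per attendee: nothing has to be read first,
        // and the 1 MiB per-document limit is never approached.
        await firebase.firestore()
            .collection('Events').doc(id)
            .collection('PeopleAttending').doc(currentUser.uid)
            .set({ displayName: currentUser.displayName })
    }
}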
When uploading to GCS (Google Cloud Storage) using the BlobStore's createUploadURL function, I can provide a callback together with header data that will be POSTed to the callback URL.
There doesn't seem to be a way to do that with GCS's signed URLs.
I know there is Object Change Notification, but that won't allow the user to provide upload-specific information in the header of a POST, the way it is possible with createUploadURL's callback.
My feeling is, if createUploadURL can do it, there must be a way to do it with signed URLs, but I can't find any documentation on it. I was wondering if anyone may know how createUploadURL achieves that callback-calling behavior.
PS: I'm trying to move away from createUploadURL because of the __BlobInfo__ entities it creates, which for my specific use case I do not need, and somehow seem to be indelible and are wasting storage space.
Update: It worked! Here is how:
Short Answer: It cannot be done with PUT, but can be done with POST
Long Answer:
If you look at the signed-URL page, in front of HTTP_Verb, under Description, there is a subtle note that that page is only relevant to GET, HEAD, PUT, and DELETE; POST is a completely different game. I had missed this, but it turned out to be very important.
There is a whole page of HTTP headers that does not list an important header that can be used with POST; that header is success_action_redirect, as voscausa correctly answered.
On the POST page, Google "strongly recommends" using PUT unless you are dealing with form data. However, POST has a few nice features that PUT does not have. They may worry that POST gives us too many strings to hang ourselves with.
But I'd say it is totally worth dropping createUploadURL, and writing your own code to redirect to a callback. Here is how:
Code:
If you are working in Python, voscausa's code is very helpful.
I'm using apejs to write JavaScript in a Java app, so my code looks like this:
var json = {}; // collects the fields that will be sent to the client
var exp = new Date();
exp.setTime(exp.getTime() + 1000 * 60 * 100); // 100 minutes
json['GoogleAccessId'] = String(appIdentity.getServiceAccountName());
json['key'] = keyGenerator();
json['bucket'] = bucket;
json['Expires'] = exp.toISOString();
json['success_action_redirect'] = "https://" + request.getServerName() + "/test2/";
json['uri'] = 'https://' + bucket + '.storage.googleapis.com/';
var policy = {'expiration': json.Expires
    , 'conditions': [
        ["starts-with", "$key", json.key],
        {'Expires': json.Expires},
        {'bucket': json.bucket},
        {"success_action_redirect": json.success_action_redirect}
    ]
};
var plain = StringToBytes(JSON.stringify(policy));
json['policy'] = String(Base64.encodeBase64String(plain));
var result = appIdentity.signForApp(Base64.encodeBase64(plain, false));
json['signature'] = String(Base64.encodeBase64String(result.getSignature()));
The code above first provides the relevant fields.
Then it creates a policy object, stringifies it, and converts it into a byte array (you can use .getBytes in Java; I had to write a function for JavaScript, shown at the end).
A Base64-encoded version of this array populates the policy field.
Then the policy is signed using the appIdentity package. Finally, the signature is Base64 encoded, and we are done.
On the client side, all members of the json object will be added to the form, except the uri, which is the form's address.
var formData = new FormData(document.forms.namedItem('upload'));
var blob = new Blob([thedata], {type: 'application/json'});
var keys = ['GoogleAccessId', 'key', 'bucket', 'Expires', 'success_action_redirect', 'policy', 'signature'];
for (var field in keys)
    formData.append(keys[field], url[keys[field]]);
formData.append('file', blob);
var rest = new XMLHttpRequest();
rest.open('POST', url.uri);
rest.onload = callback_function;
rest.send(formData);
If you do not provide a redirect, the response status will be 204 for success. But if you do redirect, the status will be 200. If you get a 403 or 400, something about the signature or policy may be wrong. Look at the responseText; it is often helpful.
A few things to note:
Both POST and PUT have a signature field, but these mean slightly different things. In the case of POST, this is a signature of the policy.
PUT has a base URL which contains the key (object name), but the URL used for POST may only include the bucket name.
PUT requires the expiration as seconds from the UNIX epoch, but POST wants it as an ISO string.
A PUT signature should be URL-encoded (in Java: by wrapping it with a URLEncoder.encode call). But for POST, Base64 encoding suffices.
By extension, for POST use Base64.encodeBase64String(result.getSignature()), and do not use the Base64.encodeBase64URLSafeString function.
You cannot pass extra headers with the POST; only those listed on the POST page are allowed.
If you provide a URL for success_action_redirect, it will receive a GET with the key, bucket, and eTag.
The other benefit of using POST is that you can provide size limits; a sketch follows this list. With PUT, however, if a file breaches your size restriction, you can only delete it after it has been fully uploaded, even if it is multiple terabytes.
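For instance, a size limit can be expressed as one more condition in the policy document built above (a sketch; content-length-range takes minimum and maximum byte counts):
var policy = {'expiration': json.Expires
    , 'conditions': [
        ["starts-with", "$key", json.key],
        {'bucket': json.bucket},
        {"success_action_redirect": json.success_action_redirect},
        ["content-length-range", 0, 10485760] // reject any upload over 10 MB
    ]
};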
What is wrong with createUploadURL?
The method above is a manual createUploadURL.
But:
You don't get those __BlobInfo__ objects, which create many indexes and are indelible. This irritates me as it wastes a lot of space (which reminds me of a separate issue: issue 4231. Please go give it a star).
You can provide your own object name, which helps create folders in your bucket.
You can provide different expiration dates for each link.
For the very, very few JavaScript App Engine developers:
function StringToBytes(sz) {
    var map = function(x) { return x.charCodeAt(0); }; // "var" avoids leaking a global
    return sz.split('').map(map);
}
You can include success_action_redirect in a policy document when you use the GCS POST object API.
Docs here: https://cloud.google.com/storage/docs/xml-api/post-object
Python example here: https://github.com/voscausa/appengine-gcs-upload
Example callback result:
def ok(self):
    """ GCS upload success callback """
    logging.debug('GCS upload result : %s' % self.request.query_string)
    bucket = self.request.get('bucket', default_value='')
    key = self.request.get('key', default_value='')
    key_parts = key.rsplit('/', 1)
    folder = key_parts[0] if len(key_parts) > 1 else None
A solution I am using is to turn on Object Change Notifications. Any time an object is added, a POST is sent to a URL, in my case a servlet in my project.
In the doPost() I get all the info about the object added to GCS, and from there I can do whatever I need.
This has worked great in my App Engine project.
I am designing a forum and have a layout like this on my Firebase:
root
|-posts
|-postID1
|-creator: "userOne"
|-creatorUID: "simplelogin:1"
|-text: "Some Text"
|-postID2
|-creator: "userTwo"
|-creatorUID: "simplelogin:2"
|-text: "Some Other Text"
|-profile
|-simplelogin:1
|-firstName: "John"
|-user: "userOne"
|-simplelogin:2
|-firstName: "Sue"
|-user: "userTwo"
On my forum page, I simply use an Angular ng-repeat to get all of the posts from Firebase and list them out. I also want to print the first name of whoever created each post, but right now I can only access {{ post.creator }}, which just gives the username of the person who posted. How can I link a post's creator (or creatorUID) to the firstName field of that person's profile?
If you're just displaying the user's firstName, I would place the user's name in the postIDx object itself; a sketch follows below.
This would be quicker and produce fewer requests to Firebase than going back and forth for each post to get the user's first name.
More information on structuring data and best practices can be found here: https://www.firebase.com/docs/web/guide/structuring-data.html
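As a sketch of that denormalization (the creatorFirstName field is hypothetical; you would write it whenever a post is created):
var fbRef = new Firebase('firebase path');
// Store a copy of the creator's first name on the post itself, so
// ng-repeat can show {{ post.creatorFirstName }} with no extra lookups.
fbRef.child('posts').push({
    creator: 'userOne',
    creatorUID: 'simplelogin:1',
    creatorFirstName: 'John',
    text: 'Some Text'
});
The trade-off is that if a user ever changes their first name, you would have to update every post they created.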
Update from response:
If you wanted to get the user details, then within every request to the postIDx you'd need to do something similar to this (not tested, quick mock-up):
var fbRef = new Firebase('firebase path'),
    postDetailsObject = {};

fbRef.child('posts').once('value', function(snapshot) {
    // loop through each post
    snapshot.forEach(function(childSnapshot) {
        var postDetails = childSnapshot.val();
        postDetailsObject.post = postDetails;
        // look up the creator's profile for this post
        fbRef.child('profile/' + postDetails.creatorUID).once('value', function(profileData) {
            postDetailsObject.profile = profileData.val(); // .val() unwraps the snapshot
        });
    });
});
Then return the postDetailsObject to Angular so you can loop through the single object.