I reviewed the following documentation from Google on how to optimize existing Apps Script code:
https://developers.google.com/apps-script/guides/support/best-practices
In particular, the 'Use batch operations' section seems most appropriate for my use case, where the optimal strategy is to batch all the reading into one operation and then do all the writing in a separate operation, rather than cycling between read and write calls.
Here is an example of inefficient code, as given by the URL above:
// DO NOT USE THIS CODE. It is an example of SLOW, INEFFICIENT code.
// FOR DEMONSTRATION ONLY
var cell = sheet.getRange('a1');
for (var y = 0; y < 100; y++) {
  xcoord = xmin;
  for (var x = 0; x < 100; x++) {
    var c = getColorFromCoordinates(xcoord, ycoord);
    cell.offset(y, x).setBackgroundColor(c);
    xcoord += xincrement;
  }
  ycoord -= yincrement;
  SpreadsheetApp.flush();
}
Here is an example of efficient and improved code:
// OKAY TO USE THIS EXAMPLE or code based on it.
var cell = sheet.getRange('a1');
var colors = new Array(100);
for (var y = 0; y < 100; y++) {
  xcoord = xmin;
  colors[y] = new Array(100);
  for (var x = 0; x < 100; x++) {
    colors[y][x] = getColorFromCoordinates(xcoord, ycoord);
    xcoord += xincrement;
  }
  ycoord -= yincrement;
}
sheet.getRange(1, 1, 100, 100).setBackgroundColors(colors);
Now, for my particular use case:
Instead of storing values in an array and then writing/modifying them in a separate operation from the read, I want to create multiple Google documents and replace placeholders within each document.
For context:
I'm writing a script that reads a spreadsheet of students, with files to modify for each student, which are later sent as a mail merge. For example, there are 3 master files. Each student gets a copy of the 3 master files, and .replaceText() is used on each copy to fill in the placeholder fields.
Here are the relevant snippets of my code:
function filesAndEmails() {
  // Import the Spreadsheet application library.
  const UI = SpreadsheetApp.getUi();
  // Try calling the functions below; catch any error messages that occur to display as an alert window.
  try {
    // Prompt and record user's email draft template.
    // var emailLinkID = connectDocument(
    //   UI,
    //   title="Step 1/2: Google Document (Email Draft) Connection",
    //   dialog=`What email draft template are you referring to?
    //   This file should contain the subject line, name and body.
    //   Copy and paste the direct URL link to the Google Docs:`,
    //   isFile=true
    // );
    // TEST
    var emailLinkID = "REMOVED FOR PRIVACY";
    if (emailLinkID != -1) {
      // Prompt and record user's desired folder location to store generated files.
      // var fldrID = connectDocument(
      //   UI,
      //   title="Step 2/2: Google Folder (Storage) Connection",
      //   dialog=`Which folder would you like all the generated file(s) to be stored at?
      //   Copy and paste the direct URL link to the Google folder:`,
      //   isFile=false
      // );
      // TEST
      var fldrID = DriveApp.getFolderById("REMOVED FOR PRIVACY");
      // Retrieve data set from database.
      var sheet = SpreadsheetApp.getActive().getSheetByName(SHEET_1);
      // Range of data must include header row for proper key mapping.
      var arrayOfStudentObj = objectify(sheet.getRange(3, 1, sheet.getLastRow() - 2, 11).getValues());
      // Establish array of attachment objects for filename and file url.
      var arrayOfAttachObj = getAttachments();
      // Opportunities for optimization begin here.
      // Iterate through the array of student objects to extract each mapped key value for Google document insertion and emailing.
      // Time Complexity: O(n^3)
      arrayOfStudentObj.forEach(function(student) {
        if (student[EMAIL_SENT_COL] == '') {
          try {
            arrayOfAttachObj.forEach(function(attachment) {
              // All generated files will contain this filename format, followed by the attachment filename/description.
              var filename = `${student[RYE_ID_COL]} ${student[FNAME_COL]} ${student[LNAME_COL]} ${attachment[ATTACH_FILENAME_COL]}`;
              // Create a copy of the current iteration/file for the given student.
              var file = DocumentApp.openById(DriveApp.getFileById(getID(attachment[ATTACH_FILEURL_COL], isFile=false)).makeCopy(filename, fldrID).getId());
              // Replace and save all custom fields for the given student at this current iteration/file.
              replaceCustomFields(file, student);
            });
          } catch (e) {
          }
        }
      });
      UI.alert("Script successfully completed!");
    }
  } catch (e) {
    UI.alert("Error Detected", e.message + "\n\nContact a developer for help.", UI.ButtonSet.OK);
  }
}
/**
 * Replaces all fields specified by 'attributesArray' in the given student's file.
 * @param {Object} file A single file object in which all custom fields are replaced.
 * @param {Object} student A single student object that contains all custom field attributes.
 */
function replaceCustomFields(file, student) {
  // Iterate through each student attribute (first name, last name, etc.) to change each field.
  attributesArray.forEach(function(attribute) {
    file.getBody()
        .replaceText(attribute, student[attribute]);
  });
  // Must save and close the file to finalize changes prior to moving on to the next student object.
  file.saveAndClose();
}
/**
 * Processes the attachments sheet for filename and file ID.
 * @return {Array} An array of attachment file objects.
 */
function getAttachments() {
  var files = SpreadsheetApp.getActive().getSheetByName(SHEET_2);
  return objectify(files.getRange(1, 1, files.getLastRow(), 2).getValues());
}
/**
 * Creates student objects to contain the object attributes for each student based on
 * the header row.
 * @param {Array} array A 2D heterogeneous array that includes the header row for attribute key mapping.
 * @return {Array} An array of student objects.
 */
function objectify(array) {
  var keys = array.shift();
  var objects = array.map(function (values) {
    return keys.reduce(function (o, k, i) {
      o[k] = values[i];
      return o;
    }, {});
  });
  return objects;
}
To summarize my code: I read the Google spreadsheet of students as an array of objects, so each student has attributes like their first name, last name, email, etc. I have done the same for the file attachments that should be included for each student. Currently, the forEach loop iterates through each student object, creates copies of the master file(s), replaces placeholder text in each file, then saves them in a folder. Eventually, I will send these file(s) to each student with the MailApp. However, because of the repeated external calls made while creating file copies for each student, the execution time is understandably very slow...
TLDR
Is it still possible to optimize my code using "batch operations" when my use case requires multiple DriveApp calls to create the copies of the files for modification? Unlike reading raw values into an array and modifying them in a later operation, I don't think I can simply store document objects in an array and then modify them at a later stage. Thoughts?
You could use the batchUpdate method of the Google Docs API.
See Tanaike's answer just to get an idea of what the request object looks like.
All you need to do in your Apps Script now is build that request object for each of the files.
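For illustration, here is a minimal sketch of what building those requests could look like with the Docs advanced service (this assumes the Docs API advanced service is enabled for the project; the function name is made up, and it reuses the attributesArray and student names from your code):
// Minimal sketch, assuming the Docs advanced service ("Docs") is enabled and that
// attributesArray holds the placeholder strings used in the master documents.
function replacePlaceholdersInOneCall(docId, student) {
  // One replaceAllText request per placeholder attribute.
  var requests = attributesArray.map(function(attribute) {
    return {
      replaceAllText: {
        containsText: { text: attribute, matchCase: true },
        replaceText: String(student[attribute])
      }
    };
  });
  // A single API call applies every replacement to the copied document.
  Docs.Documents.batchUpdate({ requests: requests }, docId);
}
Compared with opening each copy through DocumentApp and calling replaceText once per attribute, this sends one request per document, which is the same read-once/write-once idea as the batch example above.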
Note:
You can also further optimize your code by updating:
var arrayOfStudentObj = objectify(sheet.getRange(3, 1, sheet.getLastRow() - 2, 11).getValues());
into:
// Keep only the students whose email-sent column is still blank, so rows that have
// already been emailed are never iterated over at all.
var arrayOfStudentObj = objectify(sheet.getRange(3, 1, sheet.getLastRow() - 2, 11).getValues())
  .filter(student => student[EMAIL_SENT_COL] === '');
This way, you can remove the if (student[EMAIL_SENT_COL] == '') check inside the loop and reduce the number of iterations.
Resources:
Google Docs Apps Script Quickstart
Google Docs REST API
Whenever I open a PDF file in Illustrator for editing, there are a lot of ungrouped and uncategorized elements in it.
So I tried to select multiple elements with a specific name using the script below, but since the names of the elements are between angle brackets ("<someName>"), the script won't select them:
function selectPageItemsByName(items, name) {
  for (var i = 0; i < items.length; i++) {
    var item = items[i];
    if (item.name === name) {
      item.selected = true;
    }
  }
}

function main() {
  var document = app.activeDocument;
  var name = '<someFile>';
  document.selection = null;
  selectPageItemsByName(document.pageItems, name);
}

main();
Femkeblanko from the Adobe Community says: "Items with angle brackets in their label (unless user-created) are unnamed. They correspond to an empty string, i.e. ""."
If I remove the brackets from the names of the elements, the script works, but I have a lot of elements and that takes time.
So, isn't there a way to solve this?
this is a pretty creative way:
// Select->Objects->Clipping Mask
app.executeMenuCommand("Clipping Masks menu item");
// Edit->Clear
app.executeMenuCommand("clear");
but it isn't really documented very well.
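Alternatively, going off the note quoted in the question that unnamed items report an empty string for their name, a more conventional sketch is to match the empty string instead of the bracketed label. Keep in mind this selects every unnamed item, not just one kind, so you may want to filter further (for example by typename); the function name here is made up:
// Sketch: select items whose name is empty; Illustrator shows these as "<...>" in the Layers panel.
function selectUnnamedItems(items) {
  for (var i = 0; i < items.length; i++) {
    // Unnamed items report "" here, not the bracketed label shown in the panel.
    if (items[i].name === '') {
      items[i].selected = true;
    }
  }
}

var doc = app.activeDocument;
doc.selection = null;
selectUnnamedItems(doc.pageItems);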
some links for future reference:
Where is the perfect reference of adobe illustrator script?
https://ai-scripting.docsforadobe.dev/index.html
https://ten-artai.com/illustrator-ccver-22-menu-commands-list/
https://github.com/ten-A/AiMenuObject
I'm creating a pickList with a list of attributes that the user can choose from to build their own personal report. But I'm having quite some trouble creating this XLSX, for two reasons.
First, my XLSX.utils.json_to_sheet call is ignoring blank fields (""), and I can't have that because it would disorganize the whole report.
Second, because it's a dynamic sheet, I can't know ahead of time how many columns will be created...
Here's what I've done so far, regarding the XLSX creation part:
exportTable() {
  this.createObjectWithColumnsSelected(); // ignore this, I'm only creating a reference object based on the user's column choices
  this.manipulateExportableData(); // manipulating the whole table data, to be formatted according to the reference object
  const worksheet = XLSX.utils.json_to_sheet(this.customersTable);
  this.changeHeaderWorksheet(worksheet);
}

converte(): string[] {
  const alph = "ABCDEFGHIJKLMNOPQRSTUVWXYZABCDEFGHIJKLMNOPQRSTUVWXYZ";
  return alph.split('');
}

changeHeaderWorksheet(worksheet: WorkSheet): void {
  const alphabet = this.converte();
  for (let i = 0; i < this.cols.length; i++) {
    if (i >= 26) {
      console.log(this.cols[i].header);
      worksheet[`A${alphabet[i]}1`] = this.cols[i].header; // I think this is quite wrong
    } else {
      console.log(this.cols[i].header);
      worksheet[`${alphabet[i]}1`] = this.cols[i].header; // I think this is quite wrong
    }
  }
}
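I also wondered whether the header option of XLSX.utils.json_to_sheet could pin the column order instead of rewriting the header cells by hand. Something like the sketch below (the method name is made up, and it assumes the rows in customersTable are keyed by the same header strings as this.cols), but I'm not sure how to combine it with the blank-field problem:
exportTableWithFixedHeaders() {
  // Sketch only: build the column order from the user's selection.
  const selectedHeaders = this.cols.map(col => col.header);
  // json_to_sheet accepts a header array, so the column order follows the user's
  // selection rather than whichever keys happen to appear in the first row object.
  const worksheet = XLSX.utils.json_to_sheet(this.customersTable, { header: selectedHeaders });
  return worksheet;
}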
Can you please help me with writing a Gamma script for logging data into a database table? The tags are inside a domain created in OPC DA of Cogent DataHub. The only condition that needs to be satisfied is that the script should log all points in the domain every second, along with their values and timestamps.
require ("Application");
require ("ODBCThreadSupport");
require ("Time");
require ("Quality");
class LogData Application
{
    DSN = "your ODBC DSN name";                // The DSN name to use for the database connection
    username = "Database_User";                // The user name for connecting to the database
    password = "*****";                        // The password for connecting to the database
    tablename = "table1";                      // The name of the database table
    cachefile = "C:\Users\AppData\cache.txt";  // Base name for the disk cache file
    domain = "Domain name";                    // The domain in which to log all points
    tableclass;
    thread;
    mappedPoints = new Dictionary();
    prevcount = 0;
}
/* If there is something we only want to perform on the first connection, we can test
 * is_first_connect to perform the code only once.
 */
method LogData.onConnect()
{
    princ("Connection succeeded\n");
    if (.thread.is_first_connect)
    {
        // Start the sequence defined by the AddInitStage calls in the constructor
        .thread.BeginAsyncInit();
    }
}
/* If the connection fails after having been
 * connected, this method is called.
 */
method LogData.onConnectFail()
{
    princ("Connection closed: ", SQLResult.Description, "\n");
}
/* Map the table in the set of table definitions that matches the name in .tablename
 * into a Gamma class. This lets us easily convert between class instances and rows
 * in the table.
 */
method LogData.mapTable(name, tabledefinitions)
{
    //princ("Mapping table\n");
    .tableclass = .thread.ClassFromTable(name, tabledefinitions);
    //princ("Table class = ", .tableclass, "\n");
}
method LogData.startLogging()
{
    .registerPoints();
}
/* Set up the timer or event handler functions to write to the table. */
method LogData.registerPoints()
{
    /* Find all points in the domain */
    local info = datahub_domaininfo(.domain)[0];
    if (info.n_points != .prevcount)
    {
        local points = datahub_points(.domain, nil, nil);
        local pointsym;
        princ(info.n_points - .prevcount, " new points are being added to logging\n");
        with point in points do
        {
            // Filter out branch points
            if ((point.flags & 0x30) == 0)
            {
                pointsym = symbol(string(point.domain, ":", point.name));
                if (!.mappedPoints.contains(pointsym))
                {
                    local PointName = string(pointsym);
                    .TimerEvery(01, `(#self).writeData(#PointName));
                    //.mappedPoints.add(pointsym, pointsym);
                }
            }
        }
        .prevcount = info.n_points;
    }
}
method LogData.writeData(pointsymbol)
{
    local row = new (.tableclass);
    local pttime, ptltime;
    local timestring;
    local point;
    // Generate a timestamp in database-independent format to the millisecond.
    // Many databases strip the milliseconds from a timestamp, but it is harmless
    // to provide them in case the database can store them.
    point = PointMetadata(pointsymbol);
    //princ(point, "\n");
    if (point && number_p(point.value))
    {
        pttime = WindowsTimeToUnixTime(point.timestamp);
        //princ(point, "\n");
        ptltime = localtime(pttime);
        //princ(ptltime, "\n");
        if (!ptltime)
            ptltime = localtime(0);
        timestring = format("{'%04d-%02d-%02d %02d:%02d:%02d'}",
                            ptltime.year + 1900, ptltime.mon + 1, ptltime.mday,
                            ptltime.hour, ptltime.min, ptltime.sec);
        //princ(timestring, "\n");
        // Fill the row. Since we mapped the table into a Gamma class, we can access
        // the rows in the column as member variables of the mapped class.
        row.ptname = string(pointsymbol);
        row.ptvalue = point.value;
        row.pttime = timestring;
        // Perform the insertion. In this case we are providing no callback on completion.
        .thread.Insert(row, nil);
    }
}
/* Write the 'main line' of the program here. */
method LogData.constructor ()
{
    // Create and configure the database connection object
    .thread = new ODBCThread();
    .thread.Configure(.DSN, .username, .password, STORE_AND_FORWARD, .cachefile, 0);
    // Query the table and map it to a class for each insertion. We want to run an asynchronous event
    // within the asynchronous initialization stage, so to do that we specify the special method
    // cbInitStage as the callback function of our asynchronous event (GetTableInfo). We deal with
    // the return from the GetTableInfo in the onSuccess argument of the init stage.
    .thread.AddInitStage(`(#.thread).GetTableInfo("", "", (#.tablename), "TABLE,VIEW",
                                                  `(#.thread).cbInitStage()),
                         `(#self).mapTable(#.tablename, SQLTables), nil);
    //.thread.AddInitStage(`(#.thread).GetTableInfo("", "", (#.tablename), "TABLE,VIEW",
    //                                              `(#self).mapTable(#.tablename, SQLTables)),
    //                     `(#.thread).cbInitStage(), nil);
    // Do not start writing data to the table until we have successfully created and mapped
    // the table to a class. If we wanted to start writing data immediately, then we would
    // create the table class beforehand instead of querying the database for the table
    // definition. Then, even if the database were unavailable we could still cache to the
    // local disk until the database was ready.
    .thread.AddInitStage(nil, `(#self).startLogging(), nil);
    // Set up the callback functions for various events from the database thread
    .thread.OnConnectionSucceeded = `(#self).onConnect();
    .thread.OnConnectionFailed = `(#self).onConnectFail();
    .thread.OnFileSystemError = `princ("File System Error: ", SQLResult, "\n");
    .thread.OnODBCError = `princ("ODBC Error: ", SQLResult, "\n");
    .thread.OnExecuteStored = nil;
    .thread.Start();
    // Create a menu item in the system tray that allows us to open a window to monitor
    // the performance of the ODBC thread. The menu strings can be edited as desired.
    .AddCustomSubMenu("ODBC Logging");
    .AddCustomMenuItem("Monitor Performance",
                       `(#.thread).CreateMonitorWindow((#self), "ODBC Monitor"));
    // Automatically update the point list every 1 second in case new points are added
    // to the domain.
    //.TimerEvery(01, `(#self).registerPoints());
}
/* Any code to be run when the program gets shut down. */
method LogData.destructor ()
{
    if (instance_p(.thread))
        destroy(.thread);
}

/* Start the program by instantiating the class. */
ApplicationSingleton (LogData);
The major parts of this Gamma script are the constructor, the destructor, the class, and its methods. The program first initializes an ODBC connection using the provided details and then writes each row of data using the 'registerPoints' and 'writeData' methods. Please see the comments in the program for additional details on each line.
How can I get data back as an array or generic list from a Firebase database in Unity3D without knowing ahead of time what the names (keys) of the children are?
I have been trying out the new Unity Firebase plugin, and I am having an issue figuring out how to get all the children in a specific location and put the names (keys) and the values into arrays or generic lists so that I can work on the data locally. Forgive me for being so new to Firebase and probably using bad techniques to do this; with the plugin being so new, it's pretty hard for me to get much outside help, as there are not a lot of docs and tutorials out there on Firebase Unity.
In this particular case I am trying to create "instant messaging" like functionality, without the use of Firebase messaging, and just using regular Firebase database stuff instead. It might have been easier to use Firebase messaging, but mostly for the sake of learning and customization I want to do this on my own with just the Firebase database.
I insert data into the database like this:
public void SendMessage(string toUser, string msg)
{
    Debug.Log(String.Format("Attempting to send message from {0} to {1}", username, toUser));
    DatabaseReference reference = FirebaseDatabase.DefaultInstance.GetReference("Msgs");
    string date = Magnet.M.GetCurrentDate();
    // send data to the DB
    reference.Child(toUser).Child(username).Child(date).SetValueAsync(msg);
    // user receiving message / user sending message > VALUE = "hello dude|20170119111325"
    UpdateUsers();
}
And then I try and get it back like this:
public string[] GetConversation(string userA, string userB)
{
    // get a conversation between two users
    string[] convo = new string[0];
    FirebaseDatabase.DefaultInstance.GetReference("Msgs").GetValueAsync().ContinueWith(task =>
    {
        Debug.Log("Getting Conversation...");
        if (task.IsFaulted || task.IsCanceled)
        {
            Debug.LogError("ERROR: Task error in GetConversation(): " + task.Exception);
        }
        else if (task.IsCompleted)
        {
            DataSnapshot snapshot = task.Result;
            string[] messagesA = new string[0], messagesB = new string[0];
            if (snapshot.HasChild(userA))
            {
                // userA has a record of a conversation with other users
                if (snapshot.Child(userA).HasChild(userB)) // userB has sent messages to userA before
                {
                    Debug.Log("Found childA");
                    long count = snapshot.Child(userA).Child(userB).ChildrenCount;
                    messagesA = new string[count];
                    var kids = snapshot.Child(userA).Child(userB).Children;
                    Debug.Log(kids);
                    for (int i = 0; i < count; i++)
                    {
                        // this won't work, but is how I would like to access the data
                        messagesA[i] = kids[i].Value.ToString(); // AGAIN.... will not work...
                    }
                }
            }
            if (snapshot.HasChild(userB))
            {
                if (snapshot.Child(userB).HasChild(userA)) // userA sent a message to userB before
                {
                    Debug.Log("Found childB");
                    long count = snapshot.Child(userB).Child(userA).ChildrenCount;
                    messagesA = new string[count];
                    var kids = snapshot.Child(userB).Child(userA).Children;
                    Debug.Log(kids);
                    // messy incomplete testing code...
                }
            }
            // HERE I WOULD ASSIGN ALL THE MESSAGES BETWEEN A AND B AS 'convo'...
        }
        Debug.Log("Done Getting Conversation.");
    });
    return convo;
}
But obviously this won't work, because DataSnapshot won't let me access its children by index like an array or generic list, and I can't figure out how to handle the data when I don't know the names (keys) of all the children and just want to get them out one by one, in any order. Since the children are named by the date/time they are entered into the DB, I won't know their names (keys) ahead of time, and I can't just say GetChild("20170101010101"), because that number is generated when the message is sent to the DB from any client.
FYI, the DB looks like the comment in SendMessage above: Msgs > receiving user > sending user > date key = message string.
Figured out the answer to your question. Here's my code snippet. Hope this helps!
void InitializeFirebase()
{
    FirebaseApp app = FirebaseApp.DefaultInstance;
    app.SetEditorDatabaseUrl("https://slol.firebaseio.com/");
    FirebaseDatabase.DefaultInstance
        .GetReference("Products").OrderByChild("category").EqualTo("livingroom")
        .ValueChanged += (object sender2, ValueChangedEventArgs e2) =>
        {
            if (e2.DatabaseError != null)
            {
                Debug.LogError(e2.DatabaseError.Message);
            }
            if (e2.Snapshot != null && e2.Snapshot.ChildrenCount > 0)
            {
                foreach (var childSnapshot in e2.Snapshot.Children)
                {
                    var name = childSnapshot.Child("name").Value.ToString();
                    text.text = name.ToString();
                    Debug.Log(name.ToString());
                    //text.text = childSnapshot.ToString();
                }
            }
        };
}
Firebase developer here.
Have you tried using Value on the top-level Snapshot? It should return an IDictionary whose values can themselves be lists or nested dictionaries. You will have to use some dynamic inspection to figure out what the values are.
I have an array of objects, each of which is assigned an ID when it is first created. I give the user the ability to visually reorder the objects, which changes their position in the array. They then have the option to save that order using a Flash SharedObject, or "cookie", and later, if they reopen the Flash file, I want them to be able to hit a button to restore that order. I'm just not sure what the syntax would be to set an object's index within the array. Here's my code:
VARIABLES:
var project_settings = SharedObject.getLocal("settings"); //saves all project settings for the next time the file is opened
var project_order:Array = []; //saves project order for the next time the file is opened
var project_display:Array = []; //saves whether each project should be displayed or hidden for the next time the file is opened
SAVE CODE:
function saveOrder() {
    for (var i = 0; i < project_array.length; i++) {
        project_order[i] = project_array[i].id;
        project_display[i] = project_array[i].projectThumb.thumbActive;
    }
    project_settings.data.order = project_order;
    project_settings.data.active = project_display;
    //trace(project_settings.data.active[1]);
    project_settings.flush(); // saves most recent "cookie"
}
RESTORE CODE:
function loadOrder() {
    for (var i = 0; i < project_array.length; i++) {
        /* NEED THE CODE THAT GOES HERE. BASICALLY, PROJECT_ARRAY[i] SHOULD BE THE ITEM
           WITH AN ID EQUAL TO PROJECT_SETTINGS.DATA.ORDER[i] */
    }
}
Something like this should work:
function loadOrder()
{
    var dict = new Dictionary();
    for (var i = 0; i < project_array.length; i++)
        dict[project_array[i].id] = project_array[i];

    project_array = [];
    for (var i = 0; i < project_settings.data.order.length; i++)
        project_array[i] = dict[project_settings.data.order[i]];
}
Just load in your array and sort on the ID. Something like this should work:
private function _loadArray():void
{
    // fill in your array
    project_array.sort( this._sortFunc );
}

// replace the * by whatever your object type is
private function _sortFunc( a:*, b:* ):int
{
    return a.id - b.id;
}
More info: http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/Array.html#sort()
Or even the sortOn() function (which might be easier) should work:
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/Array.html#sortOn()