I just made an IRC client in AS3, but the problem is that I don't have much experience with SharedObject. I need the best way to update an array when a user joins other channels.
For example, if I have this array:
var channels:Array = [];
I also have a function that gets the values when a user joins another channel, and when the user does that, the array must be updated without removing the values that already exist in it, but I fail at this every time. What is the best way to update the array so that, for the channels, it ends up looking like: "'#channel1', '#channel2', '#channel3'"? Sorry for my bad English, I am from the Netherlands.
To make it a bit clearer, this is one of the cases in the switch() for user commands:
case "JOIN":
if(sCmd.split(" ")[1] == "")
{
sendNoticeToServerAndChatroom("Geef een kanaal op.");
} else {
var getJoinChannel = sCmd.split(" ")[1];
sendCommand("JOIN", [getJoinChannel]);
// Here must be a code that sends every channels to a array <- (channels:array = [];)
goToRoom();
}
break;
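One way to do that (a minimal sketch, not the original code; the SharedObject name "ircChannels" and the helper addChannel() are just example names) is to push the channel only if it is not already in the array, and then write the whole array to a local SharedObject:

var so:SharedObject = SharedObject.getLocal("ircChannels");

function addChannel(channel:String):void
{
    // Only add the channel if it is not already there, so existing entries are kept.
    if (channels.indexOf(channel) == -1)
    {
        channels.push(channel);
        so.data.channels = channels; // persist the whole array
        so.flush();                  // write it to disk
    }
}

Inside the JOIN case, after sendCommand("JOIN", [getJoinChannel]), you would then call addChannel(getJoinChannel).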
Thanks!
I am trying to use Google Apps Script (instead of VBA, which I am more used to) and have now managed to create a loop over different spreadsheets (not only different sheets in one document) using the forEach function.
(I tried with a for (r=1; r=lastRow; r++) loop but did not manage.)
It works now when I define the array of spreadsheet IDs manually:
var SheetList = ["17DCu1nyyX4a6zCkkT3RfBSfo-ghoc2fXEX8chlVMv5k", "1rRGQHs_JShPSBIGFCdG6AqXM967JFhdlfQ92cf5ISL8", "1pFDyXgYmvC5gnN5AU5xJ8vGiihwtubcbG2n4LPhPACQ", "1mK_X4Q7ysJQTt8NZoZASBE5zuUllPmmWSJsxu5Dnu9Y", "1FpjIGWTG5_6MMYJF72wvoiBRp_Xlt5BDpzvSZKcsU"]
And then, for reference, the loop:
SheetList.forEach(function(r) {
    var thisSpreadsheet = SpreadsheetApp.openById(r);
    var thisData = thisSpreadsheet.getSheetByName('Actions').getDataRange();
    var values = thisData.getValues();
    var toWorksheet = targetSpreadsheetID.getSheetByName(targetWorksheetName);
    var last = toWorksheet.getLastRow() + 1;
    var toRange = toWorksheet.getRange(last, 1, thisData.getNumRows(), thisData.getNumColumns());
    toRange.setValues(values);
})
Now I want to build the array "automatically" from the sheet 'List', where all the spreadsheets I want to loop over are listed in column C.
I tried several ideas, but they all failed.
The most promising attempts were:
var SheetList = targetSpreadsheetID.getSheetByName('List').getRange(2,3,lastRow-2,3).getValues()
And I also tried with the array-function:
var sheetList=Array.apply(targetSpreadsheetID.getSheetByName('List').getRange(2,3,lastRow-2,3))
but all without success.
It should normally be possible, in more or less a single line, to import the array from the spreadsheet into the Apps Script, shouldn't it?
I would very much appreciate it if someone could give me a hint as to where my mistake is.
Thank you very much.
Maria
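One detail that often causes this to fail is that getValues() always returns a two-dimensional array (one inner array per row), so the result has to be flattened before it can be used like the manually defined list. A minimal sketch (my example, not from the original post, assuming the IDs start in row 2 of column C of the 'List' sheet and that lastRow is the last populated row):

var listSheet = targetSpreadsheetID.getSheetByName('List');
var SheetList = listSheet
    .getRange(2, 3, lastRow - 1, 1)          // column C, rows 2..lastRow
    .getValues()                             // 2D array: [[id1], [id2], ...]
    .map(function(row) { return row[0]; });  // flatten to [id1, id2, ...]

Each element of SheetList can then be passed directly to SpreadsheetApp.openById() as in the forEach loop above.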
I still did not manage to build the array the way I initially wanted, but I found a workable solution with a for loop, which I want to share here in case someone is looking for a similar solution (and then at least finds my workaround ;) )
for (i = 2; i < lastRow; i++) {
    var SheetList = targetSpreadsheetID.getSheetByName('List').getRange(i, 3).getValues();
    Logger.log(SheetList);
    var thisSpreadsheet = SpreadsheetApp.openById(SheetList);
    ... // the rest identical to the loop above...
Don't hesitate to add your comments or advice anyhow, but I will mark the question as closed.
Thanks a lot.
Maria
I've written a CasperJS script that works very well except that it takes a (very very) long time to scrape pages.
In a nutshell, here's the pseudo code:
my functions to scrape the elements
my casper.start() to start the navigation and log in
casper.then() where I loop through an array and store my links
casper.thenOpen() to open each link and call my functions to scrape.
It works perfectly (and fast enough) for scraping a bunch of links. But when it comes to thousands (right now I'm running the script with an array of 100K links), the execution time is endless: the first 10K links were scraped in 3h54m10s and the following 10K in 2h18m27s.
I can partly explain the difference between the two 10K batches: the first includes looping over and storing the array with the 100K links; from that point on, the script only opens pages to scrape them. However, I noticed the array was ready to go after roughly 30 minutes, so that doesn't exactly explain the time gap.
I've placed my casper.thenOpen() in the for loop, hoping that after each new link is built and stored in the array, the scraping would happen. Now, I'm sure I've failed at this, but will it change anything in terms of performance?
That's the only lead I have in mind right now, and I'd be very thankful if anyone is willing to share their best practices for significantly reducing the script's running time (it shouldn't be hard!).
EDIT #1
Here's my code below:
var casper = require('casper').create();
var fs = require('fs');

// This array maintains a list of links to each HOL profile
// Example of a valid URL: https://myurl.com/list/74832
var root = 'https://myurl.com/list/';
var end = 0;
var limit = 100000;
var scrapedRows = [];

// Returns the selector element property if the selector exists but otherwise returns defaultValue
function querySelectorGet(selector, property, defaultValue) {
    var item = document.querySelector(selector);
    item = item ? item[property] : defaultValue;
    return item;
}

// Scraping function
function scrapDetails(querySelectorGet) {
    var info1 = querySelectorGet("div.classA h1", 'innerHTML', 'N/A').trim();
    var info2 = querySelectorGet("a.classB span", 'innerHTML', 'N/A').trim();
    var info3 = querySelectorGet("a.classC span", 'innerHTML', 'N/A').trim();

    // For scraping different texts of the same kind (i.e. comments from users)
    var commentsTags = document.querySelectorAll('div.classComments');
    var comments = Array.prototype.map.call(commentsTags, function(e) {
        return e.innerText;
    });

    // Return all the rest of the information as a JSON string
    return {
        info1: info1,
        info2: info2,
        info3: info3,
        // There is no fixed number of comments & answers so we join them with a semicolon
        comments: comments.join(' ; ')
    };
}

casper.start('http://myurl.com/login', function() {
    this.sendKeys('#username', 'username', {keepFocus: true});
    this.sendKeys('#password', 'password', {keepFocus: true});
    this.sendKeys('#password', casper.page.event.key.Enter, {keepFocus: true});
    // Logged in
    this.wait(3000, function() {
        // Verify connection by printing the welcome page's title
        this.echo('Opened main site titled: ' + this.getTitle());
    });
});

casper.then(function() {
    // Quick summary
    this.echo('# of links : ' + limit);
    this.echo('scraping links ...');
    for (var i = 0; i < limit; i++) {
        // Building the urls to visit
        var link = root + end;
        // Visiting pages...
        casper.thenOpen(link).then(function() {
            // We pass the querySelectorGet method to use it within the webpage context
            var row = this.evaluate(scrapDetails, querySelectorGet);
            scrapedRows.push(row);
            // Stats display
            this.echo('Scraped row ' + scrapedRows.length + ' of ' + limit);
        });
        end++;
    }
});

casper.then(function() {
    fs.write('infos.json', JSON.stringify(scrapedRows), 'w');
});

casper.run(function() {
    casper.exit();
});
At this point I probably have more questions than answers but let's try.
Is there a particular reason why you're using CasperJS and not, for example, curl? I can understand the need for CasperJS if you are going to scrape a site that uses JavaScript, or if you want to take screenshots. Otherwise I would probably use curl along with a scripting language like PHP or Python and take advantage of the built-in DOM parsing functions.
And you can of course use dedicated scraping tools like Scrapy. There are quite a few tools available.
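For illustration, a minimal sketch of the curl-plus-built-in-DOM-parsing approach in PHP (my example, not the answerer's code; the URL and the div.classA selector are taken from the question's script):

<?php
// Fetch one profile page with curl
$ch = curl_init('https://myurl.com/list/74832');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);

// Parse it with PHP's built-in DOM extension
$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings on imperfect HTML
$xpath = new DOMXPath($doc);

// Rough equivalent of querySelector("div.classA h1")
$info1 = trim($xpath->evaluate('string(//div[@class="classA"]/h1)'));
echo $info1, "\n";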
Then the 'obvious' question: do you really need to have arrays that large? What you are trying to achieve is not clear; I assume you will want to store the extracted links in a database or something. Isn't it possible to split the process into small batches?
One thing that should help is to allocate sufficient memory up front by declaring a fixed-size array, i.e.:
var theArray = new Array(1000);
Resizing the array constantly is bound to cause performance issues. Every time new items are added to the array, expensive memory allocation operations must take place in the background, and are repeated as the loop is being run.
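A minimal sketch of that idea applied to the loop above (my illustration, not part of the original answer; the placeholder row stands in for the result of this.evaluate(scrapDetails, querySelectorGet)):

var limit = 100000;
var scrapedRows = new Array(limit);   // allocated once, never resized

for (var i = 0; i < limit; i++) {
    var row = { index: i };           // placeholder for the scraped data
    scrapedRows[i] = row;             // assign by index instead of push()
}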
Since you are not showing any code, we cannot suggest meaningful improvements, just generalities.
I have created several object stores, let's say three, inside a defined and versioned IndexedDB schema.
I need to populate all of them. To do so, I created an object that stores both the name and the endpoint (where the data to populate from comes from).
Also, to avoid errors when trying to fill object stores that are already populated, I use the count() method to... count the keys inside the object store: if there are 0 keys it populates, otherwise it reads.
It works perfectly if I execute on a one-by-one basis, that is, instead of using a loop, declaring and executing each of the three object stores separately.
However, when invoking the function to populate each store inside a loop, I get the following error message for the last two object stores to be populated:
Failed to read the 'result' property from 'IDBRequest': The request
has not finished. at IDBRequest.counts.(anonymous function).onsuccess
Here is the code:
// object containing the object stores and endpoints.
const stores = [
    {osName: 'user-1', osEndPoint: '/api/1'},
    {osName: 'user-2', osEndPoint: '/api/2'},
    {osName: 'user-3', osEndPoint: '/api/3'}
];
// open db.
var request = indexedDB.open(DB_NAME, DB_VERSION);

// in order to dynamically create vars, instantiate three arrays.
var tx = [];
var counts = [];
var total = [];

// onsuccess callback.
request.onsuccess = function (e) {
    db = this.result;
    for (k in stores) {
        tx[k] = db.transaction(stores[k].osName).objectStore(stores[k].osName);
        counts[k] = tx[k].count();
        counts[k].onsuccess = function (e) {
            total[k] = e.target.result;
            // if the counting result equals 0, then populate by calling a function that does so.
            if (total[k] == 0) {
                fetchGet2(stores[k].osEndPoint, popTable, stores[k].osName);
            } else {
                readData(DB_NAME, DB_VERSION, stores[0].osName);
            }
        };
    } // closes for loop
}; // closes request.onsuccess.
The fetchGet2 function works well inside a loop, for example the loop used to create the object stores, and it has also been tested on a one-by-one basis.
It looks like an async issue, but I cannot figure out how to fix it. The goal is to populate the existing object stores dynamically, filling only the empty ones and skipping the ones that already have data.
Indeed, testing without the count but inside the loop works perfectly, as does the count without the loop.
At the moment, when logging counts[k], it only logs data for the last member of the object.
Thanks in advance. I'm coding in vanilla JS and I'm not interested in using any framework at all.
Yes, this looks like an issue with async. For loops iterate synchronously. Try writing a loop that does not advance the index until each request completes.
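A minimal sketch of that idea (my illustration, not the answerer's code; it reuses stores, db, fetchGet2, popTable, readData, DB_NAME and DB_VERSION from the question): process the stores one at a time and only move on to the next index from inside the onsuccess callback.

function processStore(index) {
    if (index >= stores.length) return; // all stores handled
    var store = stores[index];
    var countRequest = db.transaction(store.osName)
                         .objectStore(store.osName)
                         .count();
    countRequest.onsuccess = function (e) {
        if (e.target.result === 0) {
            fetchGet2(store.osEndPoint, popTable, store.osName);
        } else {
            readData(DB_NAME, DB_VERSION, store.osName);
        }
        processStore(index + 1); // advance only after this request has finished
    };
}

processStore(0);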
For example:
var imageViewArray:UIImageView = [imageView1,imageView2,imageView3]
I want to apply something like imageView.image = img or imageView.isUserInteractionEnabled = false to all the image views inside the array.
1. It's an array
First of all it's not
var imageViewArray:UIImageView
but
var imageViewArray:[UIImageView]
because you want an array of UIImageView right?
2. Naming conventions
Secondly, in Swift we don't name a variable after its type, so imageViewArray becomes imageViews.
3. map
Now, if you really hate for in and forEach, you can write
imageViews = imageViews.map { imageView in
imageView.isUserInteractionEnabled = true
return imageView
}
or as suggested by Bohdan Ivanov in the comments
imageViews.map { $0.isUserInteractionEnabled = true }
4. Wrap up
This answer shows you how to use the wrong construct (map) to do something that should be done with the right construct (for in).
That's the point of having several constructs: everything could be done with an IF THEN and a GOTO, but good code uses the construct that best fits the specific scenario.
So, the best solution for this scenario is absolutely for in or forEach:
imageViews.forEach { $0.isUserInteractionEnabled = true }
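For completeness, the equivalent for in loop (a trivial sketch) is:

for imageView in imageViews {
    imageView.isUserInteractionEnabled = true
}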
I have an array.
At some stage, I'm adding more data to it.
So we have:
$editable = someArrayGeneratingFunctionHere();
$points = preg_split('/,/',$this->data['Video']['points']);
Lovely. Now, the "points" array has a bunch of data that may or may not already be in the editable array.
What I want is to check whether the data is already in $editable, and add it if not.
I'd like to do this efficiently too.
So, I have this method:
private function associateWithRelatedBodyParts($editable, $keysAlreadyPresent, ...) {
    $point = getOtherPointsThatAreRelatedToThisPoint();
    if (!isset($keysAlreadyPresent[$point])) {
        insertDataIntoEditable();
    } // else the value is already here. Do not add it again!
    return $editable;
}
so the whole thing looks like this:
$editable = someArrayGeneratingFunctionHere();
$points = preg_split('/,/',$this->data['Video']['points']);
$valuesInEditable = ...
foreach ($points as $point) {
    $editable = $this->associateWithRelatedBodyParts($editable, $valuesInEditable, ...);
}
What a lot of setup! The whole point of all this is:
I want to flatten the original $editable array, because that way I can quickly test whether a point is in $editable; if it is not, I'll add it.
My current way to retrieve the valuesInEditable array is:
$valuesInEditable = Set::combine($editable, 'BodyPartsVideo.{n}.body_part_id','BodyPartsVideo.{n}.body_part_id');
This is moronic. I'm sticking the same value into the array twice. What I'd really like is just:
$valuesInEditable = Set::combine($editable, 'BodyPartsVideo.{n}.body_part_id',True);
or something like that. So the whole point of this question is:
how do I set a default value using Set::combine in CakePHP? If you have a better suggestion, I'd love to hear it.
Without seeing the structure of your array, it seems like you are going through a lot of work to combine the two arrays. Is there a reason you cannot use
$final_array = array_merge($points, $editable);
before running the set combine?
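If the goal is just a fast membership test, another option (a sketch on my part, assuming the array layout implied by the 'BodyPartsVideo.{n}.body_part_id' path) is to build the lookup with plain PHP and use the ids as keys, so no Set::combine default value is needed:

$valuesInEditable = array();
foreach ($editable['BodyPartsVideo'] as $row) {
    $valuesInEditable[$row['body_part_id']] = true; // the value is irrelevant, only the key matters
}

foreach ($points as $point) {
    if (!isset($valuesInEditable[$point])) {
        // $point is not in $editable yet, so add it here
    }
}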