How to add initial or default data in App Engine - google-app-engine

Hey guys, kind of a n00b in App Engine here and I have been struggling with this: is there a way that I can add/bulk-load default data into the Datastore?
I would like to create catalogs or example data, as well as users and permissions. I am not using the default App Engine users API; instead I am using the webapp2 session-based User auth model.
Thanks

You can use the bulkloader: https://developers.google.com/appengine/docs/python/tools/uploadingdata
Or upload data to the blobstore and move it to the datastore.
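For reference, a bulkloader upload is driven by appcfg.py; once you have a bulkloader.yaml config it looks roughly like this (file name, kind and app id below are placeholders):
appcfg.py upload_data --config_file=bulkloader.yaml --filename=catalogs.csv --kind=Catalog --url=http://your-app-id.appspot.com/_ah/remote_api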

This is a large topic, but I am using Java code running in task queues to do this.
Much easier to create random test and demo data through code.
Much more friendly to unit testing.
This requires no dependencies. It is just code running and accessing the datastore.
Sometimes easier to manipulate the datastore through code instead of scripts when logic is involved in the changes.
Allows us to upload new task definitions (Java classes) embedded in a new app version. Then we trigger the task executions by calling a servlet URL. These task classes are then removed from the next app version.
And using tasks, you get around the request execution timeout. If a task is long running, we split it into sequential tasks. When a task completes, it queues the next one automatically.
Of course, this requires a fair amount of coding but is really simple and flexible at the same time.
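The points above are about Java, but the same pattern is easy to sketch in Python on App Engine standard with webapp2, ndb and the task queue API; the handler URL, entity kind and batch sizes below are made up for illustration:
import webapp2
from google.appengine.api import taskqueue
from google.appengine.ext import ndb

class Catalog(ndb.Model):
    name = ndb.StringProperty()

class SeedCatalogsHandler(webapp2.RequestHandler):
    # Triggered by a task queue push (or a one-off POST to start the chain).
    def post(self):
        batch = int(self.request.get('batch', 0))
        # Insert a small batch of demo entities per task run.
        ndb.put_multi([Catalog(name='Demo catalog %d-%d' % (batch, i))
                       for i in range(100)])
        # Chain the next batch so no single request hits the deadline.
        if batch < 9:
            taskqueue.add(url='/tasks/seed_catalogs',
                          params={'batch': batch + 1})

app = webapp2.WSGIApplication([('/tasks/seed_catalogs', SeedCatalogsHandler)])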

Related

Script Hosting (API to Database)

I am currently creating a React Native + Expo application in which essentially every page makes an API call, which adds up to a lot of API calls. I also have this app connected to Firebase for other information. The thing is, most of these pages don't update more than once or twice a day, so I really don't want the end user to be calling the API that much either.
My question is: is there a way to write and host a script that runs continuously, calls this API once every hour (or so), and writes the results to the Firebase DB, so that I only need to pull from the database instead of having each user individually make dozens of API calls?
Please let me know! I have spent days on Google and am no closer than I was before. I'm also willing to change my setup from Firebase if it is not possible to accomplish this that way. Thanks!
You can use a Cloud Functions scheduled trigger to run code periodically that can make changes to your database.
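For Firebase the usual route is a Node.js scheduled function (functions.pubsub.schedule), but here's a rough Python sketch of the same idea using Cloud Scheduler plus a Pub/Sub-triggered Cloud Function that writes to Firestore; the API URL and collection names are placeholders, and it assumes Firestore rather than the Realtime Database:
import requests
from google.cloud import firestore

db = firestore.Client()

def refresh_cache(event, context):
    # Cloud Scheduler publishes to a Pub/Sub topic every hour,
    # which triggers this function.
    data = requests.get('https://example.com/external-api').json()  # placeholder URL
    # Assumes the API returns a JSON object; wrap lists in a dict if needed.
    # Overwrite the cached copy that the app reads instead of the API.
    db.collection('api_cache').document('latest').set(data)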

How to programmatically scale up app engine?

I have an application which uses App Engine auto scaling. It usually runs 0 instances, except when some authorised users use it.
This application needs to run automated voice calls as fast as possible on thousands of people, with keypad interactions (no, it's not spam, it's redcall!).
Programmatically speaking, we ask Twilio to initiate calls through its Voice API 5 times/sec, and it basically works through webhooks: at least 2, but most of the time 4 hits per call. So GAE needs to scale up very quickly, and some requests get lost (which is just a hang-up on the user side) at the beginning of the trigger, when only one instance is ready.
I would like to know if it is possible to programmatically scale up App Engine (through an API?) before running such triggers, in order to be ready when the storm hits?
I think you may want to give warmup requests a try, as they load your app's code into a new instance before any live requests reach that instance, thus reducing response times after your GAE app has scaled down to zero.
The link I have shared with you includes the PHP7 runtime, as I see you are familiar with it.
I would also like to agree with John Hanley: finding a sweet spot for how many idle instances you keep available would also help the performance of your app.
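The redcall app is PHP/Symfony, but just to illustrate the wiring, warmup requests boil down to enabling them in app.yaml and answering GET /_ah/warmup; a rough Python sketch (assuming the Python standard environment) looks like this:
# app.yaml (excerpt):
# inbound_services:
# - warmup
# automatic_scaling:
#   min_idle_instances: 1   # the "sweet spot" of idle instances mentioned above

import webapp2

class WarmupHandler(webapp2.RequestHandler):
    # App Engine sends GET /_ah/warmup to a new instance before routing
    # live traffic to it; pre-load caches, connections and config here.
    def get(self):
        self.response.write('warmed up')

app = webapp2.WSGIApplication([('/_ah/warmup', WarmupHandler)])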
Finally, the solution was to delegate sending the communication through Cloud Tasks:
https://cloud.google.com/tasks/docs/creating-appengine-tasks
https://github.com/redcall-io/app/blob/master/symfony/src/Communication/Processor/QueueProcessor.php
Tasks can be retried against App Engine in case of errors, and they make App Engine spin up new instances when the surge comes.
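The linked processor is PHP/Symfony; purely as an illustration, creating an equivalent App Engine task with the Python google-cloud-tasks client looks roughly like this (project, region, queue name, handler path and payload are placeholders):
from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path('my-project', 'europe-west1', 'calls-queue')  # placeholders

task = {
    'app_engine_http_request': {
        'http_method': tasks_v2.HttpMethod.POST,
        'relative_uri': '/twilio/trigger-call',   # your App Engine handler
        'headers': {'Content-Type': 'application/json'},
        'body': b'{"phone": "+33600000000"}',     # placeholder payload
    }
}
# Cloud Tasks retries failed tasks and drives App Engine scaling as the
# queue drains.
client.create_task(parent=parent, task=task)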

Load testing a Google App Engine Application using JMeter

I've created an application and I'd like to test how well it scales to large numbers of users.
To run my application a user has to go to the homepage, sign in to a Google account, click a button and then upload a video file.
First of all, is this possible to emulate using JMeter? I'm signed into my Google account locally but am not sure whether simulated users will have access to it?
Secondly, I've recorded a session in JMeter doing the actions above and have run the test with 10 simulated users, however, the App Engine dashboard doesn't detect any activity. I've followed the steps mentioned here but obviously with details of my application etc.
Here's a screenshot of the summary report.
Is there anything obvious I might be doing wrong? Am I using JMeter in the correct way to test the application as desired?
Apologies for my JMeter inexperience.
This is not something you will be able to simply record and replay; my expectation is that your application is protected by OAuth, so you will need a valid token in order to execute your calls.
Not knowing the details of your application implementation, it's quite hard to guess what went wrong. I would recommend:
Running your test with 1 user and 1 loop first to ensure that it's doing what it is supposed to do, by adding a View Results Tree listener and inspecting the request and response details for each sampler (especially the failed ones).
Once you figure out what's wrong with a particular request, amend the JMeter configuration so it succeeds. Repeat until you're happy with the test end-to-end.
Add load only after that, and be careful, as the test might be sensitive to extra users/loops, especially if you're using a single login account (which is not recommended).
References:
How to Handle Correlation in JMeter
How to Run Performance Tests on OAuth Secured Apps with JMeter
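To make the OAuth point concrete: whatever JMeter replays has to carry a valid token, i.e. each simulated request must look roughly like this (endpoint, file name and token are placeholders, shown here in Python rather than JMeter config):
import requests

token = 'ya29....'  # placeholder; obtained out-of-band (service account, refresh token, etc.)
resp = requests.post(
    'https://your-app-id.appspot.com/upload',          # placeholder endpoint
    headers={'Authorization': 'Bearer %s' % token},
    files={'video': open('sample.mp4', 'rb')})          # placeholder file
print(resp.status_code)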

Long-running script on Google App Engine

I'm attempting to create a microservice on Google App Engine that is not intended to handle HTTP requests.
Instead, I was hoping to have a continuously running Python script that monitors a remote queue--RabbitMQ, to be precise--and sends out an API call to another service as tasks are pushed to the queue.
I was wondering, firstly, is it possible to run a script upon deployment--one that did not originate with a user action/request?
Secondly, how would I accomplish this?
Thanks in advance for your time!
You can deploy your "script" as a manually scaled module -- see https://cloud.google.com/appengine/docs/python/modules/ -- with exactly one instance. As the docs say, "When you start a manual scaling instance, App Engine immediately sends a /_ah/start request to each instance"; so, just set that module's handler for /_ah/start to the handler you want to run (in the module's yaml file and the WSGI app in the Python code, using whatever lightweight framework you like -- webapp2, falcon, flask, bottle, or whatever else... the framework won't be doing much for you in this case save the one-off routing).
Note that the number of free machine hours for manual scaling modules is limited to 8 hours per day (for the smaller, B1 instance class; proportionally fewer for larger instance classes), so you may need to upgrade to paid-app status if you need to run for more than 8 hours.
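A minimal sketch of that wiring (module name, yaml excerpt and the queue-processing helper are placeholders; the real loop would use a RabbitMQ client such as pika):
# worker.yaml (excerpt) for the manually scaled module:
# module: worker
# manual_scaling:
#   instances: 1
# handlers:
# - url: /_ah/start
#   script: worker.app

import webapp2

def process_queue_forever():
    # Placeholder for your real loop: connect to RabbitMQ, pull messages,
    # and call the downstream API for each one.
    pass

class StartHandler(webapp2.RequestHandler):
    # App Engine sends /_ah/start to a manual-scaling instance as soon as it
    # starts, so this is where the long-running work kicks off.
    def get(self):
        process_queue_forever()

app = webapp2.WSGIApplication([('/_ah/start', StartHandler)])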
Like @brant said, App Engine is designed to handle HTTP requests. It's not a perfect fit for background jobs, unless you wrap your logic into one HTTP request.
Further, App Engine will emit an error when the response times out, depending on your scaling settings. If you want to try it, consider basic or manual scaling.
For this type of workload, I would suggest you use a VM.
I think there are a few problems with this design.
First, App Engine is designed to be an HTTP request processor, not a RabbitMQ message processor. GAE is intended for many small requests, not one long-running process.
Second, "RabbitMQ should not be exposed to the public internet, it wasn't created for such use case."
I would recommend that you keep the RabbitMQ clients on the same internal network as the RabbitMQ broker, and have the clients send HTTP requests to App Engine.
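A rough sketch of that arrangement, using pika as the RabbitMQ client on a machine inside the broker's network (broker host, queue name and App Engine URL are placeholders):
import pika
import requests

def on_message(channel, method, properties, body):
    # Forward each queue message to the App Engine service as an HTTP request.
    requests.post('https://your-app-id.appspot.com/api/handle-message', data=body)
    channel.basic_ack(delivery_tag=method.delivery_tag)

connection = pika.BlockingConnection(pika.ConnectionParameters(host='rabbitmq.internal'))
channel = connection.channel()
channel.basic_consume(queue='work-queue', on_message_callback=on_message)
channel.start_consuming()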

Apex only for force.com hosted apps?

Is Apex only permitted on “native” applications that are hosted on force.com?
Or is Apex also available for external applications to hit the “Open APIs” such as REST API and Bulk API?
I think part of my confusion lies in how the term “REST API” is used in various documents. In other parts of the software world, REST usually means an HTTP-based protocol to exchange data across different domains (with certain formats, etc.). However, I think “REST API” in Salesforce might SOMETIMES refer to an optional means for native apps to retrieve Salesforce data from within force.com. Is that correct?
Not sure I understand your question...
Apex can be used "internally" in:
database triggers,
classes
Visualforce controllers that follow MVC pattern,
logic that parses incoming emails and for example makes Case or Lead records out of them,
asynchronous jobs that can be scheduled to recalculate some important stuff every night
and you can have utility classes for code reuse across these
A "kind of internal" would be to use the "Execute Anonymous" mechanism that lets you fire one-off code snippets against environment. Useful for prototyping of new classes, data fixes etc. You can do it for example in Eclipse IDE or the Developer Console (upper right corner next to your name).
And last but not least - "external" usage.
Apex code can be exposed as a webservice and called by PHP, .NET, Java, even JavaScript applications. It's a good choice when:
you want to reuse the same piece of logic, for example on your own Visualforce page as well as in some mobile application that would be passing a couple of strings around or a simple JSON object
it beats having to reimplement the logic in every new app and maintain it afterwards
imagine inserting an Account and a Contact in one go - your mobile device would have to implement some transaction control and delete the Account if the Contact failed to insert. Messy. And it would waste more API calls (insert acc, insert con, ooops, delete acc). With a method exposed as a webservice you can accept both parameters in your Apex code, do your magic and, well, if it fails - it's all in one transaction, so SF will roll it back for you.
There are 2 main methods:
SOAP API primarily uses global methods marked with the webservice keyword. The easiest way for other applications to start calling these is to extract the so-called "enterprise WSDL" file from SF and "consume" it. It's a giant XML file that can be parsed in your .NET app to generate stub classes, so you can keep writing code you're familiar with. These generated classes will construct the XML message for you, send it, process the response (throwing your own exceptions if SF sent an error message) and so on.
Very simple example:
global class MyWebService {
    webService static Id makeContact(String lastName, Account a) {
        Contact c = new Contact(LastName = lastName, AccountId = a.Id);
        insert c;
        return c.Id;
    }
}
REST API allows you to do similar things, but you need to use the correct HTTP verbs ("PUT" is best for inserts, "PATCH" for updates, "DELETE" for deletes, and so on).
You can read more about them in the REST API guide: http://www.salesforce.com/us/developer/docs/apexcode/index_Left.htm#CSHID=apex_rest_methods.htm|StartTopic=Content%2Fapex_rest_methods.htm|SkinName=webhelp
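For example, if a similar method were exposed via Apex REST (a hypothetical @RestResource class with urlMapping='/makeContact'), an external Python client could call it roughly like this (instance URL, access token and IDs are placeholders):
import requests

resp = requests.post(
    'https://yourInstance.my.salesforce.com/services/apexrest/makeContact',  # hypothetical endpoint
    headers={'Authorization': 'Bearer <access_token>',   # placeholder OAuth token
             'Content-Type': 'application/json'},
    json={'lastName': 'Weissman', 'accountId': '001...'})  # placeholder Account Id
print(resp.status_code, resp.text)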
