I have a GCP Cloud Function publishing events to an output Pub/Sub topic. If the output topic is pre-created, all is fine.
However, I'd like the Cloud Function to auto-create the topic if it does not already exist. Is that possible?
I have this code for publishing:
import json
from google.cloud import pubsub_v1

# Publish the event as JSON to the fully qualified topic path.
publisher = pubsub_v1.PublisherClient()
data = json.dumps(event).encode("utf-8")
topic_path = publisher.topic_path(project, topic)
publisher.publish(topic_path, data)
I know I could add a call to create the topic first:
topic = publisher.create_topic(request={"name": topic_path})
and catch an exception if it tries to create an existing topic,
but this feels more like a dirty hack to me, not to mention the unnecessary call to the publisher API and the wasted cycles spent handling the call and exceptions on every function invocation.
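For reference, the exception-handling variant I'm describing would look roughly like this (a sketch; it assumes google.api_core's AlreadyExists exception, which create_topic raises when the topic already exists):

import json
from google.api_core.exceptions import AlreadyExists
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project, topic)

# Create the topic on first use; swallow the error if it already exists.
try:
    publisher.create_topic(request={"name": topic_path})
except AlreadyExists:
    pass

publisher.publish(topic_path, json.dumps(event).encode("utf-8"))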
Thank you!
I've created a Logic App which has an HTTP POST trigger containing a JSON payload with the name of the topic as a property. I have a subsequent step to create the topic and subscription, but it fails with the message:
Service Bus messaging entity 'Topic1' or namespace '.servicebus.windows.net' not found.
I'm confused: of course the topic wasn't found; I want to create it! 'Topic1', by the way, is the property being passed in the HTTP request, and it is being evaluated correctly. So what am I doing wrong?
The action is "Create a topic subscription", not "Create a topic". There doesn't appear to be an action for creating a topic, and creating a subscription to a topic assumes the topic exists. While I've never used it this way, topic subscriptions can be ephemeral, relating to one particular instance of a business process and destroyed when it completes; topics themselves, however, are intended to be longer-lived and part of your application architecture, not ephemeral objects. So this seems logical. You might consider calling the Azure management REST API from the Logic App using the HTTP action; otherwise, create the topic through the portal or the Azure CLI.
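For illustration, the management REST call to create a topic looks roughly like this (a sketch; the subscription, resource group, and namespace values are placeholders, and you would need an Azure AD bearer token with rights on the namespace):

import requests

subscription_id = "<subscription-id>"   # placeholder
resource_group = "<resource-group>"     # placeholder
namespace = "<namespace>"               # placeholder
topic_name = "Topic1"
token = "<azure-ad-bearer-token>"       # placeholder; obtain via Azure AD

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.ServiceBus"
    f"/namespaces/{namespace}/topics/{topic_name}"
    "?api-version=2017-04-01"
)

# PUT creates the topic, and is effectively idempotent if it already exists.
resp = requests.put(url, json={"properties": {}},
                    headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()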
We use Spring Cloud Stream (version 3.0.7) StreamListener to consume from Google Cloud Pub/Sub subscription 'A.SUB' on topic 'A'.
We have a requirement to pause consumption from Pub/Sub. I see the options below, in order of preference, but I don't have an exact idea of how to achieve options 1 and 2. Can someone please share thoughts on these?
1. Add another Pub/Sub topic 'B' and publish a 'Pause' event message to pause, or a 'Resume' event message to resume, somehow stopping/starting the poller on subscription 'A.SUB' on seeing 'pause'/'resume'. Is there any way to achieve this?
2. Pause the subscription based on a time window, say between 12AM and 6AM. Is there a way to specify a cron expression for this?
3. Consume messages from 'A.SUB' and send a nack between 12AM and 6AM:
@StreamListener("A.SUB")
public void consume(Message<?> message) { }
Note: StreamListener and the entire annotation-based configuration model have been deprecated. We've fully migrated to the functional programming model, which is much simpler.
With regard to pausing, you can accomplish it with the actuator and the binding endpoints (e.g., stop, start, pause, resume) exposed by s-c-stream. You can get more info here.
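As a sketch of what that looks like (assuming the actuator bindings endpoint is exposed on localhost:8080; the binding name below is hypothetical, so check /actuator/bindings for the real one):

import requests

BASE = "http://localhost:8080/actuator/bindings"  # assumes actuator is exposed
BINDING = "consume-in-0"                          # hypothetical binding name

def set_binding_state(state: str) -> None:
    # Spring Cloud Stream accepts states such as STOPPED, STARTED, PAUSED, RESUMED.
    resp = requests.post(f"{BASE}/{BINDING}", json={"state": state})
    resp.raise_for_status()

set_binding_state("STOPPED")  # e.g., invoked by a scheduler at 12AM
set_binding_state("STARTED")  # and again at 6AM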
I am looking for a way to schedule Cloud Functions for Firebase, or in other words, trigger them at a specific time.
Update 2019-04-18
There is now a very simple way to deploy scheduled code on Cloud Functions through Firebase.
You can either use a simple text syntax:
export const scheduledFunctionPlainEnglish =
  functions.pubsub.schedule('every 5 minutes').onRun((context) => {
    console.log('This will be run every 5 minutes!');
  });
Or the more flexible cron table format:
export const scheduledFunctionCrontab =
  functions.pubsub.schedule('5 11 * * *').onRun((context) => {
    console.log('This will be run every day at 11:05 AM UTC!');
  });
To learn more about this, see:
The Scheduling Cloud Functions for Firebase blog post introducing the feature.
The documentation on scheduled functions.
Note that your project needs to be on a Blaze plan for this to work, so I'm leaving the alternative options below for reference.
If you want to schedule a single invocation of a Cloud Function on a delay from within the execution of another trigger, you can use Cloud Tasks to set that up. Read this article for an extended example of how that can work.
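A minimal sketch of that idea (assuming a queue named my-queue in us-central1 and a target HTTP function URL; all names here are placeholders):

import datetime
from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")  # placeholders

# Schedule an HTTP task to run 10 minutes from now.
schedule_time = timestamp_pb2.Timestamp()
schedule_time.FromDatetime(datetime.datetime.utcnow() + datetime.timedelta(minutes=10))

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": "https://us-central1-my-project.cloudfunctions.net/myFunction",  # placeholder
        "body": b'{"hello": "world"}',
        "headers": {"Content-Type": "application/json"},
    },
    "schedule_time": schedule_time,
}

client.create_task(request={"parent": parent, "task": task})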
Original answer below...
There is no built-in runat/cron type trigger yet.
For the moment, the best option is to use an external service to trigger an HTTP function periodically. See this sample in the functions-samples repo for more information. Or use the recently introduced Google Cloud Scheduler to trigger Cloud Functions through Pub/Sub or HTTPS.
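With the Pub/Sub route, Cloud Scheduler publishes a message to a topic on your schedule and a function subscribed to that topic runs. A minimal sketch of such a function (1st-gen Python background-function signature; the payload handling is illustrative):

import base64

def scheduled_job(event, context):
    # Cloud Scheduler publishes to the topic; Pub/Sub delivers the message here.
    payload = base64.b64decode(event["data"]).decode("utf-8") if "data" in event else ""
    print(f"Triggered by Cloud Scheduler with payload: {payload!r}")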
I also highly recommend reading this post on the Firebase blog: How to Schedule (Cron) Jobs with Cloud Functions for Firebase and this video: Timing Cloud Functions for Firebase using an HTTP Trigger and Cron.
That last link uses cron-job.org to trigger Cloud Functions and works for projects that are on a free plan. Note that this allows anyone to call your function without authorization, so you may want to include some abuse-protection mechanism in the code itself.
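One simple form of such protection (a sketch; the header name and environment variable are my own choices, not anything the scheduler requires) is to demand a shared secret on every request:

import os

def cron_handler(request):
    # Reject callers that don't present the shared secret configured on the function.
    if request.headers.get("X-Cron-Secret") != os.environ.get("CRON_SECRET"):
        return ("Unauthorized", 403)
    # ... do the scheduled work ...
    return ("OK", 200)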
What you can do is spin up an App Engine instance that is triggered by a cron job and emits to Pub/Sub. I wrote a blog post specifically on that; you might want to take a look:
https://mhaligowski.github.io/blog/2017/05/25/scheduled-cloud-function-execution.html
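The shape of that approach, roughly (a sketch; the project, topic, and route names are placeholders, and cron.yaml would point at the /publish route):

from flask import Flask
from google.cloud import pubsub_v1

app = Flask(__name__)
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "scheduled-events")  # placeholders

@app.route("/publish")
def publish():
    # App Engine cron hits this route on schedule; the function subscribes to the topic.
    publisher.publish(topic_path, b"tick")
    return "OK"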
It is important to first note that, according to the documentation, the default timezone your functions execute in is America/Los_Angeles. You can find a list of timezones here if you'd like to trigger your function(s) in a different timezone.
NB: Here's a useful website to assist with cron table formats.
Here's how you'd go about it:
(Assuming you'd like to use Africa/Johannesburg as your timezone)
export const executeFunction = functions.pubsub.schedule("10 23 * * *")
  .timeZone("Africa/Johannesburg")
  .onRun(() => {
    console.log("successfully executed at 23:10 Johannesburg Time!!");
  });
Otherwise if you'd rather stick to the default:
export const executeFunction = functions.pubsub.schedule("10 23 * * *")
  .onRun(() => {
    console.log("successfully executed at 23:10 Los Angeles Time!!");
  });
I have been struggling with a problem in Google App Engine, using Java, for several days.
Many times (about 50% of the time) when I try to open a connection to a Cloud SQL instance, the connection returns a null value, resulting in several NullPointerException messages when trying to run Cloud SQL queries (when invoking .prepareCall(stored_proc)).
I have the latest App Engine Java SDK, in a project service, shared with other services built in Python which consume this Java backend.
Could it be possible that after a certain time the instance(s) crash (I am just testing at this point, so I am using default scaling)?
This is the code that returns null:
// Load Google's Cloud SQL MySQL driver and open a connection.
Class.forName("com.mysql.jdbc.GoogleDriver");
url = "jdbc:google:mysql://project:instance/database?user=root";
log.info(url);
return DriverManager.getConnection(url);
This is part of my configuration file:
<application>app</application>
<module>mod</module>
<version>1</version>
<threadsafe>true</threadsafe>
<use-google-connector-j>true</use-google-connector-j>
I tried several suggestions from other posts, but with no success at all.
Any suggestion will be welcome, thanks in advance.
I was facing the same problem while using Google Cloud SQL and App Engine.
I solved the problem by managing the connection pool myself. I realised that when you request a new connection for each request and close it on completion of the thread, other requests can get back a null connection, resulting in a NullPointerException.
I decided to do the following, and it has worked for me for about two years now:
Open a connection and keep it in a static class that holds a number of connections.
Every time I want a connection to the database, I first check whether there is an available connection for me to use.
In case a query kills a connection, I request an extra connection to cover for connection drops.
I will add this as an answer, since it is not exactly what Chrispinus mentioned, although he gave me a good idea for the solution.
I went deeper into the code and found that some of the methods were not closing the database connection. I had assumed all of them were doing that, but looking at each method, I found I was wrong.
So, although it sounds obvious, check that connections are being closed (or managed, as Chrispinus says) properly.
We have been using push queues for a very long time and have no problems consuming tasks from the dev server.
However, while implementing a new service with a pull queue, it became difficult to figure out how to do the same thing on the dev server.
Basically, from the docs, what we can see is that you should use the REST API (we can't use the direct queue API, as the queue is consumed by an external app) to lease/delete a task, with the endpoint
https://www.googleapis.com/taskqueue/v1beta1/projects/taskqueues
But obviously this will not work on the local dev server, and it appears no documentation talks about this.
Just wondering if anyone has ever run into the same issue and can shed some light?
With a pull queue, the task consumer can be internal or external.
If you need it to work on the dev server, then just create a handler (a servlet) and use the internal API to add, lease and delete tasks.
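For illustration, the internal-API flow in the App Engine standard environment looks roughly like this in Python (a sketch; the queue name is a placeholder, and a Java servlet would use the same Queue operations):

from google.appengine.api import taskqueue

queue = taskqueue.Queue("my-pull-queue")  # placeholder queue name from queue.yaml

# Producer side: add a task with method="PULL" so it waits to be leased.
queue.add(taskqueue.Task(payload="hello", method="PULL"))

# Consumer side (e.g., inside a handler): lease, process, then delete.
tasks = queue.lease_tasks(lease_seconds=3600, max_tasks=100)
for task in tasks:
    print(task.payload)  # stand-in for real processing
queue.delete_tasks(tasks)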