I have built a Salesforce platform event client using CometD Java; it's similar to the EMP-Connector example provided by forcedotcom.
I installed this client on OpenShift, and my app runs with 2 pods. The problem I am facing is that, since there are two pods each running the same Docker image, both receive their own copy of every event. That means duplicate events.
Per my understanding, Salesforce platform events should behave like a Kafka subscription.
I am unable to find a solution for avoiding duplicate copies of events. Any suggestions here would be a great help.
Note: for now I have a client-side workaround that drops the duplicate copy of each event, which is not an optimal solution.
I have to run my app with at least 2 pods; that's a limit I have on my cloud.
This is expected / by design. In CometD, when a message is published on a broadcast channel, all subscribers listening on that channel receive a copy of the message. A broadcast channel behaves like a messaging topic, where one sender wants to send the same information to multiple recipients. There are other types of channels in CometD with different semantics, but broadcast channels with one-to-many message semantics are what you get with platform events available via CometD in Salesforce.
In your case you have multiple subscribers, so what you're seeing is expected. You can deduplicate the message stream on the client side, as you have done, or you can change your architecture so that you have a single subscription.
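If you stay with client-side deduplication, keying on the event's replay ID is the usual approach, since every event delivered over CometD carries one. A minimal sketch of a bounded seen-set (the class name and capacity are made up, not part of EMP-Connector; note that to deduplicate *across* the two pods, the seen-set would have to live in shared storage such as Redis rather than in-process):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

// Bounded cache of already-seen replay IDs; the oldest entries are
// evicted so memory stays constant under a continuous event stream.
public class ReplayIdDeduplicator {
    private final Set<Long> seen;

    public ReplayIdDeduplicator(final int capacity) {
        this.seen = Collections.newSetFromMap(new LinkedHashMap<Long, Boolean>() {
            @Override
            protected boolean removeEldestEntry(Map.Entry<Long, Boolean> eldest) {
                return size() > capacity;
            }
        });
    }

    /** Returns true the first time a replayId is seen, false for duplicates. */
    public synchronized boolean firstSeen(long replayId) {
        return seen.add(replayId);
    }
}
```

Each pod would still receive every event; the handler simply skips work when firstSeen returns false.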
Related
I want to implement a connected OAuth app in Salesforce which should trigger push events when certain entities change, for example when an opportunity is closed.
Zapier implemented something similar
https://zapier.com/apps/salesforce/integrations/webhook
I could not find what I need, which is a simple way to subscribe to entity changes using the OAuth client's token while passing a webhook endpoint. I have read about Apex callouts, the Streaming API, and outbound messages.
Yeah, we solved this exact problem at Fusebit and I can help you understand the process as well.
Typically speaking, here's what you need to do:
Create triggers on the Salesforce objects you want to get updates for
Upload an Apex class that sends an outgoing message to a pre-determined URL
Enable a Remote Site Setting for the domain you want to send the message to
Add secret verification (or another auth method) to prevent spamming of your external URL
If you're leveraging JavaScript, you can use the jsforce SDK and the Salesforce Tooling API to push the code into the Salesforce instance AFTER the auth flow has occurred, AND on Salesforce instances that have API access enabled (typically Enterprise and above, or Professional with the API enabled).
This will be helpful for you to look through: https://jamesward.com/2014/06/30/create-webhooks-on-salesforce-com/
FYI - Zapier's webhook implementation actually polls every 15 minutes rather than receiving real-time events.
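On the receiving end, the last step above (secret verification) can be as simple as checking a shared-secret header before processing the payload. A sketch using only the JDK's built-in HTTP server (the path, header name, and secret are assumptions you'd agree with your Apex callout code, not a Salesforce convention):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Minimal webhook receiver: rejects any request that does not carry
// the shared secret the Apex callout is configured to send.
public class WebhookReceiver {
    static final String SHARED_SECRET = "change-me"; // assumption: agreed out of band

    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/salesforce/events", exchange -> {
            String secret = exchange.getRequestHeaders().getFirst("X-Webhook-Secret");
            if (!SHARED_SECRET.equals(secret)) {
                exchange.sendResponseHeaders(401, -1); // unauthorized: drop spam
                exchange.close();
                return;
            }
            try (InputStream body = exchange.getRequestBody()) {
                byte[] payload = body.readAllBytes(); // body sent by the trigger
                // ... hand payload to your own processing here ...
            }
            byte[] ok = "ok".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, ok.length);
            exchange.getResponseBody().write(ok);
            exchange.close();
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws IOException {
        start(8080);
    }
}
```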
In which programming language?
For consuming outbound messages you just need to be able to accept an XML message and send back an "Ack" message to acknowledge receipt; otherwise SF will keep trying to resend it for 24 hours.
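The "Ack" is a small SOAP envelope whose shape is fixed by Salesforce's outbound messaging contract. A sketch of a helper that produces it (returning it with HTTP 200 and Content-Type text/xml is left to your HTTP layer):

```java
// Builds the SOAP acknowledgement Salesforce expects after delivering
// an outbound message; anything else causes redelivery for up to 24h.
public class OutboundMessageAck {
    public static String ackXml() {
        return "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
             + "<soapenv:Body>"
             + "<notificationsResponse xmlns=\"http://soap.sforce.com/2005/09/outbound\">"
             + "<Ack>true</Ack>"
             + "</notificationsResponse>"
             + "</soapenv:Body>"
             + "</soapenv:Envelope>";
    }
}
```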
For consuming platform events / Streaming API / Change Data Capture (CDC) you'll need to raise the event in SF (a platform event can be raised from code, Flow, or Process Builder; CDC happens automatically, you just tell it which objects it should track).
Then, in the client app, you'd need to log in to SF (SOAP or REST API) and subscribe to the channel (any library that supports CometD should be fine). Have you seen the "EMP Connector", mentioned for example in https://trailhead.salesforce.com/en/content/learn/modules/change-data-capture/subscribe-to-events?trail_id=architect-solutions-with-the-right-api ?
Picking the right messaging approach is an art; there's a free course that can help: https://trailhead.salesforce.com/en/content/learn/trails/architect-solutions-with-the-right-api
And a pretty awesome PDF if you want to study for certification: https://resources.docs.salesforce.com/sfdc/pdf/integration_patterns_and_practices.pdf
I have multiple microservices that would need to be notified of certain events. All these microservices are running on Cloud Run. Given that Cloud Run can scale down to 0 instances, I’ve assumed that the Push model is better for our situation.
The problem arises with needing to get the notification to multiple endpoints. For example, we have an OrderService, an InventoryService, and PaymentService. The latter two both need to listen for the “OrderPlaced” event. The question is, how can we push this message to both services without having to explicitly call it (e.g. with Task Queues) or creating dependencies within the OrderService? Is there a “Google Cloud” way of solving this issue?
I’ve thought about creating “pull” listeners, but the problem is that Cloud Run instances can scale to 0, which would effectively mean events are not received, right?
When a message is published to a topic, each subscription to that topic is made aware of the message and can push a copy of it to an application that wishes to process it. If we have multiple applications that wish to be notified when a message is published, we create multiple subscriptions to the same topic. Each subscription independently causes a copy of the published message to be pushed to its own registered application, in parallel.
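For the Cloud Run case above, the fan-out is just "one topic, one push subscription per interested service". A configuration sketch with gcloud (topic, subscription, and endpoint names here are made up):

```shell
# One topic for the event, one push subscription per interested service.
gcloud pubsub topics create order-placed

gcloud pubsub subscriptions create inventory-order-placed \
  --topic=order-placed \
  --push-endpoint=https://inventory-service-xyz.a.run.app/events/order-placed

gcloud pubsub subscriptions create payment-order-placed \
  --topic=order-placed \
  --push-endpoint=https://payment-service-xyz.a.run.app/events/order-placed
```

OrderService only publishes to the topic and never needs to know who subscribes, and because these are push subscriptions, Pub/Sub's HTTP delivery will spin a scaled-to-zero Cloud Run service back up.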
I have a microservice which subscribes to a topic in WebSphere MQ. The subscription is managed and durable. I explicitly set the subscription name so that it can be used to connect back to the queue after recovering from any microservice failure. The subscription works as expected.
But I may have to scale the microservice up and run multiple instances. In that case I will end up with multiple consumers of the same topic, and it fails with error 2429: MQRC_SUBSCRIPTION_IN_USE. I am not able to run more than one consumer against the topic subscription. Note: a message should be delivered to only one of the consumers.
Any thoughts?
IBM WebSphere MQ version: 7.5
I use the C client API to connect to MQ.
When using a subscriber, what you describe is only supported via the IBM MQ classes for JMS API. In v7.0 and later you can use cloned subscriptions (an IBM extension to the JMS spec); in addition, in MQ v8.0 and later you can alternatively use shared subscriptions, which are part of the JMS 2.0 spec. With these two options, multiple subscribers can be connected to the same subscription and only one of them will receive each published message.
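With the JMS 2.0 option on MQ v8.0+, the relevant call is JMSContext.createSharedDurableConsumer. A sketch, not runnable as-is (connection-factory setup is omitted, the topic string and subscription name are placeholders, and it requires the IBM MQ classes for JMS on the classpath):

```java
import javax.jms.ConnectionFactory;
import javax.jms.JMSConsumer;
import javax.jms.JMSContext;
import javax.jms.Topic;

public class SharedSubscriber {
    // Each instance of the micro-service runs this with the SAME
    // subscription name; MQ then delivers each publication to only
    // one of the connected consumers.
    public static void consume(ConnectionFactory factory) {
        try (JMSContext context = factory.createContext()) {
            Topic topic = context.createTopic("SOME/TOPIC");
            JMSConsumer consumer =
                context.createSharedDurableConsumer(topic, "XYZ");
            while (true) {
                String body = consumer.receiveBody(String.class); // blocks
                System.out.println("got: " + body);
            }
        }
    }
}
```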
UPDATE 20170710
According to APAR IV96489: XMS.NET DOESN'T ALLOW SHARED SUBSCRIPTIONS EVEN WHEN CLONESUP PROPERTY IS ENABLED, XMS.NET is also supposed to support cloned subscriptions, but due to a defect this will not be supported until 8.0.0.8 or 9.0.0.2, or if you request the IFIX for the APAR above.
You can accomplish something similar with other APIs, such as C, by converting your microservice to get messages from a queue instead of subscribing to a topic.
To get the published messages to the queue you have two options:
Set up an administrative subscription on the queue manager. You can do this a few different ways; the example below uses an MQSC command.
DEFINE SUB('XYZ') TOPICSTR('SOME/TOPIC') DEST(SOME.QUEUE)
Create a utility app that can open a queue and create a durable subscription with that provided queue. The only purpose of this app would be to subscribe and unsubscribe a provided queue; it would not consume any of the published messages.
Using the above method, each published message can only be read (GET) from the queue by one process or thread.
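That "each message is read by exactly one consumer" behaviour can be illustrated in miniature with a plain in-memory queue and two competing consumer threads. This is just an analogy for competing MQGETs on one queue, not MQ code:

```java
import java.util.Map;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;

public class CompetingConsumersDemo {
    // Drains the queue with two competing consumer threads and records
    // how many times each message was delivered.
    public static Map<String, Integer> run() throws InterruptedException {
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(100);
        for (int i = 0; i < 10; i++) {
            queue.put("msg-" + i);
        }
        Map<String, Integer> deliveries = new ConcurrentHashMap<>();
        Runnable consumer = () -> {
            String msg;
            while ((msg = queue.poll()) != null) {
                deliveries.merge(msg, 1, Integer::sum); // count each delivery
            }
        };
        Thread a = new Thread(consumer);
        Thread b = new Thread(consumer);
        a.start(); b.start();
        a.join(); b.join();
        return deliveries; // every message appears exactly once
    }
}
```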
I have data in Salesforce and run another application that works with the same data. The current workflow is that when data is entered into the custom application, it sends the information to Salesforce via SOAP. I want to establish the reverse link; when a value is changed on the Salesforce side, I want Salesforce to ping my application with the changes. Does Salesforce have a feature to do this? Something equivalent to a trigger maybe?
My current solution is mindless iteration through all Salesforce records. This is slow, hits the API limit often, and keeps data stale too long.
You can do this using the Streaming API.
Introduction:
Use Streaming API to receive notifications for changes to Salesforce data. Use it to push relevant data in real time, instead of having to refresh the screen to get new information.
Protocols used for the connection:
The Bayeux protocol and CometD both use long polling. Bayeux is a protocol for transporting asynchronous messages, primarily over HTTP. CometD is a scalable HTTP-based event routing bus that uses an AJAX push technology pattern known as Comet; it implements the Bayeux protocol. The Salesforce servers use version 2.0 of CometD.
How it Works:
Create a PushTopic based on a SOQL query. This defines the channel. (PushTopic is a standard object.)
Clients subscribe to the channel.
A record is created, updated, deleted, or undeleted (an event occurs). The changes to that record are evaluated.
If the record changes match the criteria of the PushTopic query, the server generates a notification and the subscribed clients receive it.
Please check this link : http://www.salesforce.com/developer/docs/api_streaming/
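Since PushTopic is a standard object, defining the channel is just inserting a record, e.g. by POSTing JSON to /services/data/vXX.X/sobjects/PushTopic via the REST API. A sketch of a helper that builds such a body (the topic name, SOQL query, and API version are illustrative, and the HTTP call itself is omitted):

```java
public class PushTopicDefinition {
    // Builds the JSON body you would POST to
    // /services/data/vXX.X/sobjects/PushTopic; inserting it defines
    // the streaming channel "/topic/<name>".
    public static String json(String name, String soql) {
        return "{"
             + "\"Name\":\"" + name + "\","
             + "\"Query\":\"" + soql + "\","
             + "\"ApiVersion\":36.0,"
             + "\"NotifyForOperationCreate\":true,"
             + "\"NotifyForOperationUpdate\":true,"
             + "\"NotifyForFields\":\"Referenced\""
             + "}";
    }

    public static void main(String[] args) {
        System.out.println(json("OpportunityUpdates",
            "SELECT Id, Name, StageName FROM Opportunity"));
    }
}
```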
I created a working Google Channel API app and now I would like to send a message to all clients.
I have two servlets. The first creates the channel and tells the clients the user ID and token. The second is called via an HTTP POST and should send the message.
To send a message to a client, I use:
channelService.sendMessage(new ChannelMessage(channelUserId, "This is a server message!"));
This sends the message to just one client. How can I send it to all?
Do I have to store every ID I use to create a channel and then send the message for every ID? How can I pass the IDs to the second servlet?
Using the Channel API it is not possible to create one channel and then have many subscribers to it. The server creates a unique channel for each individual JavaScript client, so if you have the same client ID the messages will be received by only one of them.
If you want to send the same message to multiple clients, in short, you will have to keep a track of active clients and send the same message to all of them.
If that approach sounds scary and messy, consider using PubNub for your push notification messages, where you can easily create one channel and have many subscribers. To make it run on Google App Engine is not that hard, since they support almost any platform or device.
I know this is an old question, but I just finished an open source project that uses the Channel API to implement a publish/subscribe model, i.e. you can have multiple users subscribe to a single topic, and then all those subscribers will be notified when anyone publishes a message to the topic. It also has some nice features like automatic message persistence if desired, and "return receipts", where a subscriber can be notified whenever OTHER subscribers receive that message. See https://github.com/adevine/gaewebpubsub#gae-web-pubsub. Licensed under Apache 2.0 license.