We have a formula field against contact records in Salesforce that needs to be different based on whether the record is being viewed in the Sales Cloud or the Service Cloud Console.
Is there a way to detect whether you are running under Sales Cloud or Service Cloud so the formula can be substituted?
I'm quite sure you cannot do this using a formula field.
You can do this in custom code/controller by checking the URL parameter named isdtp; if you are in the console it will be set to 'vw' or 'nv'. In an Apex controller, for example, ApexPages.currentPage().getParameters().get('isdtp') returns that value.
Hope this helps.
I have SCIM provisioning set up and connected to Azure AD using a custom application that isn't in the marketplace. Provisioning new users and changing data on existing ones appears to work fine. But when I delete data from a previously synced field, I don't see any change to remove this data in the SCIM application.
I've tried any number of combinations and checked the documentation for this as a known issue, but have come up short. Does anyone know why this doesn't work?
[Screenshots: the attribute mapping; data deleted from the provisioned user; provisioning the user on demand shows no changes]
Thank you user3269662 for sharing the right document; it will help other members looking for the same thing. Currently, AAD provisioning doesn't send null/empty values in almost all cases, which is why the emptied/null phone number value is not reflected in the SCIM application. You can update the value of the phone number but cannot pass a null value; Microsoft has identified this as a special case and is working on it. To follow progress, you can comment on the same MS Q&A post.
Workaround: you need to manually delete the attribute value from the SCIM application if you set a null value for any attribute in AAD.
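To illustrate that manual cleanup, here is a minimal sketch that clears the attribute directly via the SCIM 2.0 PATCH operation (RFC 7644); the base URL, user ID, and bearer token are placeholders for whatever your SCIM application exposes:

import requests

# Placeholders: substitute your SCIM app's base URL, user ID, and token.
SCIM_BASE = "https://scim.example.com/scim/v2"
USER_ID = "2819c223-7f76-453a-919d-413861904646"
TOKEN = "YOUR_SCIM_BEARER_TOKEN"

# SCIM 2.0 PatchOp: "remove" deletes the attribute value
# that AAD provisioning never cleared.
patch = {
    "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
    "Operations": [{"op": "remove", "path": "phoneNumbers"}],
}

resp = requests.patch(
    f"{SCIM_BASE}/Users/{USER_ID}",
    json=patch,
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/scim+json",
    },
)
resp.raise_for_status()
print("Attribute removed:", resp.status_code)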
I was going through the integration documents available for Snowflake & ServiceNow, but all the documents are oddly focused on Snowflake consuming ServiceNow data for analytics. I didn't find anything related to creating tickets for failures in Snowflake. Is it possible?
This is not about the monitoring & notification aspects of Snowflake, but about connecting to ServiceNow and raising a ticket for query failures (tasks, stored procedures, etc.).
Any ideas?
There's no functionality like that as of now. I can recommend you open an Idea for it, and if enough customers want it, our Product Management team will review it.
For Snowpipe, we found a way to do it: we send the error message to SNS, and then a Lambda function calls the ServiceNow REST API to create a ticket.
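A minimal sketch of such a Lambda, assuming the SNS record carries the Snowpipe error text and using ServiceNow's standard Table API; the instance URL and credentials below are placeholders:

import base64
import json
import os
import urllib.request

# Placeholders: your ServiceNow instance and an integration user's credentials.
SNOW_INSTANCE = "https://yourinstance.service-now.com"
SNOW_USER = os.environ["SNOW_USER"]
SNOW_PASS = os.environ["SNOW_PASS"]

def lambda_handler(event, context):
    # SNS delivers the error notification in the record's Message field.
    message = event["Records"][0]["Sns"]["Message"]

    body = json.dumps({
        "short_description": "Snowpipe load failure",
        "description": message,
    }).encode("utf-8")

    auth = base64.b64encode(f"{SNOW_USER}:{SNOW_PASS}".encode()).decode()
    req = urllib.request.Request(
        f"{SNOW_INSTANCE}/api/now/table/incident",
        data=body,
        method="POST",
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {auth}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return {"statusCode": resp.status}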
For Tasks, we found that it is possible to use External Functions to notify AWS whenever a Task fails, but we haven't implemented it.
Email is a simple way. You need to determine how your ServiceNow instance processes emails; we implemented incident creation from Azure App Insights based on emails.
In ServiceNow, find the Inbound Action you need to process the email, or create one.
ServiceNow provides every instance with an email account (see the ServiceNow documentation on inbound email actions for details).
The instance email is usually xxxx@service-now.com.
If your instance URL is "audi.service-now.com", the email would be "audi@service-now.com".
For a PDI the domain is servicenowdevelopers.com, e.g. dev12345@servicenowdevelopers.com.
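On the sending side, a minimal sketch, assuming you have an SMTP relay available (the relay host, sender, and instance address below are placeholders); your Inbound Action can then parse the subject and body into incident fields:

import smtplib
from email.message import EmailMessage

# Placeholders: your SMTP relay and your instance's inbound address.
SMTP_HOST = "smtp.example.com"
INSTANCE_INBOX = "yourinstance@service-now.com"

msg = EmailMessage()
msg["From"] = "snowflake-alerts@example.com"
msg["To"] = INSTANCE_INBOX
# The Inbound Action can key off the subject to route and categorize.
msg["Subject"] = "Snowflake task failure: LOAD_ORDERS"
msg.set_content("Task LOAD_ORDERS failed; see Snowflake query history for details.")

with smtplib.SMTP(SMTP_HOST) as smtp:
    smtp.send_message(msg)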
My company owns several (verified) facilities, and using my company's email I can see those locations (business.google.com).
Now my company would like to fetch the reviews for each location and present them on our website. We previously used the Google Places API, but since it only returns the latest 5 reviews, we opted to use the Google My Business API to retrieve a location's complete reviews. We'd like our backend (PHP) to retrieve the reviews, so using the same email I created a service account (console.developers.google.com/apis/credentials), because we don't need the end user to allow or interact with anything when browsing our website.
Using Postman (with my signed JWT) I managed to get a valid access token, which I used to retrieve the list of accounts (mybusinessaccountmanagement.googleapis.com/v1/accounts); the response contained only the service account itself.
I then tried calling the account locations API (mybusiness.googleapis.com/v4/accounts/{MY_ACCOUNT_ID_HERE}/locations), but it only returns an empty object response.
Can someone help me resolve this issue? Why can't my service account see the verified locations under my company's email? Is this even possible? Thank you.
Even though this is an older question: I ran into the same issue (empty results) calling the new Google My Business Information v1 API using a service account.
It seems that using service accounts is not recommended; I found this support article from Google: https://support.google.com/business/thread/8281160/cannot-get-access-to-gmb-locations-with-service-account-with-nodejs?hl=en
The "official" recommendation is to use OAuth.
But we finally made it work using a service account. The following steps were necessary (at least for us it is working now):
Add a project in Google Cloud Platform
Add and enable the Account Management and Business Information APIs.
Add the service account and generate a key (https://developers.google.com/identity/protocols/oauth2/service-account#creatinganaccount)
Request access to the Business Profile APIs (you need Google's approval to be able to make requests against the two APIs; otherwise you may run into quota errors, as "Requests per minute" is set to 0 by default). Important: the approval may take up to 2 weeks, but we received ours within about 5 days.
Enable domain-wide delegation for the service account using the scope "https://www.googleapis.com/auth/business.manage". More about domain-wide delegation: https://developers.google.com/identity/protocols/oauth2/service-account#delegatingauthority
Add a user identity in GCP. This user also needs to be added in Google My Business for editing locations. When creating your ServiceAccountCredential object, impersonate this user (see the sketch below).
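Putting the steps together, here is a minimal sketch in Python (the question's backend is PHP, but the flow is identical); the key file name and impersonated email are assumptions for your setup:

from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/business.manage"]

# Placeholders: your downloaded key file and the GMB user you impersonate.
creds = service_account.Credentials.from_service_account_file(
    "service-account-key.json", scopes=SCOPES
)
# Domain-wide delegation: act as the user who can edit the locations.
delegated = creds.with_subject("gmb-editor@yourcompany.com")

session = AuthorizedSession(delegated)

# With impersonation this returns the user's accounts,
# not just the bare service account.
accounts = session.get(
    "https://mybusinessaccountmanagement.googleapis.com/v1/accounts"
).json()

# Read locations via the Business Information API (readMask is required).
account_name = accounts["accounts"][0]["name"]  # e.g. "accounts/1234567890"
locations = session.get(
    f"https://mybusinessbusinessinformation.googleapis.com/v1/{account_name}/locations",
    params={"readMask": "name,title"},
).json()
print(locations)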
Security concerns:
Domain-wide delegation means that anyone knowing/having the credentials of the service account could impersonate any person (identity) from within GCP. In this case it is at least limited to the Business Profile API, but keep this in mind anyway.
Also, authenticating the service account with private keys is not recommended; you should regularly rotate the private key (create a new one), or look at a solution based on Workload Identity.
Hope this helps everyone facing the challenge with GMB / GCP / service accounts :-)
I created a report in Data Studio and embedded it on my website. I activated the option "anyone with the link can view", so the report is visible to my website users.
But I need to show my website users different data depending on their user IDs, and more importantly, I don't want users to be able to see other users' data; with URL filtering, a user could tamper with the URL and substitute another user ID to see that user's data.
Does anyone have a solution for this scenario?
In the Google documentation I saw an option to limit the report to users in my domain. I assume this would solve the issue, but I can't find how to restrict other domains.
Users are logged in to Google
If users of your website are already logged in to Google, use the Filter by email address guide from the Data Studio help center. This requires you to set up FILTER BY EMAIL and then have a field in your data that can be used directly as an email filter.
Users are not logged in to Google
If you want a solution where the users don't have to be logged onto Google, you will need to:
Create a Community Connector to pass the filtered data to your users. The connector should accept a short-lived token as part of the config.
Create a dashboard with your connector and pass unique short-lived tokens for each user.
You should have an endpoint that returns the current user's data based on the token provided. Alternatively, the endpoint can return only the user's identity, and you can query a secondary data source with a service account, filtering for the user's identity.
Your connector should call your endpoint to fetch data only for the user/for the user's identity.
This official guide demonstrates how to implement this in more detail.
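As an illustration of the token-validating endpoint from the steps above, a minimal sketch in Flask; the route name, token store, and sample data are all hypothetical stand-ins:

from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Stand-in for a real short-lived token store: issue a token when the
# user logs in to your site and expire it after a few minutes.
TOKENS = {"tok-abc123": "daniel@example.com"}

# Stand-in for your per-user data.
USER_DATA = {"daniel@example.com": [{"date": "2024-01-01", "revenue": 10}]}

@app.route("/connector-data")
def connector_data():
    token = request.args.get("token", "")
    user = TOKENS.get(token)
    if user is None:
        abort(401)  # unknown or expired token
    # Return only this user's rows; the Community Connector calls this
    # URL with the token it received in its config.
    return jsonify(USER_DATA.get(user, []))

if __name__ == "__main__":
    app.run()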
Disclaimer: I work in the Data Studio team and wrote the above guide.
The first option is to add two extra fields to your data source:
User_ID
Password
For example:
Data, User_ID, Password
$10,Daniel,123
$20,Alex,456
In your dashboard, you need to create two parameters:
User_ID_Parameter
Password_Parameter
Both parameters can have their default value set to null and accept any values.
Then create a new calculated field:
CASE
  WHEN REGEXP_MATCH(User_ID, User_ID_Parameter) AND REGEXP_MATCH(Password, Password_Parameter) THEN 1
  ELSE 0
END
Then create a new filter on the chart you want to hide:
Include rows where the above calculated field is Equal to 1.
The second option is to use Data Studio's built-in row-level security.
The only caveat is that users need to sign in before they can view the report.
I'm using the Az modules in Powershell. I want to get the applications for my subscription ID.
Get-AzContext shows the subscription ID I want.
Get-AzAdApplication returns all of the applications for my company, 600K+ of them.
There doesn't seem to be any effective filtering on Get-AzAdApplication. Its predecessors were Get-AzureAdApplication and Get-AzureRmAdApplication.
My goal is to scan the app registrations and validate the expiration of each service principal's credentials.
I need the ApplicationId to do that, but I can't seem to limit the query except by the number of items returned. I don't really want to crawl through 600K applications.
Any idea if there is a way to proceed?
There is no relation between a subscription and your applications; app registrations live at the tenant level. You should query all applications in your tenant, get the application ID, and then use the Get-AzADAppCredential -ObjectId 35157fe1-6ce8-47f6-9ea8-4d23afd4381d command. You can also use MS Graph to get the expiration date and appId via the servicePrincipals endpoint:
https://graph.microsoft.com/v1.0/servicePrincipals?$select=appId,keyCredentials
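If you go the Graph route, a minimal sketch that pages through the service principals and prints each credential's expiry; it assumes you already obtained a Graph bearer token elsewhere (e.g. (Get-AzAccessToken -ResourceUrl "https://graph.microsoft.com").Token):

import requests

# Assumption: a valid Microsoft Graph token acquired out of band.
TOKEN = "YOUR_GRAPH_TOKEN"
headers = {"Authorization": f"Bearer {TOKEN}"}

url = ("https://graph.microsoft.com/v1.0/servicePrincipals"
       "?$select=appId,displayName,keyCredentials,passwordCredentials")

while url:
    page = requests.get(url, headers=headers).json()
    for sp in page.get("value", []):
        creds = sp.get("keyCredentials", []) + sp.get("passwordCredentials", [])
        for cred in creds:
            print(sp["appId"], sp.get("displayName"), cred.get("endDateTime"))
    # Graph paginates; follow @odata.nextLink until exhausted.
    url = page.get("@odata.nextLink")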