SignInActivity attribute stopped working? - azure-active-directory

I have an automated workbook to keep our Azure environment clean of stale Guest accounts.
For this I use the signInActivity attribute in Azure, through Graph.
This worked fine for about a month, but last week it suddenly stopped working: when I ask for the specific URI, I do not get any results back. If I remove the signInActivity attribute, the query works fine and returns all Guest users.
But I need the signInActivity data for my workbook to work.
Has anyone had the same issue, or is there any update?

Sorry for the delayed response.
Sharing my findings in case anyone comes across a similar issue.
I was able to get signInActivity by using https://graph.microsoft.com/beta/users?$select=displayName,userPrincipalName,signInActivity
The property is returned only on $select. It supports $filter (eq, ne, NOT, ge, le), but not in combination with any other filterable properties. Note: this property requires an Azure AD Premium P1/P2 license and the AuditLog.Read.All permission.
Note: there's a known issue with retrieving this property.
References:
signInActivity resource type &
user resource type
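For anyone scripting this outside a workbook, here is a minimal sketch of the call above in Python with the requests library. The token is a placeholder (acquisition is out of scope), and guests are filtered client-side to avoid combining signInActivity with other filterable properties:

```python
import requests

# Sketch only. Assumes you already have a Graph access token with
# AuditLog.Read.All and the tenant has an Azure AD Premium P1/P2 license.
GRAPH_BETA = "https://graph.microsoft.com/beta"
ACCESS_TOKEN = "<access token>"  # placeholder


def guests_with_sign_in_activity(token: str):
    """Yield guest users together with their signInActivity."""
    # signInActivity is only returned when explicitly selected, and its
    # $filter support does not combine with other filterable properties,
    # so guests are filtered client-side here instead of with $filter.
    url = (
        f"{GRAPH_BETA}/users"
        "?$select=displayName,userPrincipalName,userType,signInActivity"
    )
    headers = {"Authorization": f"Bearer {token}"}
    while url:
        resp = requests.get(url, headers=headers, timeout=30)
        resp.raise_for_status()
        data = resp.json()
        for user in data.get("value", []):
            if user.get("userType") == "Guest":
                yield user
        url = data.get("@odata.nextLink")  # follow pagination


if __name__ == "__main__":
    for user in guests_with_sign_in_activity(ACCESS_TOKEN):
        activity = user.get("signInActivity") or {}
        print(user.get("userPrincipalName"), activity.get("lastSignInDateTime"))
```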

Related

AAD "manager" attribute do not sync from AD

I have a hybrid AD and AAD environment. We use Azure AD to provision users and attributes to one of our SaaS solutions. One of those attributes is "manager". The problem is that once the manager value has been set, it does not update. And since the users are synced from on-prem, the manager field is read-only in AAD, so there is no way for me to update it there.
When I change the value on one of my users and run a sync to Azure AD, I can see one change in the export attribute flow and updates in the delta import. But in AAD the old value persists.
I ran the sync troubleshooter and checked for sync issues in AAD, but no errors are indicated. Everything else seems to work and sync as intended. Does anyone know if there is anything special about "manager"? Any help would be appreciated.
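As a diagnostic step (my suggestion, not something from the thread), you can read back the manager value that AAD currently holds through Graph and compare it with what you expect from the on-prem sync. A minimal sketch with the requests library; the token and UPN are placeholders:

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<access token>"   # placeholder; needs e.g. User.Read.All
USER_ID = "user@contoso.com"      # hypothetical UPN for illustration

# Read the manager currently stored in Azure AD for one user.
resp = requests.get(
    f"{GRAPH}/users/{USER_ID}/manager",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
if resp.status_code == 404:
    print("No manager is set in AAD for this user.")
else:
    resp.raise_for_status()
    manager = resp.json()
    print("Manager in AAD:", manager.get("userPrincipalName"))
```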

Azure AD provisioning requires two runs to succeed with custom app

We've created an application using SCIM 2 SDK from PingIdentity for provisioning with Azure AD. Custom mapping is set up and working.
However, when the user is CREATED, all of the fields are included in the import, but only a few fields are included in the provisioning step and sent to our application. Provisioning needs to run a second time on that user, as an UPDATE, for all the fields to be included. Among other things, this means that first and last name are not split, and only the displayName is sent (which ends up as the first name on our end).
For some users in normal provisioning, it can take days between the create and update runs so we're missing data for a long time.
Does anyone know how we can test for what's causing this and solve it so that all the fields are included in the initial CREATE run for a user?
Here are the attribute mapping settings: https://imgur.com/ypfAAmD
And an example log of when the user is created with only basic fields: https://imgur.com/iOXACJh
vs. when the user is updated with all the other fields: https://imgur.com/UqDNyCv
I'm a product manager at Microsoft who works on the provisioning service and our SCIM client.
The behavior you're seeing occurs when attributes that are not part of the SCIM core schema are included as "short" names. Attributes not defined in the SCIM core schema (RFC 7643) should use full URN syntax; something to the effect of urn:ietf:params:scim:schemas:extension:appName:2.0:User:attributeName is commonly used by other implementations. The shaky behavior you're seeing, where the AAD provisioning service fails to send these attribute values via a POST but later includes them in a PATCH, comes down to different code paths in the provisioning service: the PATCH code happens to handle this differently than the POST code. This is purely by chance, however, and isn't an intentional design choice. At some future point I'm hoping we'll make this more consistent and disallow incorrectly structured attribute names entirely.
If you adjust your attribute names to align with the guidance in the SCIM spec's schema RFC and provide the attributes with fully defined URNs, you should see consistent behavior that works on both POST and PATCH.
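To make that guidance concrete, here is a hedged sketch of what a SCIM user payload looks like when custom attributes are carried under a fully qualified schema URN rather than as bare "short" names. The extension URN and the costCenter attribute are invented for illustration, not taken from your app:

```python
# Sketch only: fully qualified URNs for non-core SCIM attributes (RFC 7643).
CORE_USER = "urn:ietf:params:scim:schemas:core:2.0:User"
MY_APP_EXT = "urn:ietf:params:scim:schemas:extension:myApp:2.0:User"  # hypothetical

user_payload = {
    "schemas": [CORE_USER, MY_APP_EXT],
    "userName": "jdoe@example.com",
    "name": {"givenName": "Jane", "familyName": "Doe"},
    # Custom attributes live under the extension URN, not as bare top-level
    # short names such as "costCenter".
    MY_APP_EXT: {
        "costCenter": "4130",
    },
}
```

In the Azure AD attribute-mapping UI, the same attribute would then be addressed by its full path, e.g. urn:ietf:params:scim:schemas:extension:myApp:2.0:User:costCenter.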

Deleting data from an Azure AD user field doesn't trigger a provisioning change in SCIM

I have SCIM provisioning set up and connected to Azure AD using a custom application that isn't in the marketplace. Provisioning new users and changing data on existing ones appears to work fine. But when I delete data from a previously synced field, I'm not seeing any change to remove this data in the SCIM application.
I've tried any number of combinations and checked the documentation for this as a known issue, but have come up short. Does anyone know why this doesn't work?
Screenshots: attribute mapping; data deleted from the provisioned user; provisioning the user on demand shows no changes.
Thank you user3269662 for sharing the right document; it will help other members looking for the same thing. Currently, AAD provisioning does not send null/empty values in almost all cases, which is why the emptied phone number value is not reflected in the SCIM application. You can update the value of the phone number, but you cannot pass a null value. Microsoft has flagged this as a special consideration and is working on it. To follow progress, you can comment on the same MS Q&A post.
Workaround: you need to manually delete the attribute value in the SCIM application whenever you clear the value of an attribute in AAD.
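For that workaround, one option is to send the removal to the SCIM endpoint yourself. A sketch under assumptions: the SCIM base URL, bearer token, and user id are placeholders, and your app must support PATCH remove operations:

```python
import requests

SCIM_BASE = "https://scim.example.com/scim/v2"   # hypothetical SCIM endpoint
SCIM_TOKEN = "<bearer token>"                    # placeholder
SCIM_USER_ID = "<scim user id>"                  # placeholder

# SCIM 2.0 PATCH that removes the phoneNumbers attribute, which AAD
# provisioning will not do on its own when the value is cleared in AAD.
patch_body = {
    "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
    "Operations": [{"op": "remove", "path": "phoneNumbers"}],
}

resp = requests.patch(
    f"{SCIM_BASE}/Users/{SCIM_USER_ID}",
    json=patch_body,
    headers={
        "Authorization": f"Bearer {SCIM_TOKEN}",
        "Content-Type": "application/scim+json",
    },
    timeout=30,
)
resp.raise_for_status()
```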

What is the WORKSHEETS_APP_USER in Snowflake

While looking at our snowflake.account_usage.login_history in order to identify users with outdated client drivers (using reported_client_type + reported_client_version), I came across this user_name that I did not recognize: WORKSHEETS_APP_USER.
It's not one of our users, so I'm wondering where it's coming from.
The client driver it's using is OTHER 1.1.5.
It's using OAUTH_ACCESS_TOKEN to authenticate (which is not an authentication method we use for Snowflake).
And it's using a ton of different IPs in the 10.4.* range.
It has a lot more logins during the week than on the weekend, so it's probably driven by humans.
I'm thinking it's probably related to the worksheets UI (either in Snowsight or in the old console).
If so, would there be any way to know who was the original user(s) behind this activity?
The first time Snowsight is accessed in an account, Snowflake creates an internal WORKSHEETS_APP_USER user to support the web interface. This user is used to cache query results in an internal stage in your account. For more information, see Getting Started With Snowsight.
https://docs.snowflake.com/en/sql-reference/account-usage/users.html#usage-notes
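If you want to see this internal user in your own account's history, here is a minimal sketch of the kind of LOGIN_HISTORY query discussed above. Connection parameters are placeholders; it needs the snowflake-connector-python package and a role with ACCOUNT_USAGE access:

```python
import snowflake.connector

# Placeholders: fill in your own account and credentials.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    role="ACCOUNTADMIN",  # or any role granted ACCOUNT_USAGE access
)

# Summarize logins by the internal Snowsight user, including the client
# driver and version reported in LOGIN_HISTORY.
sql = """
select reported_client_type,
       reported_client_version,
       count(*) as logins,
       min(event_timestamp) as first_seen,
       max(event_timestamp) as last_seen
from snowflake.account_usage.login_history
where user_name = 'WORKSHEETS_APP_USER'
group by 1, 2
order by logins desc
"""

for row in conn.cursor().execute(sql):
    print(row)
conn.close()
```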

How do I access reports programmatically in Salesforce using Apex

I'm trying to write an app on the Salesforce platform that can pull a list of contacts from a report and send them to a web service (say, to send them an email or SMS).
The only way I can seem to find to do this is to add the report results to a newly created campaign and then access that campaign. This seems like the long way around.
Every post I read online says you can't access reports through Apex; however, most or all of these posts were written before version 20 of the API was released last month, which introduced a new Report object. I can now programmatically access info about a report (such as the date last run, etc.), but I still can't find a way to access the result data contained in that report.
Does anyone know if there's a way to do that?
After much research into it, I've discovered the only way to do this at the moment is indeed to scrape the CSV export. I would guess that Conga etc. are using exactly this method.
We've been doing this for a while now, and it works. The only caveats are:
- The Salesforce username / password / security token has to be shared with the connecting app. If the password changes (and by default it is changed every 30 days or so), the token also changes and must be re-entered.
- You have to know the host of the account, which can be difficult to get right. For instance, while most European accounts would use emea.salesforce.com to access the CSV, our account uses na7 (North America 7) even though we're located in Ireland. I'm currently sending the page host to the app and parsing it to calculate the correct subdomain to use, but I think there has to be a better way to do this.
Salesforce really needs to sort this out by supplying an API call which allows custom report results to be exported on the fly and allowing us to use OAuth to connect to it. But of course, this is unlikely to happen.
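For what it's worth, here is a hedged sketch of the CSV-scraping approach described above. The report id and credentials are placeholders, the export=1&xf=csv URL is the legacy web-UI export and may not work on current orgs, and simple_salesforce is used here only to obtain a session id:

```python
import requests
from simple_salesforce import Salesforce  # third-party; handles login

# Placeholders for illustration only.
sf = Salesforce(
    username="<username>",
    password="<password>",
    security_token="<security token>",
)

REPORT_ID = "00OxxxxxxxxxxxxXXX"  # hypothetical report id

# Legacy web-UI export URL: downloads the report results as CSV using the
# authenticated session cookie. This is the "scraping" hack, not a real API.
url = f"https://{sf.sf_instance}/{REPORT_ID}?export=1&enc=UTF-8&xf=csv"
resp = requests.get(url, cookies={"sid": sf.session_id}, timeout=60)
resp.raise_for_status()

with open("report.csv", "wb") as f:
    f.write(resp.content)
```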
In the Salesforce Spring '11 update, it seems you can obtain more information about Reports:
As stated in the API docs for Report and ReportType, you can access via Apex the fields used in the Report's query by reading the "columns" field, as well as the field used to represent the filters, called "filter".
Iterating through these objects should allow you to build a string representing the same query as the Report. After building that string you can run a dynamic query with a Database.query(..) call.
It seems a little messy, but it should work (NOT TESTED YET!).
As the header states, this works only with Custom Reports!
Just to clarify for fellow rookies who find this: when the question was asked you could access your report data programmatically, but you had to use some hacky, error-prone methods.
This is all fixed; you can now access your reports via the API as of Winter '14.
Documentation here - http://www.salesforce.com/us/developer/docs/api_analytics/index.htm
Go to town on those custom dashboards etc. Cross-posted from the Salesforce Stack Exchange - https://salesforce.stackexchange.com/questions/337/can-report-data-be-accessed-programatically/
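A hedged sketch of calling that Analytics (Reports and Dashboards) REST API from outside the platform; the report id, API version, and credentials are placeholders, and simple_salesforce is used only for authentication:

```python
import requests
from simple_salesforce import Salesforce  # used here only to log in

sf = Salesforce(
    username="<username>",
    password="<password>",
    security_token="<security token>",
)

REPORT_ID = "00OxxxxxxxxxxxxXXX"   # hypothetical report id
API_VERSION = "v29.0"              # Winter '14 or later

# Run the report synchronously and fetch its results as JSON.
url = (
    f"https://{sf.sf_instance}/services/data/{API_VERSION}"
    f"/analytics/reports/{REPORT_ID}"
)
resp = requests.get(
    url,
    params={"includeDetails": "true"},  # include detail rows, not just totals
    headers={"Authorization": f"Bearer {sf.session_id}"},
    timeout=60,
)
resp.raise_for_status()
report = resp.json()

# The factMap holds the report data; "T!T" is the grand-total grouping key.
rows = report["factMap"]["T!T"].get("rows", [])
print(f"{len(rows)} detail rows returned")
```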
But Conga (AppExtremes) do this in their QuickMerge product, where the user specifies the report Id and the Apex script on the page runs the report to extract the results for a mail-merge operation.
The v20.0 API added metadata about reports, but no way to actually run a report and obtain the results. If it's a standard report, or a report you've defined, you can work out the equivalent SOQL query for your report and run that, but if it's an end-user-defined report, there's no way to do this.
