Quota of AAD Groups - azure-active-directory

I understand there is a quota on the number of AAD groups; it seems the default is 250.
https://learn.microsoft.com/en-us/azure/active-directory/users-groups-roles/directory-service-limits-restrictions
--> A user can create a maximum of 250 groups in an Azure AD organization.
And I believe this number can be increased by submitting an increase request in the portal.
[Resolve errors for resource quotas]
https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/error-resource-quota
My question is: what is the current maximum that the AAD group limit can be increased to?
For instance, is it possible to increase the maximum number of AAD groups to 1,000?
Your help is highly appreciated.
Thank you in advance.

As far as I know, there is no specific limit on the total number of groups. The official document you provided only says that one user can create a maximum of 250 groups in an Azure AD organization. So I think there is no problem if you want to create 1,000 groups (with multiple users).
Here is a Stack Overflow post related to your question, in which the OP creates 3,000 groups.
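Since the 250-group limit is counted per creating user, one way to create a large number of groups is to spread creation across several users' credentials. A minimal sketch against the Microsoft Graph REST API, assuming you already hold a delegated access token for each creating user (the token list and group names below are hypothetical):

import requests

GRAPH_GROUPS_URL = "https://graph.microsoft.com/v1.0/groups"

def create_security_group(access_token, display_name, mail_nickname):
    # Creates one security group on behalf of the user who owns the token.
    # The 250-group limit applies per creating user, so spreading creation
    # across several users' tokens keeps each of them under the quota.
    body = {
        "displayName": display_name,
        "mailEnabled": False,
        "mailNickname": mail_nickname,
        "securityEnabled": True,
    }
    resp = requests.post(
        GRAPH_GROUPS_URL,
        headers={"Authorization": "Bearer " + access_token},
        json=body,
    )
    resp.raise_for_status()
    return resp.json()["id"]

# user_tokens is a hypothetical list of delegated tokens, one per creating user.
# for i, token in enumerate(user_tokens):
#     create_security_group(token, "Team %03d" % i, "team%03d" % i)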

Related

Limits for simultaneous connections Firebase

I am new to Firebase and I have heard that there is a limit on simultaneous connections.
My app doesn't need to store who is online at a specific moment, just some details about each account (name, picture, etc.). So I don't know whether those limitations apply to this type of app or not. If they do, what is the limit? In a previous post somebody said it is 100,000 users, but in their documentation I found something about 1,000,000 users ("Maximum concurrent connections for mobile/web clients per database: 1,000,000"). Furthermore, there is no mention of simultaneous connections in their price calculator.
Also, if I have misunderstood anything about "simultaneous connections", please explain here.
Have a good day and thank you for your time!
There is no cost for simultaneous connections. See my answer to Price for simultaneous database connection in firebase.
The Firebase Realtime Database allows up to 100 concurrent connections per project on its free plan, and up to 200,000 concurrent connections per database on its paid plans.
Cloud Firestore has no limit on the number of concurrent connections, neither on the free nor the paid plan.
These limits apply regardless of the application type and depend on how your code uses the database. Since you didn't share (and it seems didn't write) any code yet, it's hard to say more.
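Since the app only stores a few details per account, here is a minimal sketch of what such a write could look like with the Python Admin SDK and Cloud Firestore (which, as noted above, has no concurrent-connection limit); the service-account path, collection, and field names are assumptions:

import firebase_admin
from firebase_admin import credentials, firestore

# Hypothetical service-account path; concurrent-connection limits apply to
# long-lived client SDK connections, not to one-off server-side writes.
cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred)
db = firestore.client()

def save_profile(uid, name, picture_url):
    # Stores the per-account details the question mentions (name, picture, ...).
    db.collection("users").document(uid).set({
        "name": name,
        "picture": picture_url,
    })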

Grafana Monitoring Active DB Sessions - Dynamically

I've been thinking about how to monitor active Oracle sessions via the V$SESSION view.
The things I'd like to do are:
Graph (through a simple Grafana graph panel) the sum of all active sessions every 5 minutes. I'm already working on a Perl script to do this.
Extract one or more "owners" of the highest number of active sessions on each poll and visualize them as a popup on the graph or in some other way.
I just want to know who is causing the high number of sessions on the DB.
Is it possible to do this? I hope so :D
Thanks in advance!
How many users do you have? If it's fewer than, say, 1,000, my suggestion would be to have your script report the number of active sessions for each user separately; that way you can easily visualize not only the total active sessions but also the active sessions per user.
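The per-user breakdown suggested above boils down to a GROUP BY over V$SESSION. A minimal sketch in Python (the OP mentioned Perl; the query is the same either way), assuming cx_Oracle and hypothetical connection details:

import cx_Oracle

# Hypothetical connection string; a real poller would read credentials from
# configuration rather than hard-coding them.
conn = cx_Oracle.connect("monitor_user/secret@dbhost:1521/ORCLPDB1")

QUERY = """
    SELECT username, COUNT(*) AS active_sessions
      FROM v$session
     WHERE status = 'ACTIVE'
       AND username IS NOT NULL   -- skip background/internal sessions
     GROUP BY username
     ORDER BY active_sessions DESC
"""

cursor = conn.cursor()
cursor.execute(QUERY)
for username, active_sessions in cursor:
    # A real poller would push one time series per user (plus a total) to
    # whatever datasource Grafana reads from, every 5 minutes.
    print("%s %d" % (username, active_sessions))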

"Daily Usage" Quota is a per-project limit?

I have a question about the Gmail API limits.
Regarding the "Daily Usage" quota at the URL below, is this a per-project limit?
Is there a way to relax this limitation?
https://developers.google.com/gmail/api/v1/reference/quota
We currently develop services using the Gmail API for multiple business customers.
My concern is that the quota will be exceeded once the total number of users becomes large, which is why I am asking.
Is it possible to create a different project for each customer and thereby avoid the limit?
In that case, do I have to apply for "OAuth Developer Verification" for each project?
https://support.google.com/code/contact/oauth_app_verification
Understanding the quotas can be a little hard. The way I remember it: if the quota name has "user" in it, it's a per-user quota.
This one is project-based:
Daily Usage: 1,000,000,000 quota units per day
This one is user-based:
Per User Rate Limit: 250 quota units per user per second, moving average (allows short bursts)
Some quotas can be increased; you should check the quota section of the Google developers console. Under "Enabled APIs and services", click the API in question and go to the "Quotas" tab in the new window. If there is a pencil icon, you can increase it.
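Whichever project layout you choose, hitting either limit surfaces as an HTTP 403/429 error from the API, so clients are expected to back off and retry. A minimal sketch with the Python client (the function name and retry count are just for illustration, and creds is assumed to be an already-authorized credentials object):

import random
import time

from googleapiclient.discovery import build
from googleapiclient.errors import HttpError

def list_messages_with_backoff(creds, user_id="me", max_retries=5):
    # Builds a Gmail API client and retries when the per-user rate limit
    # (or the daily project quota) pushes back with a 403/429 error.
    service = build("gmail", "v1", credentials=creds)
    for attempt in range(max_retries):
        try:
            return service.users().messages().list(userId=user_id).execute()
        except HttpError as err:
            if err.resp.status in (403, 429):
                # Exponential backoff with jitter before retrying.
                time.sleep((2 ** attempt) + random.random())
            else:
                raise
    raise RuntimeError("Gave up after %d retries" % max_retries)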

Is there a maximum number of write operations per minute?

Somehow I got it into my head that the Google App Engine datastore only allows 1,000 writes per minute.
Since I couldn't find any information on the web or in the quota docs, I just wanted to ask whether someone can verify this. Thanks
I've never seen any mention of this. There's a 1 write per second limit on an entity group, but you should be able to write to a very large number of entity groups at the same time.
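To make the entity-group distinction concrete, here is a minimal sketch with the old db API (LogEntry and the "Journal" parent key are hypothetical names):

from google.appengine.ext import db

class LogEntry(db.Model):
    message = db.StringProperty()

# All of these share one parent key, i.e. one entity group, so their writes
# are serialized at roughly 1 write per second in total.
journal_key = db.Key.from_path("Journal", "main")
for i in range(10):
    LogEntry(parent=journal_key, message="entry %d" % i).put()

# These are root entities: each is its own entity group, so the per-group
# write limit does not throttle the overall write rate.
for i in range(10):
    LogEntry(message="entry %d" % i).put()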

Google app engine excessive small datastore operations

I'm having some trouble with the Google App Engine datastore. Ever since the new pricing model was introduced, the cost of running my app has increased massively.
The culprit appears to be "Datastore small operations", which come in at more than 20 million ops per day!
Has anyone else had this problem? I don't think I'm doing an excessive number of key lookups, and I only have 5,000 users, with roughly 10-20 requests per minute.
Thanks in advance!
Edit
OK, got some stats; these are after about 3 hours. Here is what I am seeing in my dashboard, in the billing section:
And here are some of the stats:
Obviously there are quite a lot of calls to datastore.get. I am starting to think that my design is causing the problem. Those gets correspond to accounts. Every user has an account, but an account can be one of two types; for this I use composition, so each account entity has a link to its sub-account entity.
As a result, when I do a search for nearby users, it involves fetching the accounts with a query and then doing a get on each account to retrieve its sub-account. The top request in the stats picture is a call that gets 100 accounts and then has to do a get on each one. I would have thought this was a very light query, but I guess not. And I am still confused by the number of datastore small ops being recorded in my dashboard.
Definitely use appstats as Drew suggests; regardless of what library you're using, it will tell you what operations your handlers are doing. The most likely culprits are keys-only queries and count operations.
My advice would be to use AppStats (Python / Java) to profile your traffic and figure out which handler is generating the most datastore ops. If you post the code here we can potentially suggest optimizations.
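For the Python runtime, AppStats is enabled with a small appengine_config.py; a minimal sketch of the standard middleware hookup (adjust if you use a non-webapp framework):

# appengine_config.py
from google.appengine.ext.appstats import recording

def webapp_add_wsgi_middleware(app):
    # Wrap every handler so AppStats records the RPCs (datastore gets,
    # queries, etc.) issued while serving each request.
    return recording.appstats_wsgi_middleware(app)

With the appstats builtin enabled in app.yaml, the recorded traces show up under /_ah/stats, broken down per handler.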
Don't scan your datastore; use get(key), get_by_id(id), or get_by_key_name(keyname) as much as you can.
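For example (the Account model and its key naming are hypothetical), the difference looks like this:

# A query has to scan an index before returning the entity:
account = Account.all().filter("email =", email).get()

# A direct lookup by key name skips the query entirely:
account = Account.get_by_key_name(email)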
Do you have lots of ReferenceProperty properties in your models? Accessing them triggers a db.get for each property unless you prefetch them. The example below triggers 101 datastore requests: the initial query plus a db.get for each of the 100 referenced users.
from google.appengine.ext import db

class Foo(db.Model):
    # User is assumed to be another db.Model defined elsewhere.
    user = db.ReferenceProperty(User)

foos = Foo.all().fetch(100)
for f in foos:
    print f.user.name  # each access to f.user dereferences the reference with its own db.get()
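To avoid those per-entity gets, the referenced entities can be fetched in one batch and patched back onto the ReferenceProperty fields. A sketch of that common prefetch recipe (prefetch_refprops is a helper name used here for illustration, not part of the db API):

def prefetch_refprops(entities, *props):
    # Collect the raw keys stored in the ReferenceProperty fields without
    # dereferencing them (get_value_for_datastore returns just the key).
    fields = [(entity, prop) for entity in entities for prop in props]
    ref_keys = [prop.get_value_for_datastore(entity) for entity, prop in fields]
    # One batch db.get for all distinct referenced entities.
    ref_entities = dict((e.key(), e) for e in db.get(list(set(ref_keys))))
    # Patch the resolved entities back onto the ReferenceProperty fields.
    for (entity, prop), ref_key in zip(fields, ref_keys):
        prop.__set__(entity, ref_entities[ref_key])
    return entities

# foos = prefetch_refprops(Foo.all().fetch(100), Foo.user)
# Accessing f.user.name afterwards no longer issues extra db.get calls.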
