Salesforce Static Resource Limit 250MB

I am trying to install a managed package in a Salesforce org, but I get an error saying the package cannot be installed because it would exceed the static resource limit.
Our org's static resource usage has reached around 350 MB, but per the Salesforce documentation, an org's static resource limit is 250 MB. When I run a query, I see only about 30 MB of unmanaged static resources in the org. Is there any Salesforce documentation showing that the 250 MB static resource limit includes both managed and unmanaged static resources? I need some solid evidence to show my team members.

Yes, see the documentation below. In my experience, managed packages (including Salesforce-managed packages) usually account for most of the usage.
Static resources in managed packages do count against the 250 MB per Org limit. Attempting to install a managed package that would cause your Org to exceed the static resources limit will cause the package install to fail.
Knowledge Article Number
000316981
https://help.salesforce.com/s/articleView?id=000316981&type=1
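To see for yourself how the 250 MB splits between managed and unmanaged resources, you can aggregate `StaticResource` record sizes by namespace. A minimal sketch of that aggregation in Python (run the SOQL query through any API client you like; the example records below are made-up placeholders, not real data):

```python
# The 250 MB limit covers managed and unmanaged resources together.
# Run this SOQL through any API client (Workbench, Developer Console,
# simple_salesforce, ...), then aggregate the results:
SOQL = "SELECT NamespacePrefix, BodyLength FROM StaticResource"

def usage_by_namespace(records):
    """Sum BodyLength per namespace. NamespacePrefix is None for
    unmanaged resources and set for managed-package resources."""
    usage = {}
    for rec in records:
        ns = rec["NamespacePrefix"] or "(unmanaged)"
        usage[ns] = usage.get(ns, 0) + rec["BodyLength"]
    return usage

# Hypothetical query results, mirroring the numbers in the question:
records = [
    {"NamespacePrefix": None, "BodyLength": 30 * 1024 * 1024},
    {"NamespacePrefix": "some_pkg", "BodyLength": 320 * 1024 * 1024},
]
for ns, size in sorted(usage_by_namespace(records).items()):
    print(f"{ns}: {size / (1024 * 1024):.0f} MB")
```

A breakdown like this makes it easy to show teammates which managed package is consuming the bulk of the limit.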

Related

Free tier, gcloud source repos clone : This API method requires billing to be enabled

I created a Google App Engine app years ago. It uses standard Python 2.7, and I created it using the free tier. It is a simple Python project (3-4 files), and today I wanted to add a file to it for testing. I have not used it for so long that I forgot how I did it before. Anyway, I tried gcloud source repos clone and gcloud source repos list, but all of those commands gave the same error.
Error: (gcloud.source.repos.clone) PERMISSION_DENIED: This API method requires billing to be enabled. Please enable billing on project #(number) by visiting https://console...
I rechecked their pricing page, and it seems they are still providing a free tier. So why does it require billing just to download the source files of my own project, or even the python-gae-quickstart? Did I do something wrong, or does it now require billing even for free tiers?
The free tier is based on usage of certain resources; Google requires a valid, enabled billing account on the project in case you exceed the free tier's limits.
For example, according to the docs, App Engine has 1 GB of egress per day; however, if you exceed that daily allowance, you will be billed for the difference.
Keep in mind that they do not know in advance what will happen on the project, so even if you just want to download something and are not going to use any API, they will still ask you for a billing account.
If you have more doubts, contact their billing support here, and they should be able to provide further help.
Hope this helps!
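The overage model described above is simple to reason about: you are billed only for usage beyond the free allowance. A toy sketch of the arithmetic (the 1 GB/day egress allowance is from the answer above; the per-GB price is a made-up placeholder, not a real GCP rate):

```python
# Billable usage = max(0, actual usage - free allowance).
FREE_EGRESS_GB_PER_DAY = 1.0  # free-tier allowance cited above
PRICE_PER_GB = 0.12           # hypothetical price, not a real rate

def daily_egress_charge(egress_gb):
    """Charge for one day's egress under the free-tier model."""
    billable = max(0.0, egress_gb - FREE_EGRESS_GB_PER_DAY)
    return billable * PRICE_PER_GB

print(daily_egress_charge(0.4))  # within the free tier -> 0.0
print(daily_egress_charge(3.5))  # 2.5 GB billable
```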

A plea for a basic Notebook example getting data into and out of Google Cloud Datalab

I have started trying to use Google Cloud Datalab. While I understand it is a beta product, I find the docs very frustrating, to say the least.
The questions here and the lack of responses, as well as the lack of new revisions or docs over the several months the project has been available, make me wonder if there is any commitment to the product.
A good start would be a notebook that shows data ingestion from external sources into both the Datastore system and the BigQuery system. That is a common use case. I'd like to use my own data, and it would be great to have a notebook to ingest it. It seems that should be doable without huge effort, and it would get me (and others) out of this mess of trying to link the terse docs from various products and workspaces together.
In addition, a better explanation of the GitHub connection process would help (see my prior question).
For BigQuery, see here: https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/tutorials/BigQuery/Importing%20and%20Exporting%20Data.ipynb
For GCS, see here: https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/tutorials/Storage/Storage%20Commands.ipynb
Those are the only two storage options currently supported in Datalab (which should not be used in any event for large scale data transfers; these are for small scale transfers that can fit in memory in the Datalab VM).
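The notebooks linked above use Datalab's cell magics. For context, a BigQuery load from GCS boils down to a single REST "load job"; this dict mirrors that job configuration (the project, dataset, table, and bucket names are placeholders), and could be posted to the BigQuery jobs endpoint with any HTTP client:

```python
# Sketch of a BigQuery load-job configuration for ingesting a CSV
# from GCS. All names here are hypothetical placeholders.
load_job = {
    "configuration": {
        "load": {
            "sourceUris": ["gs://my-bucket/data.csv"],
            "destinationTable": {
                "projectId": "my-project",
                "datasetId": "my_dataset",
                "tableId": "my_table",
            },
            "sourceFormat": "CSV",
            "skipLeadingRows": 1,   # skip the header row
            "autodetect": True,     # infer the schema from the file
        }
    }
}
```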
For Git support, see https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/intro/Using%20Datalab%20-%20Managing%20Notebooks%20with%20Git.ipynb. Note that this has nothing to do with Github, however.
As for the low level of activity recently, that is because we have been heads down getting ready for GCP Next (which happens this coming week). Once that is over we should be able to migrate a number of new features over to Datalab and get a new public release out soon.
Datalab isn't running on your local machine. Just the presentation part is in your browser. So if you mean the browser client machine, that wouldn't be a good solution - you'd be moving data from the local machine to a VM which is running the Datalab Python code (and this VM has limited storage space), and then moving it again to the real destination. Instead, you should use the cloud console or (preferably) gcloud command line on your local machine for this.
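Following the advice above to use the command line on your local machine, uploading data to GCS is a one-line gsutil call. A small sketch that builds and (optionally) runs that command from Python; the bucket name is a placeholder:

```python
# Copy a local file straight to GCS from your own machine, rather
# than routing data through the Datalab VM. Bucket name is made up.
import subprocess

def gsutil_cp_cmd(local_path, bucket, dest_name):
    """Build the gsutil command to copy a local file to GCS."""
    return ["gsutil", "cp", local_path, f"gs://{bucket}/{dest_name}"]

cmd = gsutil_cp_cmd("data.csv", "my-bucket", "data.csv")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually copy
```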

SALESFORCE QUERY ALL - NoClassDefFoundError

When I try to execute the SALESFORCE QUERY ALL activity (on the server), I get java.lang.NoClassDefFoundError. From what I found on Google, a NoClassDefFoundError in Java occurs when the Java Virtual Machine cannot find at runtime a class that was available at compile time.
But I'm not sure about the exact resolution in TIBCO. Is it an issue with the installation of the Salesforce plugin?
I think this is due to a missing JDBC driver for your database vendor.
Check this

IllegalAccessException on protected class member while parsing Excel 2007 file using Apache POI library on AppEngine

I am trying to parse an Excel 2007 (.xlsx) file using the Apache POI library on Google App Engine, but while doing so I get an exception (see below).
java.lang.IllegalAccessException: Class com.google.appengine.tools.development.agent.runtime.Runtime$21 can not access a member of class org.apache.poi.xssf.usermodel.XSSFSheet with modifiers "protected"
So I checked with the Apache POI team, but they claim it's an App Engine issue. I am not sure what the right place for App Engine questions is, but I know a lot of App Engine developers monitor Stack Overflow, so I'm posting the question here.
Bug filed for Apache POI team : https://issues.apache.org/bugzilla/show_bug.cgi?id=55665
This bug has a sample maven project, and instructions to reproduce it.
I am not sure how to attach this zip file here.
If anyone knows how to fix this, please let me know, or point me to the right place to file a bug.
The key part of the stacktrace is:
java.lang.IllegalAccessException: Class com.google.appengine.tools.development.agent.runtime.Runtime$21 can not access a member of class org.apache.poi.xssf.usermodel.XSSFSheet with modifiers "protected"
at sun.reflect.Reflection.ensureMemberAccess(Reflection.java:105)
at com.google.appengine.tools.development.agent.runtime.Runtime$22.run(Runtime.java:488)
at java.security.AccessController.doPrivileged(Native Method)
at com.google.appengine.tools.development.agent.runtime.Runtime.checkAccess(Runtime.java:485)
at com.google.appengine.tools.development.agent.runtime.Runtime.checkAccess(Runtime.java:479)
at com.google.appengine.tools.development.agent.runtime.Runtime.newInstance_(Runtime.java:123)
at com.google.appengine.tools.development.agent.runtime.Runtime.newInstance(Runtime.java:135)
at org.apache.poi.xssf.usermodel.XSSFFactory.createDocumentPart(XSSFFactory.java:60)
I've run into the same issue. I think this is only an issue with the development server. Admittedly, this doesn't fully answer your question, but I guess the situation at least isn't as bad as you'd think. To get around the issue, I've been developing my POI code in a standard Java project (using dummy data) and then copying it into the App Engine project.
I've logged the issue with Google: https://code.google.com/p/googleappengine/issues/detail?id=11752
If you're interested, in the process of logging the issue, I created a sample project which is also available on App Engine (which works as it's running in the production environment).
Sample project: https://bitbucket.org/bronze/jakarta-poi-issue
App running on production environment: http://bronze-gae-poi-issue.appspot.com/

User uploads with fixed per user quota in DotNetNuke

I'm running a DotNetNuke 7.0 Community Edition installation and I'm currently looking for a way to allow users to upload their own content into their own directories. I would also like each user to have a maximum storage limit of, for instance, 2 GB. Perhaps there is already a built-in solution for this scenario, but I'm also willing to spend money on a commercial module.
I have not found a built-in setting that lets me define a per-user quota, nor have I been able to find such a module in the store http://store.dnnsoftware.com, despite searching for several hours.
I even decompiled the DotNetNuke.dll in my installation directory and noticed it has members called UserQuota in DotNetNuke.Portals.PortalSettings and DotNetNuke.Entities.Portals.PortalInfo, but I still failed to find where to define a quota for my users. Is this a Professional/Enterprise feature only, by any chance?
Any help would be greatly appreciated. If there's no such module I can also write a custom module, but instead of reinventing the wheel I'd love to hear your ideas first.
Thanks.
For future reference:
I ended up coding a custom DNN upload plugin which stores all the files a user uploads into their own directory and enforces the maximum storage space each user has. If you need this for your own project, just drop me a message for the .zip.
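The quota logic behind such a plugin is straightforward: sum the user's directory and reject any upload that would push it past the limit. Sketched here in Python for illustration (an actual DNN module would of course be .NET; the 2 GB quota mirrors the figure in the question):

```python
# Illustrative per-user quota check: reject uploads that would push
# a user's directory past their storage limit.
import os

QUOTA_BYTES = 2 * 1024 ** 3  # 2 GB, as in the question

def dir_size(path):
    """Total size in bytes of all files under path."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total

def can_upload(user_dir, upload_size, quota=QUOTA_BYTES):
    """True if the upload fits within the user's remaining quota."""
    return dir_size(user_dir) + upload_size <= quota
```

In a real module you would cache the directory size rather than walking the tree on every upload, and update the cached total as files are added or deleted.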
DotNetNuke has a portal-level quota for file space that you can set. This is available under "Admin" -> "Site Settings" -> "Host Settings" (on the Advanced tab).
However, this is for the entire portal. I am not aware of any user specific, or folder specific quota mechanism for DotNetNuke.
