Will full-text search index be limited in size? - google-app-engine

We are about to migrate our app to FTS to replace datastore indexes.
Will there be a limit on index size once the FTS is out of its experimental state, or is it unlimited like the datastore?

Once the Search API is out of its experimental state, there should be no limit on index size.

As of App Engine version 1.9.0: "The size limit on the Search API is now computed and enforced on a per-index basis, rather than for the app as a whole. The per-index limit is now 10GB. There is no fixed limit on the number of indexes, or on the total amount of Search API storage an application may use."

Related

Solr Cloud Index Sizes

What is the SolrCloud equivalent of ElasticSearch's _cat queries for determining index sizes?
In ElasticSearch I can simply call
http://localhost:9200/_cat/indices/?v&s=index
or
http://localhost:9200/_cat/shards/?v
to see how much space (in bytes) an index or a shard uses. The only way I have found in SolrCloud so far is to add up the file sizes on every node. Is there any way to ask SolrCloud directly for this information?
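One possibility is Solr's Metrics API (available since Solr 6.4), which reports a per-core INDEX.sizeInBytes metric on each node. The sketch below assumes that endpoint and response shape; the sample response dict is invented for illustration, so verify the exact format against your Solr version:

```python
# Sketch (assumptions): Solr's Metrics API exposes per-core index size at
# /solr/admin/metrics?group=core&prefix=INDEX.sizeInBytes on each node.
# The response shape used here is an assumption; check your Solr version.

def total_index_bytes(metrics_response):
    """Sum INDEX.sizeInBytes across all cores in one node's metrics response."""
    total = 0
    for core_metrics in metrics_response.get("metrics", {}).values():
        total += core_metrics.get("INDEX.sizeInBytes", 0)
    return total

# Hypothetical response covering two replicas of one collection:
sample = {
    "metrics": {
        "solr.core.coll1.shard1.replica_n1": {"INDEX.sizeInBytes": 1048576},
        "solr.core.coll1.shard2.replica_n2": {"INDEX.sizeInBytes": 524288},
    }
}
print(total_index_bytes(sample))  # 1572864
```

You would still need to query each node (or aggregate replicas), but this avoids walking the filesystem by hand.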

Is there a cost to using google realtime api? What is the user limit? What is the cost to add users?

What is the cost of using the Google Realtime API for a company?
Is there a cost if the number of users goes up?
I think the Realtime API is free. However, what you need to be aware of is its size restrictions and best practices.
The Realtime API limits the maximum sizes of Realtime documents and individual changes. These limits are:
- Total Realtime document size: 10 MB (10*2^20 bytes)
- Maximum change size: 500 KB
- Avoid storing large binary objects, like images, within the data model. Instead, you can link to files stored separately in Google Drive or on an external server.
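Given those limits, it can be worth guarding change payloads before committing them. This is a minimal sketch: the limit constants mirror the quoted documentation, while the choice of UTF-8-encoded JSON as the size estimate is an assumption for illustration:

```python
import json

# Limits quoted from the Realtime API docs; the JSON/UTF-8 size
# estimate below is an assumption, not the API's exact accounting.
MAX_DOCUMENT_BYTES = 10 * 2**20   # 10 MB total document size
MAX_CHANGE_BYTES = 500 * 1024     # 500 KB per individual change

def change_fits(change_payload):
    """Return True if a JSON-serializable change is within the 500 KB limit."""
    size = len(json.dumps(change_payload).encode("utf-8"))
    return size <= MAX_CHANGE_BYTES

print(change_fits({"op": "insert", "text": "hello"}))  # True
```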

Google App engine - Search API 10GB per-index limit

In the app engine release notes 1.9.0 is stated:
"The size limit on the Search API is now computed and enforced on a per-index basis, rather than for the app as a whole. The per-index limit is now 10GB. There is no fixed limit on the number of indexes, or on the total amount of Search API storage an application may use."
The Search API is currently in an experimental state, but I would like to know whether the 10GB per-index limit will be removed once the Search API is out of experimental (or at least replaced with a much larger one).
As indicated in Search API quotas, the current quota is still 10 GB per index. There are no public plans to increase this quota at the time of this writing.
Should this quota increase be desired, feel free to file a new feature request on the App Engine public issue tracker detailing a thorough business case. You may also want to consider supporting this existing and related issue: Issue 10667: Search API multiple index search.

Reduce the size of Google app engine Datastore Stored Data

I'm using Google App Engine, and the size of the "Datastore Stored Data" is close to exceeding the free quota limit, so I want to reduce the size of the data in the Datastore by removing some entity elements.
I have tried deleting entity elements amounting to about 100 MB (about 10% of the 1 GB limit), but it still shows the earlier usage, still close to exceeding the free quota limit.
Please advise me on how to reduce the datastore size.
Thanks in advance.
Nalaka
To reduce the size in your case:
1) NDB can compress properties, so you can create an object for the non-indexed properties and store it compressed: https://developers.google.com/appengine/docs/python/ndb/properties?hl=nl
2) I do not know your models, but one option is to distribute your models across several app IDs and create a web service to fetch entities from the other app IDs.
3) If it is only one model, keep the indexed properties in your primary app ID and fetch the data from the secondary app ID.
But of course, everything has a price: performance, URL fetches, CPU... So it is easy to trade one bottleneck for another.
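To illustrate the kind of savings option 1 offers, here is a minimal sketch using zlib directly, which is essentially what NDB's BlobProperty(compressed=True) applies to the stored value; the sample entity data is invented, and repetitive text is chosen deliberately because it compresses well:

```python
import json
import zlib

# Simulate storing a non-indexed portion of an entity as a compressed
# blob, as NDB's BlobProperty(compressed=True) does. Sample data is
# invented; repetitive content is where compression pays off most.
entity_blob = json.dumps({
    "description": "lorem ipsum " * 200,
    "tags": ["news"] * 50,
}).encode("utf-8")

compressed = zlib.compress(entity_blob)
print(len(entity_blob), len(compressed))
```

Indexed properties cannot be packed away like this, since the datastore needs their raw values to build indexes, which is why the suggestion is limited to the non-indexed ones.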

Google App Engine Database : disk usage

I have a Google App Engine application storing 135 MB in its datastore, but when I check my quotas it tells me that I'm using 76% of my free 1 GB of stored data.
Is it because of the indexes? How can it use so much disk space?
Thanks
It could be due to indexes. Every property (with the exception of some types) has "single property" indexes unless you explicitly disable indexing for that property. Since the indexes store both the property name and the value, the impact on storage space can be quite significant. If you would like statistics on your index usage, star issue 2740.
If you are using a lot of tasks, your stored task bytes also count against your storage usage. Also note that Blobstore usage counts against your storage quota.
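A rough back-of-the-envelope sketch of why indexes can dwarf the raw data: the datastore has historically written two single-property index entries per indexed property (ascending and descending), and each entry repeats the entity key, the property name, and the value. All byte figures and the two-entries-per-property factor below are assumptions for illustration, not measured numbers:

```python
# Sketch (assumptions): estimate per-entity single-property index
# overhead. Each indexed property is assumed to produce two index
# entries (ascending + descending), each storing key + name + value.
# Every byte figure here is illustrative, not measured.

def estimated_index_bytes(key_bytes, properties):
    """properties: dict mapping property name -> serialized value size in bytes."""
    total = 0
    for name, value_bytes in properties.items():
        entry = key_bytes + len(name) + value_bytes
        total += 2 * entry  # ascending + descending entries (assumption)
    return total

# A small entity whose raw property data is ~68 bytes:
props = {"title": 40, "author": 20, "created": 8}
print(estimated_index_bytes(50, props))  # 472
```

Even under these made-up numbers, the index entries cost several times the raw property data, which matches the 135 MB-of-data versus 760 MB-of-quota gap in the question.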
