We are planning to use Azure Cosmos DB for storing a collection of documents (JSON objects). Does Cosmos DB have any feature for creating views, like MongoDB does?
I searched in many places but could not find details.
No, Cosmos DB does not have this feature; it looks like you will have to create another collection.
If you look at the supported commands, you can see that there is no command to create a view in Cosmos DB.
Based on the REST API and administration commands, the feature is not supported by the Cosmos DB Mongo API yet. (Please see a similar thread: Is it possible to use MongoDB Views with Azure CosmosDB?)
To my knowledge, Cosmos DB supports only a subset of the MongoDB API, with some different behaviours and results, but the onus is on Cosmos DB to improve its emulation of MongoDB.
In addition, you could add feedback here to submit your requirements, or you could consider using MongoDB Atlas on Azure if you would like full MongoDB feature support.
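For comparison, this is roughly what a view definition looks like against native MongoDB through pymongo; the same create/viewOn command is what the Cosmos DB Mongo API does not accept today. The connection string, database, collection and pipeline below are made up for illustration, and on Cosmos the workaround is to materialise the result into a second collection instead.

    # A minimal sketch, assuming a native MongoDB deployment and an existing
    # "orders" collection; all names here are illustrative.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    db = client["mydatabase"]

    # On native MongoDB this creates a read-only view backed by an aggregation
    # pipeline. The Cosmos DB Mongo API rejects this command, so on Cosmos you
    # would write the pipeline's output into a second collection instead.
    db.command({
        "create": "openOrdersView",
        "viewOn": "orders",
        "pipeline": [{"$match": {"status": "open"}}],
    })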
I am using Google Sheets to create a database that is connected to Google Data Studio, but the database is growing fast and will soon outgrow Sheets' limits.
I am looking for a cloud service that is simple to use like Sheets, where I can manually add data, do calculations (like formulas in Sheets) and also use Python to update the data there. I also need it to connect to Google Data Studio for visualisation.
I have been recommended Firestore, Cloud SQL, and BigQuery, but I still do not understand the differences between them. I am looking for something cheap where I can do the things I mentioned above.
P.S. I am new to SQL, so I would prefer a visual database (like Sheets).
Thank you all!
Sheets is not a database, but you can use it as one. Google Cloud offers other types of databases, such as:
Firestore, a document-oriented database, not really similar to a tabular sheet
BigQuery, a very powerful data warehouse and the most similar to Sheets in its design, checks and controls (a short Python sketch follows below)
Cloud SQL, which hosts relational database engines, similar to BigQuery but with, in addition, the capacity to create constraints (unique values, primary keys, foreign keys referencing values in another table)
However, none of them offers the ease of use of Sheets in terms of graphical interface. The engines are powerful, but they are developer-oriented rather than desktop-user-oriented.
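Of the three, BigQuery is probably the closest fit for "add rows, compute, and feed Google Data Studio", since Data Studio has a native BigQuery connector. A minimal sketch of updating and querying it from Python, assuming the google-cloud-bigquery and pandas packages and purely illustrative project, dataset and table names:

    # A minimal sketch; "my-project.my_dataset.sales" is a hypothetical table.
    import pandas as pd
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")   # uses your default credentials
    table_id = "my-project.my_dataset.sales"

    # Append a few manually prepared rows, much like typing them into a sheet.
    rows = pd.DataFrame({"day": ["2023-01-01"], "amount": [42.5]})
    client.load_table_from_dataframe(rows, table_id).result()

    # A calculation that would be a formula in Sheets becomes a SQL query here;
    # the same table can then be connected to Google Data Studio.
    query = f"SELECT day, SUM(amount) AS total FROM `{table_id}` GROUP BY day"
    for row in client.query(query).result():
        print(row.day, row.total)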
As web developers we hear about new technologies every day. Recently I came to know about Elasticsearch, which is used to analyse big volumes of data. My data is in MongoDB; is it possible to use Elasticsearch on it?
MongoDB Atlas has a feature called 'Atlas Search', which implements the Apache Lucene engine. This could be a solution for your search requirements.
See Atlas Search for details
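To give a feel for it, the sketch below runs an Atlas Search query through pymongo. It assumes a MongoDB Atlas cluster with an Atlas Search index named "default" on an "articles" collection; the connection string and field names are illustrative.

    # A minimal sketch; the $search stage only works on Atlas clusters that have
    # an Atlas Search index defined for the collection.
    from pymongo import MongoClient

    client = MongoClient("mongodb+srv://user:pass@cluster0.example.mongodb.net")
    articles = client["blog"]["articles"]

    pipeline = [
        {"$search": {"index": "default",
                     "text": {"query": "full text search", "path": "body"}}},
        {"$limit": 5},
        {"$project": {"title": 1, "score": {"$meta": "searchScore"}}},
    ]
    for doc in articles.aggregate(pipeline):
        print(doc)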
It depends on what you mean by "analyze big volumes of data"; what are your requirements? Don't pay too much attention to marketing slogans. Maybe you can connect Elasticsearch with MongoDB via an ODBC driver. Elasticsearch is a document-oriented NoSQL database, as MongoDB is. As usual, both have their pros and cons.
MongoDB is more like a database, i.e. it supports CRUD (Create, Read, Update, Delete) operations, and its Aggregation Framework is very powerful.
In Elasticsearch you can store data and analyze or query it. I remember that in earlier releases it was not so easy to delete or update existing single documents.
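If you do end up pushing MongoDB data into Elasticsearch for analysis, a minimal one-off copy can look like the sketch below, using the official Python clients for both; the hosts, database, collection and index names are illustrative, and for continuous syncing a change-stream or connector-based approach is usually preferable.

    # A minimal sketch, assuming both servers run locally and using the 8.x
    # elasticsearch client's keyword-argument style; all names are illustrative.
    from pymongo import MongoClient
    from elasticsearch import Elasticsearch, helpers

    mongo = MongoClient("mongodb://localhost:27017")
    es = Elasticsearch("http://localhost:9200")

    def actions():
        # Stream every order into the "orders" index, reusing the Mongo _id.
        for doc in mongo["shop"]["orders"].find():
            doc_id = str(doc.pop("_id"))
            yield {"_index": "orders", "_id": doc_id, "_source": doc}

    helpers.bulk(es, actions())

    # Once indexed, Elasticsearch aggregations can answer analytical questions.
    resp = es.search(index="orders", size=0,
                     aggs={"by_status": {"terms": {"field": "status.keyword"}}})
    print(resp["aggregations"]["by_status"]["buckets"])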
I'm using Zapier with Redshift to fetch data from custom queries and trigger a wide array of actions when new rows are detected from either a table or custom query, including sending emails through Gmail or Mailchimp, exporting data to Google Sheets, and more. Zapier's UI enables our non-technical product stakeholders to take over these workflows and customize them as needed. Zapier has several integrations built for Postgres, and since Redshift supports the Postgres protocol, these custom workflows can be easily built in Zapier.
I'm switching our data warehouse from Redshift to Snowflake, and the final obstacle is moving these Zapier integrations. Snowflake doesn't support the Postgres protocol, so it cannot be used as a drop-in replacement for these workflows. No other data source has all the information that we need for these workflows, so connecting to an upstream data source of Snowflake is not an option. I would appreciate guidance on alternatives I could pursue, including the following:
Moving these workflows into application code
Using a foreign data wrapper in Postgres for Snowflake to continue using the existing workflows from a dummy Postgres instance
Using custom-code blocks in Zapier instead of the Postgres integration
I'm not sure if Snowflake has an API that will allow you to do what you want, but you can create a private Zapier integration that has all the same features and permissions as a public integration while being customized for your team.
There's info about that process here: https://platform.zapier.com/
You might find it easier to use a vendor solution like Census to forward rows as events to Zapier. Their free plan is pretty sizeable for getting started. More info here: https://www.getcensus.com/integrations/zapier
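If neither a private integration nor a vendor tool fits, the first option in the question (moving the workflow into application code) can be sketched roughly as below: poll Snowflake for new rows and forward them to a Zapier "Catch Hook" webhook, which keeps the downstream Zap editable by non-technical stakeholders. The credentials, table, columns and hook URL are all hypothetical.

    # A minimal sketch using snowflake-connector-python and requests; all names,
    # credentials and the webhook URL below are hypothetical.
    import requests
    import snowflake.connector

    ZAPIER_HOOK_URL = "https://hooks.zapier.com/hooks/catch/123456/abcdef/"

    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
        warehouse="my_wh", database="my_db", schema="public",
    )

    def forward_new_rows(last_seen_id: int) -> int:
        """Send rows newer than last_seen_id to Zapier; return the new high-water mark."""
        cur = conn.cursor()
        cur.execute(
            "SELECT id, email, created_at FROM signups WHERE id > %s ORDER BY id",
            (last_seen_id,),
        )
        for row_id, email, created_at in cur.fetchall():
            # Each POST starts one run of the Zap behind the Catch Hook trigger.
            requests.post(ZAPIER_HOOK_URL,
                          json={"id": row_id, "email": email,
                                "created_at": str(created_at)},
                          timeout=10)
            last_seen_id = row_id
        return last_seen_id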
I have gone through many documents where indexes are created, but the API used is the SQL API for Azure Cosmos DB; there is no example for the Gremlin API. Any help would be appreciated. Thanks!
Azure Cosmos DB is a schema-agnostic database that allows you to iterate on your application without having to deal with schema or index management. By default, Azure Cosmos DB automatically indexes every property for all items in your container without having to define any schema or configure secondary indexes.
For more details, see Indexing policies in Azure Cosmos DB.
Azure Cosmos DB supports the Gremlin API, and you can use these indexes for your Gremlin queries.
Reference: Querying with indexes
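If you do want to narrow indexing down to the vertex properties you actually filter on, the policy has the same JSON shape for a Gremlin graph as for any other Cosmos DB container. The sketch below uses the azure-cosmos Python package with made-up account, database, graph and property names; for Gremlin accounts the same policy is often applied through the portal or the Azure CLI instead.

    # A minimal sketch of a custom indexing policy; every name here is illustrative.
    from azure.cosmos import CosmosClient, PartitionKey

    indexing_policy = {
        "indexingMode": "consistent",
        "automatic": True,
        # Index only the vertex properties used in filters, skip everything else.
        "includedPaths": [{"path": "/name/?"}, {"path": "/age/?"}],
        "excludedPaths": [{"path": "/*"}],
    }

    client = CosmosClient("https://myaccount.documents.azure.com:443/", credential="<key>")
    database = client.create_database_if_not_exists("graphdb")
    database.create_container_if_not_exists(
        id="people",
        partition_key=PartitionKey(path="/partitionKey"),
        indexing_policy=indexing_policy,
    )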
Hope this helps.
We have a requirement to search a SQL table that contains document data in an image/binary column type. We are trying to do this with Elasticsearch and Azure Search. We were able to proceed with Elasticsearch, but we hit a roadblock with Azure Search, as indexing is not possible for these data types through the indexer.
Can anybody help us? Is there any possibility of achieving this with Azure Search?
Please see my response to your question on MSDN.
In short, currently, in order to use Azure Search's built-in document extraction capabilities, the files need to be stored in Azure Blob Storage. Then you can use the blob indexer.
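A rough sketch of that route, with hypothetical SQL, storage and search-service names: export the binary column from SQL Server into a blob container, then register the container as a data source and run a blob indexer over it (this assumes the blob container and a target search index, here called "docs-index", already exist).

    # A rough sketch; connection strings, keys, table, container and index names
    # are all hypothetical, and the "docs" container is assumed to exist.
    import pyodbc
    from azure.storage.blob import BlobServiceClient
    from azure.core.credentials import AzureKeyCredential
    from azure.search.documents.indexes import SearchIndexerClient
    from azure.search.documents.indexes.models import (
        SearchIndexer, SearchIndexerDataContainer, SearchIndexerDataSourceConnection)

    # 1. Copy the image/binary column out of SQL and into a blob container.
    blobs = BlobServiceClient.from_connection_string("<storage-connection-string>")
    container = blobs.get_container_client("docs")
    sql = pyodbc.connect("DSN=mydb")
    for doc_id, content in sql.execute("SELECT id, file_content FROM documents"):
        container.upload_blob(name=f"{doc_id}.pdf", data=bytes(content), overwrite=True)

    # 2. Register the container as a data source and run a blob indexer over it,
    #    so Azure Search's document extraction can crack the files open.
    indexers = SearchIndexerClient("https://mysearch.search.windows.net",
                                   AzureKeyCredential("<admin-key>"))
    indexers.create_or_update_data_source_connection(SearchIndexerDataSourceConnection(
        name="docs-ds", type="azureblob",
        connection_string="<storage-connection-string>",
        container=SearchIndexerDataContainer(name="docs")))
    indexers.create_or_update_indexer(SearchIndexer(
        name="docs-indexer", data_source_name="docs-ds", target_index_name="docs-index"))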