Table Partitions with Entity Framework Core

I have a large database that will use partitioned column-store tables in a redesign. Is it possible to specify the partition in the generated SQL with Entity Framework Core 2.2?
This is for an Azure SQL Hyperscale database with a table that currently contains about 3 billion rows. When stored procedures execute the requests, performance is excellent, but if the partition range is not specified in the query, performance is less than optimal. I am hoping to move away from the inline SQL we currently use in the application layer and move to Entity Framework Core. Being able to specify the partition for the tenant is our only blocker at the moment.
This is an example WHERE clause from the stored procedure:
SELECT @Range = $PARTITION.TenantRange(@InputTenantId)
SELECT ..... FROM xxx WHERE $PARTITION.TenantRange(TenantId) = @Range
The above query provides excellent performance, and I am hoping to make the same partition specification through Entity Framework.
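In EF Core 2.2, one way to get this predicate into the generated SQL is FromSql, which accepts an interpolated string, sends the placeholders as parameters, and still composes with LINQ. A minimal sketch, assuming a hypothetical Order entity, AppDbContext, and dbo.Orders table; only the $PARTITION.TenantRange function name comes from the question above:

    using System.Linq;
    using Microsoft.EntityFrameworkCore;

    public class Order
    {
        public int Id { get; set; }
        public int TenantId { get; set; }
    }

    public class AppDbContext : DbContext
    {
        public DbSet<Order> Orders { get; set; }
    }

    public static class PartitionedQueries
    {
        // Restricts the query to the tenant's partition by repeating the
        // stored procedure's $PARTITION trick in raw SQL. The interpolated
        // {tenantId} is sent as a SQL parameter, not concatenated text.
        public static IQueryable<Order> ForTenant(AppDbContext db, int tenantId)
        {
            return db.Orders
                .FromSql($@"SELECT * FROM dbo.Orders
                            WHERE $PARTITION.TenantRange(TenantId) = $PARTITION.TenantRange({tenantId})")
                .AsNoTracking();
        }
    }

Because the raw SQL is a plain SELECT, further Where/OrderBy/Select operators can still be appended and EF Core will wrap it in a subquery.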

Related

Can I use NoSQL databases like MongoDB for my scenario?

I have manufacturing company data in relational form: around 10 tables, each with more than 300 columns. We have a set of operations to perform sequentially (joins, unions, ranking) to reach a final aggregated table that will be used for analysis, but during the joins the result crosses more than 1,000 columns, which our relational database does not support. What is the best database to use for such scenarios?
Do such databases support all SQL operations?
We are currently using SQL Server.

Query performance for querying an XML column in SQL Server

We are thinking of providing some advanced querying capabilities in the application.
The application is in the retail domain and stores all the main entities (Invoice, Order, Product, etc.) in an XML column.
The database is already more than 10 GB in size.
Is querying against the XML column in SQL Server a good choice, and what are the alternatives, given that the core entity data is always stored as XML instead of in relational tables?
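For context, querying an XML column means using the xml data type's XQuery methods such as exist() and value(). A minimal sketch, assuming a hypothetical dbo.Invoices table with Id and InvoiceXml columns (none of these names come from the question):

    using System;
    using System.Data.SqlClient;

    class XmlColumnQuery
    {
        static void Main()
        {
            // Filters on an XML attribute with exist() and projects an
            // element into a relational column with value().
            const string sql = @"
                SELECT Id,
                       InvoiceXml.value('(/Invoice/Total)[1]', 'decimal(18,2)') AS Total
                FROM dbo.Invoices
                WHERE InvoiceXml.exist('/Invoice/Customer[@id = sql:variable(""@customerId"")]') = 1;";

            using (var conn = new SqlConnection("<your connection string>"))
            using (var cmd = new SqlCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@customerId", 42);
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetDecimal(1)}");
                }
            }
        }
    }

Whether this stays fast at that size usually comes down to XML indexes: a primary XML index on the column, plus secondary PATH or VALUE indexes matching the predicates you filter on.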

Add a user-defined module to the Database Engine to pre-process T-SQL queries

I am writing a module that translates one SQL query into another. When users send SQL queries to the database engine, the engine should first forward those queries to my module before processing the SQL syntax.
How can I integrate my module into the SQL Server Database Engine?
You can redirect queries for certain data to different tables using a partitioned view:
http://technet.microsoft.com/en-US/library/ms188299(v=SQL.105).aspx
In a nutshell, you give the server rules about which values reside in which tables (usually based on primary or foreign key ranges, for example). When you query using the partitioning column, the database can direct your query to the correct member table, but you can still run queries over all the tables as if they were held locally (just more slowly).
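As a rough illustration of the rule-setting described above, here is a sketch of a local partitioned view created from an EF Core migration. All object names are made up; the CHECK constraints on the partitioning column are what let the optimizer skip member tables:

    using Microsoft.EntityFrameworkCore.Migrations;

    public partial class AddOrdersPartitionedView : Migration
    {
        protected override void Up(MigrationBuilder migrationBuilder)
        {
            // Member tables with CHECK constraints on the partitioning column.
            migrationBuilder.Sql(@"
                CREATE TABLE dbo.Orders2023 (
                    OrderId   int NOT NULL,
                    OrderYear int NOT NULL CHECK (OrderYear = 2023),
                    CONSTRAINT PK_Orders2023 PRIMARY KEY (OrderId, OrderYear));
                CREATE TABLE dbo.Orders2024 (
                    OrderId   int NOT NULL,
                    OrderYear int NOT NULL CHECK (OrderYear = 2024),
                    CONSTRAINT PK_Orders2024 PRIMARY KEY (OrderId, OrderYear));");

            // CREATE VIEW must be the only statement in its batch, hence a
            // separate Sql() call (each call is executed as its own command).
            migrationBuilder.Sql(@"
                CREATE VIEW dbo.OrdersAll AS
                    SELECT OrderId, OrderYear FROM dbo.Orders2023
                    UNION ALL
                    SELECT OrderId, OrderYear FROM dbo.Orders2024;");
        }

        protected override void Down(MigrationBuilder migrationBuilder)
        {
            migrationBuilder.Sql("DROP VIEW dbo.OrdersAll;");
            migrationBuilder.Sql("DROP TABLE dbo.Orders2023; DROP TABLE dbo.Orders2024;");
        }
    }

With this in place, a query such as SELECT ... FROM dbo.OrdersAll WHERE OrderYear = 2024 touches only the 2024 member table.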

SQL Server partitioned table and Entity Framework

Can we get the benefits of partitioning a SQL Server table when we use Entity Framework as the data layer?
The table will receive 10,000 records per day, and it will be partitioned by creation date (e.g., older than 30 days vs. new).
I'm not very skilled in SQL Server, so perhaps I'm wrong, but I believe that table partitioning should be transparent to queries (if we are talking about a partition function defined on the table): common queries still work and can even perform better if partitioning is configured correctly. So in the case of database-first design, EF should not have any problem with this, because it still works with a single logical table. If you mean manual partitioning by creating a new table each month, then that is a big problem for EF, and you will need stored procedures to access those tables.
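To illustrate the transparency point, a sketch under assumed names (LogEntry, CreatedDate, and AppDbContext are not from the question): an ordinary LINQ filter on the partitioning column is all that is needed, with no partition-awareness in the EF code.

    using System;
    using System.Linq;
    using Microsoft.EntityFrameworkCore;

    public class LogEntry
    {
        public long Id { get; set; }
        public DateTime CreatedDate { get; set; }
    }

    public class AppDbContext : DbContext
    {
        public DbSet<LogEntry> LogEntries { get; set; }
    }

    public static class RecentEntries
    {
        // EF queries one logical table; because the generated WHERE clause
        // filters on the partitioning column, SQL Server can limit the scan
        // to the matching partitions on its own (partition elimination).
        public static IQueryable<LogEntry> Last30Days(AppDbContext db)
        {
            var cutoff = DateTime.UtcNow.AddDays(-30);
            return db.LogEntries.Where(e => e.CreatedDate >= cutoff);
        }
    }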

Are there any in-memory databases that support computed columns?

We have a SQL Server 2005/2008 database that has a table with a computed column. We're using the computed column as a discriminator in NHibernate, so having it in the database is proving to be very useful.
In order to gain the benefit of faster integration tests, I'd like to be able to run them against an in-memory database such as SQLite or SQL CE, but I don't think either of those supports computed columns.
Are there any other solutions to my problem? I have complete access to the database and can modify it if there's a better solution available. I've seen this post that suggests using a view instead of a computed column; is this the best alternative?
What I did was add the computed column to the DataTable when loading the table from SQL CE. I stored the definition of the computed DataColumn in a "configuration" table kept in the database. I was able to do complex calculations that depended on a chain of tables, where each table performed a simpler piece of a more complex function (the last table in the chain contained the results). I used SQL CE because one table of the five contained 15 million rows, which is too much data for in-memory ADO.NET DataSets. (I had a requirement to run local, client-based calculations before posting to the server.)
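The mechanism described is ADO.NET's DataColumn.Expression. A minimal sketch with made-up table and column names; in the answer's setup, the expression text would be loaded from the configuration table rather than hard-coded:

    using System;
    using System.Data;

    class ComputedColumnDemo
    {
        static void Main()
        {
            var table = new DataTable("Orders");
            table.Columns.Add("Quantity", typeof(int));
            table.Columns.Add("UnitPrice", typeof(decimal));

            // Client-side computed column; ADO.NET evaluates the expression
            // for every row, so the value never needs to exist in the store.
            table.Columns.Add(new DataColumn("LineTotal", typeof(decimal))
            {
                Expression = "Quantity * UnitPrice"
            });

            var row = table.NewRow();
            row["Quantity"] = 3;
            row["UnitPrice"] = 9.99m;
            table.Rows.Add(row);

            Console.WriteLine(table.Rows[0]["LineTotal"]); // 29.97
        }
    }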
