HBM2DDL -- Create a database view instead of a Table?

All,
Is there some setting that tells hbm2ddl to run a view creation statement instead of creating a table when generating the database schema?
I'm creating my database schema with the wonderful hbm2ddl tool, but I have one issue. I need to flatten some of the tables into views to aid searching the database, since an HQL-only solution would be overly complex. I've created entity objects pointed at these views in order to fetch search results via Hibernate. This all works fine until hbm2ddl is used. In an empty database schema, hbm2ddl will create the schema based on the JPA annotations; unfortunately, it will also create my views as tables. Is there some setting that tells hbm2ddl to run a view creation statement instead of a CREATE TABLE? Failing that, is there a way to tell hbm2ddl to skip table creation for an entity (an exclude flag, or something)?
Thanks!

To my knowledge, and this is unfortunate, Hibernate doesn't support creating views instead of tables, nor does it support validating a schema that contains views. See issues like HHH-1872, HHH-2018 or HHH-1329.
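One workaround worth considering, rather than a real fix: when hibernate.hbm2ddl.auto is set to create or create-drop, Hibernate executes an import.sql file from the classpath root after exporting the schema, so you can drop the generated table and replace it with the view there. This is only a sketch; the view and table names below are placeholders, and older Hibernate versions read the file line by line, so keep each statement on a single line.

-- import.sql at the classpath root; hbm2ddl runs it after exporting the schema
-- when hibernate.hbm2ddl.auto is create or create-drop.
-- "division_search" is a placeholder name for the view-mapped entity's table.
DROP TABLE division_search;
CREATE VIEW division_search AS SELECT d.id, d.name, c.category_name FROM division d JOIN category c ON c.id = d.category_id;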

Related

How to create schema bound, cross-database view in SQL Server

I am attempting to create an indexed view on SQL Server 2008. I have a master database that I cannot make any changes to (in terms of adding tables, views, etc.). However, I need to create some different views, for various reasons, that need to work with live data.
I have created a new database alongside my master database so I can create views there. I am able to create views just fine, but I want to index some of the larger ones. However, when I try to create a schema-bound view cross-database, I receive the following error:
Cannot schema bind view 'dbo.Divisions' because name
'master.dbo.hbs_fsdv' is invalid for schema binding. Names must be in
two-part format and an object cannot reference itself.
Since I am going cross-database with the views, I have to reference the name in three-part format.
My creation statement for the view:
CREATE VIEW dbo.Divisions WITH SCHEMABINDING AS
SELECT master.dbo.hbs_fsdv.seq_ AS DivisionID,
master.dbo.hbs_fsdv.fs_division_desc_ AS Description
FROM master.dbo.hbs_fsdv
How can I create an indexed cross-database view in SQL Server?
Plain and simple. You can't. From the MSDN page:
The view must reference only base tables that are in the same database as the view.
https://msdn.microsoft.com/en-us/library/ms191432.aspx
Although (per the docs) it cannot be done directly with a simple SQL statement, this use case is very common and has a solution.
The architecture would have to involve caching the remote tables into your centralized database, and building the indexed view on top of them.
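A rough sketch of that shape, with the names taken from the question or invented for illustration: copy the remote rows into a table in your own database (refreshed by SSIS, a SQL Agent job, or triggers), then schema-bind and index a view that references only local, two-part names.

-- Local cache of the remote table, refreshed on a schedule
CREATE TABLE dbo.hbs_fsdv_cache (
    seq_ INT NOT NULL PRIMARY KEY,
    fs_division_desc_ NVARCHAR(200) NOT NULL
);

INSERT INTO dbo.hbs_fsdv_cache (seq_, fs_division_desc_)
SELECT seq_, fs_division_desc_
FROM master.dbo.hbs_fsdv;
GO
-- Schema binding now succeeds because the view only references local, two-part names
CREATE VIEW dbo.Divisions WITH SCHEMABINDING AS
SELECT seq_ AS DivisionID, fs_division_desc_ AS Description
FROM dbo.hbs_fsdv_cache;
GO
CREATE UNIQUE CLUSTERED INDEX IX_Divisions_DivisionID ON dbo.Divisions (DivisionID);

The trade-off is that the indexed view is only as fresh as the cache refresh, so this fits read-heavy search scenarios better than anything needing live remote data.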
Some good notes on this can be found here:
What is the best way to cache a table from a (SQL) linked server view?
and
https://thomaslarock.com/2013/05/top-3-performance-killers-for-linked-server-queries/

SqlServer: How to get meta-data about tables and their relationships?

I was wondering if there was a way (relatively simple, I hope) to get information about a table and its attributes and relationships?
Clarification: I want to grab all tables in the database and get the meta-model for the whole database: tables, column data, indexes, unique constraints, relationships between tables, etc.
The system has a data dictionary in sys.tables, sys.columns, sys.indexes and various other tables. You can query these tables to get metadata about the database structure. This posting has a script I wrote a few years ago to reverse engineer a database schema. If you take a look at it you can see some examples of how to use the system data dictionary tables.
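For example, here is a rough sketch of the kind of queries you can run against those catalog views to list every table's columns and the foreign-key relationships between tables:

-- Tables and their columns, with data types
SELECT s.name AS schema_name, t.name AS table_name, c.name AS column_name, ty.name AS data_type
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id
JOIN sys.columns c ON c.object_id = t.object_id
JOIN sys.types ty ON ty.user_type_id = c.user_type_id
ORDER BY s.name, t.name, c.column_id;

-- Foreign-key relationships between tables
SELECT fk.name AS fk_name,
       OBJECT_NAME(fk.parent_object_id) AS referencing_table,
       OBJECT_NAME(fk.referenced_object_id) AS referenced_table
FROM sys.foreign_keys fk;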
There are a whole bunch of system views in the INFORMATION_SCHEMA schema in SQL Server 2005+. Is there anything in particular you're wanting?
Some of those views include:
CHECK_CONSTRAINTS,
COLUMNS,
TABLES,
VIEWS
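For instance, a quick sketch of querying them (the table name is a placeholder):

-- List every base table and view in the database
SELECT TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE FROM INFORMATION_SCHEMA.TABLES;

-- Columns of one specific table
SELECT COLUMN_NAME, DATA_TYPE, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'YourTable';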
Try sp_help <tablename>. This will show you foreign key references and data about the columns, etc. - that is, if you are interested in a specific table, as your question seemed to indicate.
If using .NET code is an option, SMO is the best way to do it.
It abstracts away all these system views and tables hiding them behind nice and easy to use classes and collections.
http://msdn.microsoft.com/en-us/library/ms162169.aspx
This is the same infrastructure SQL Server Management Studio uses itself. It even supports scripting.
Abstraction comes at a cost, though, so if you need maximum performance you'd still have to use system views and custom SQL.
You can try to use this library: db-meta

How to put Database Tables in MVC Models

I'm new to developing in MVC and I have some doubts about how to build the models. I don't know whether I should build them in terms of database tables or in some other way.
For example, I have these 5 tables:
Domain
Category_Domains
Countries_Domains
Categories
Countries
How can I build models to perform actions on these database tables? Should I write the SQL commands to insert, delete and update for each one of these tables, or should I do something else?
Sorry if I haven't explained this well.
The best advice I can give you is to look into the Entity Framework. Using this framework, you can create a data connection to your SQL database and the framework will create your model based on your table structure, including relationships.
Doing this will ensure your database is modelled correctly and kept in sync at all times.
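If you go the database-first route, the relationships Entity Framework infers come from your foreign keys, so make sure the junction tables declare them. A hedged sketch of what the question's tables might look like (all column names are assumptions):

CREATE TABLE dbo.Domain (DomainID INT IDENTITY PRIMARY KEY, Name NVARCHAR(255) NOT NULL);
CREATE TABLE dbo.Categories (CategoryID INT IDENTITY PRIMARY KEY, Name NVARCHAR(255) NOT NULL);
CREATE TABLE dbo.Countries (CountryID INT IDENTITY PRIMARY KEY, Name NVARCHAR(255) NOT NULL);

-- Junction tables: the foreign keys are what let Entity Framework infer the relationships
CREATE TABLE dbo.Category_Domains (
    CategoryID INT NOT NULL REFERENCES dbo.Categories (CategoryID),
    DomainID   INT NOT NULL REFERENCES dbo.Domain (DomainID),
    PRIMARY KEY (CategoryID, DomainID)
);
CREATE TABLE dbo.Countries_Domains (
    CountryID INT NOT NULL REFERENCES dbo.Countries (CountryID),
    DomainID  INT NOT NULL REFERENCES dbo.Domain (DomainID),
    PRIMARY KEY (CountryID, DomainID)
);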

Grouping ETL Staging Tables With User Schemas?

I was thinking of putting staging tables, and the stored procedures that update those tables, into their own schema, such that when importing data from SomeTable to the data warehouse, I would run an Initial.StageSomeTable procedure which would insert the data into the Initial.SomeTable table. This way all the procs and tables dealing with the initial staging are grouped together. Then I'd have a Validation schema for that stage of the ETL, etc.
This seems cleaner than trying to uniquely name all these very similar tables, since each table will have multiple instances of itself throughout the staging process.
Question: Is using a user schema to group tables/procs/views together an appropriate use of user schemas in MS SQL Server? Or are user schemas supposed to be used for security, such as grouping permissions together for objects?
This is actually a recommended practice. Take a look at the Microsoft Business Intelligence ETL Design Practices from Project REAL. You will find (download the doc from the first link) that they use quite a few schemata to group and identify objects in the warehouse.
In addition to dbo and etl, they also use admin, audit, part, olap and a few more.
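A minimal sketch of what that grouping looks like in T-SQL, using the names from the question (the column list and the source name are placeholders):

CREATE SCHEMA Initial;
GO
CREATE SCHEMA Validation;
GO
-- Same logical table, one copy per ETL stage, no name mangling needed
CREATE TABLE Initial.SomeTable (Id INT NOT NULL, Payload NVARCHAR(MAX) NULL);
CREATE TABLE Validation.SomeTable (Id INT NOT NULL, Payload NVARCHAR(MAX) NULL);
GO
CREATE PROCEDURE Initial.StageSomeTable AS
BEGIN
    -- The three-part source name is a placeholder for wherever the data comes from
    INSERT INTO Initial.SomeTable (Id, Payload)
    SELECT Id, Payload FROM SourceSystem.dbo.SomeTable;
END;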
I think it's appropriate enough; it doesn't really matter. You could use another database if you liked, which is actually what we do.
I'm not sure why you would want a validation schema though, what are you going to do there?
Both the reasons you list (purpose/intent, security) are valid reasons to use schemas. Once you start using them, you should always specify the schema when referencing an object (although I'm lazy and never specify dbo).
One trick we use is to have the same-named table in each of several schemas, combined with table partitioning (available in SQL 2005 and up). Load the data into the first schema, then when it's validated "swap" the partition into dbo, after first swapping the dbo partition into a "dumpster" schema copy of the table. Net production downtime is measured in seconds, and it's all carefully wrapped in a declared transaction.
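A hedged sketch of that swap pattern with ALTER TABLE ... SWITCH; the schema and table names are invented, and the real thing requires identical table and index definitions, aligned partition schemes, and an empty target partition:

BEGIN TRAN;
-- Move the current production partition out of the way (dumpster.Sales must have an empty partition 1)
ALTER TABLE dbo.Sales SWITCH PARTITION 1 TO dumpster.Sales PARTITION 1;
-- Swap the freshly loaded, validated data into production
ALTER TABLE staging.Sales SWITCH PARTITION 1 TO dbo.Sales PARTITION 1;
COMMIT TRAN;

Because SWITCH is a metadata-only operation, the downtime really is just the moment the transaction holds its locks.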

How to partially migrate a database to a new system over time?

We are in the process of a multi-year project where we're building a new system and a new database to eventually replace the old system and database. The users are using the new and old systems as we're changing them.
The problem we keep running into is when an object in one system is dependent on an object in the other system. We've been using views, but have run into a limitation with one of the technologies (Entity Framework) and are considering other options.
The other option we're looking at right now is replication. My boss isn't excited about the extra maintenance that this would cause. So, what other options are there for getting dependent data into the database that needs it?
Update:
The technologies we're using are SQL Server 2008 and Entity Framework. Both databases are within the same SQL Server instance, so linked servers shouldn't be necessary.
The limitation we're facing with Entity Framework is that we can't seem to create the relationships between the table-based entities and the view-based entities. No relationship can exist in the database between a view and a table, as far as I know, so the edmx diagram can't infer it, and I cannot seem to create the relationship manually without getting errors. It thinks all columns in the view are keys.
If I leave it that way I get an error like this for each column in the view:
Association End key property [...] is not mapped.
If I try to change the "Entity Key" property to false on the columns that are not the key I get this error:
All the key properties of the EntitySet [...] must be mapped to all the key properties [...] of table viewName.
According to this forum post, it sounds like a limitation of the Entity Framework.
Update #2
I should also mention the main limitation of the Entity Framework is that it only supports one database at a time. So we need the old data to appear to be in the new database for the Entity Framework to see it. We only need read access of the old system data in the new system.
You can use linked server queries to leave the data where it is, but connect to it from the other db.
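For example, once the linked server is set up, the remote table is addressed with a four-part name; the server and database names here are placeholders:

-- Query the old database through a linked server using a four-part name
SELECT c.CustomerID, c.Name
FROM OldServer.OldDatabase.dbo.Customers AS c;

-- OPENQUERY pushes the whole query to the remote server, which can help when filtering large remote tables
SELECT *
FROM OPENQUERY(OldServer, 'SELECT CustomerID, Name FROM OldDatabase.dbo.Customers WHERE IsActive = 1');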
Depending on how up-to-date the data in each db needs to be, and whether one data source can remain read-only, you can:
Use the Database Copy Wizard to create an SSIS package that you can run periodically as a SQL Agent task
Use snapshot replication
Create a custom BCP in/out process to get the data to the other db
Use transactional replication, which can be near real-time
If data needs to be read-write in both databases then you can use:
transactional replication with updatable subscriptions
merge replication
As you go down the list, the amount of work involved in maintaining the solution increases. Using linked server queries will work best if it's the right fit for what you're trying to achieve.
EDIT: If they're on the same server then, as suggested by another user, you should be able to access the table with servername.databasename.schema.tablename. It looks like it's an Entity Framework issue and not a db issue.
I don't know about EntityToSql but I know in LinqToSql you can connect to multiple databases/servers in one .dbml if you prefix the tables with:
ServerName.DatabaseName.SchemaName.TableName
MyServer.MyOldDatabase.dbo.Customers
I have been able to click on a table in the .dbml, copy and paste it into the .dbml of the alternate project, prefix the name, and set up the relationships, and it works... like I said, this was in LinqToSql, though I have not tried it with EntityToSql. I would give it a shot before you go through all the work of replication and such.
If Linq-to-Entities cannot cross DBs, then replication or something that emulates it is the only thing that will work.
For performance purposes you probably want either Merge replication or Transactional with queued (not immediate) updating.
Thanks for the responses. We're going to try adding triggers to the old database tables to insert/update/delete records in the new tables of the new database. This way we can continue to use Entity Framework and also do any data transformations we need.
Once the UI functions move over to the new system for a particular feature, we'll remove the table from the old database and add a view to the old database with the same name that points to the new database table for backwards compatibility.
One thing that I realized needs to happen before we can do this is we have to search all our code and SQL for @@IDENTITY and replace it with SCOPE_IDENTITY() so the triggers don't mess up the IDs in the old system.
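A rough sketch of what one of those synchronizing triggers might look like; the table and column names are invented for illustration, and the real version would also need UPDATE and DELETE counterparts:

-- On the old database: mirror inserts into the new database's copy of the table
CREATE TRIGGER dbo.trg_Customers_Insert ON dbo.Customers
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO NewDatabase.dbo.Customers (CustomerID, Name, CreatedDate)
    SELECT CustomerID, Name, GETDATE()
    FROM inserted;  -- the trigger's pseudo-table of newly inserted rows
END;

This is also why the @@IDENTITY cleanup matters: @@IDENTITY returns the last identity value generated anywhere in the session, including inside a trigger, while SCOPE_IDENTITY() is limited to the current scope, so only the latter stays correct once the trigger starts inserting into the new table.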
