I'm new to SQL Server and I wonder:
I have built some tables that store the documents produced by my program.
I need to write a stored procedure that inserts a document into those tables within a transaction.
I'm thinking of creating a main procedure with a transaction that takes two user-defined table types:
main and details
My question is:
is it valid to create many user-defined table types? For example:
if there are 10 tables in my DB, do I need to create 10 user-defined table types?
How can I use polymorphism in the procedure parameters, so that I can write one procedure that handles related types? For example:
if I have two tables, Person and Teacher, where every teacher is a person, can my parameter always be of the Person type while still allowing a Teacher to be passed as well?
I also looked at the XML type, but it is slower to use and also more difficult.
I wonder if SQL Server has other solutions for writing to multiple tables in one transaction that would require fewer parameters?
Thanks in advance, and I hope you can help me.
It all depends on your front-end. If you have two different UIs that capture the header and details separately, you would need two separate stored procs; if not, one stored proc would suffice.
A user-defined table type can decrease the number of parameters to a stored proc. I agree that XML is complex; a user-defined table type is much easier to use.
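For example, here is a minimal sketch of the table-type approach, with a transaction and error handling. The type, table, and column names are hypothetical; substitute your own schema:

    -- Hypothetical table types for the document header and its detail rows
    CREATE TYPE dbo.DocumentHeaderType AS TABLE
    (
        DocumentId INT NOT NULL,
        Title      NVARCHAR(200) NOT NULL
    );

    CREATE TYPE dbo.DocumentDetailType AS TABLE
    (
        DocumentId INT NOT NULL,
        LineNumber INT NOT NULL,
        Amount     DECIMAL(18, 2) NOT NULL
    );
    GO

    -- Table-valued parameters must be declared READONLY
    CREATE PROCEDURE dbo.InsertDocument
        @Header  dbo.DocumentHeaderType READONLY,
        @Details dbo.DocumentDetailType READONLY
    AS
    BEGIN
        SET NOCOUNT ON;
        BEGIN TRY
            BEGIN TRANSACTION;

            INSERT INTO dbo.DocumentHeader (DocumentId, Title)
            SELECT DocumentId, Title FROM @Header;

            INSERT INTO dbo.DocumentDetail (DocumentId, LineNumber, Amount)
            SELECT DocumentId, LineNumber, Amount FROM @Details;

            COMMIT TRANSACTION;
        END TRY
        BEGIN CATCH
            IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
            THROW;  -- re-raise the original error to the caller (SQL Server 2012+)
        END CATCH
    END;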
Related
I have an SSRS report with 20 different datasets, with some calculated columns in each.
I want to take a few fields from all the datasets, including some calculated columns, and insert them into a SQL table.
I want to do this for each month so that I can see the trends over a period. Is there any way to do that without editing the datasets?
Can I refer to the fields I need by referring to Textbox4 and insert them into a SQL table? What is the easiest way to do this without touching the datasets?
There is most likely a far better solution than using SSRS to update a SQL database. I am not proposing this as the best solution, but rather as a way to achieve what was asked.
You could create a dataset that runs a stored procedure to which you pass the data as parameters. The sproc would do the insert into your chosen table, and you can pass in the parameters from your original dataset however you see fit. You could even set up a second report with the stored-procedure dataset that you can call on demand, by having an action event on an item call that report. (I had a subreport embedded in a column of a tablix, configured so it would only update with values from that row, for instance.)
To clarify:
Create a subreport that accepts the data you want to insert as a parameter for each column
Instead of adding a normal Dataset, have it call a stored procedure that inserts as you require
Add the subreport to your main report so it is called once the main report has run, and configure the required parameters to be passed through.
There will be better, more efficient, cleaner ways to do this, but I found the above worked for my purposes, since I was limited by time and resources. I would still recommend seeking other solutions if possible.
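As a rough sketch, the stored procedure behind the subreport's dataset could look something like this; the table, procedure, and parameter names here are made up for illustration:

    -- Hypothetical procedure called by the subreport's dataset; each report
    -- parameter maps to one column of the trend table.
    CREATE PROCEDURE dbo.InsertReportSnapshot
        @ReportMonth   DATE,
        @TotalSales    DECIMAL(18, 2),
        @CalculatedKpi DECIMAL(18, 4)
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO dbo.ReportSnapshot (ReportMonth, TotalSales, CalculatedKpi)
        VALUES (@ReportMonth, @TotalSales, @CalculatedKpi);

        -- SSRS expects a dataset to return rows, so return something trivial
        SELECT 1 AS Inserted;
    END;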
My team is creating a high-volume data processing tool. The idea is to take a 30,000-line batch file, bulk load it into a table, and then process the records using parallel processing.
The part I'm stuck on is creating the dynamic tables. For each batch file we receive, we want to create a new physical table with a unique table name. The tables will be purged from our system by a separate process after they are completed.
I have the base structure for the table, and I intend to create unique table names using a combination of a date/time stamp and a GUID (with the dashes converted to underscore characters).
I could do this easily enough in a stored procedure but I'm wondering if there is a better way.
Here is what I have considered...
Templates in SQL Server Management Studio. This is a GUI tool built into Management Studio (Ctrl+Alt+T) that lets you define different SQL objects, including a table, and specify parameters. This seems like it would work; however, it appears to be a GUI-only tool and not something I could call from a stored procedure.
Stored procedure. I could put everything into a stored procedure, build the table name and schema into an nvarchar(max) string, and use sp_executesql to create the table. This might be the way to accomplish my goal, but I wonder if there is a better way.
Stored procedure with an existing table as a template. I could define a base table and then query sys.columns and sys.types to build a string representing the new table. This would allow me to add columns to the base table without having to update my stored procedure. I'm not sure if this is a better approach.
I'm wondering if any Stack Overflow folks have solved a similar requirement. What are your recommendations?
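For reference, a minimal sketch of option 2; the column list is a placeholder, and a real version would contain the full base structure:

    -- Build a unique table name from a timestamp and a GUID, then create it
    DECLARE @TableName SYSNAME =
        'Batch_' + FORMAT(SYSUTCDATETIME(), 'yyyyMMdd_HHmmss') + '_' +
        REPLACE(CAST(NEWID() AS NVARCHAR(36)), '-', '_');

    DECLARE @Sql NVARCHAR(MAX) =
        N'CREATE TABLE dbo.' + QUOTENAME(@TableName) + N'
          (
              RecordId INT IDENTITY(1, 1) PRIMARY KEY,
              RawLine  NVARCHAR(4000) NOT NULL
          );';

    EXEC sys.sp_executesql @Sql;

    SELECT @TableName AS CreatedTable;  -- hand the generated name back to the caller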
The setup
I have the following database setup:
CentralDB
    Table: Stores
    Table: Users
Store1DB
    Table: Orders
Store2DB
    Table: Orders
Store3DB
    Table: Orders
Store4DB
    Table: Orders
... etc
CentralDB contains the users, the logging, and a Stores table with the name of each store database and general information about each store, such as address, name, description, image, etc.
All the StoreDBs use the same structure, just different data.
It is important to know that the list of stores will shrink and grow in the future.
The main client communicating with this setup is a REST API service, which gets passed a STOREID in the header of each request telling it which database to connect to. This works flawlessly so far.
The reasoning
Whenever we need to do database maintenance on one store, we don't want all other stores to be down.
Backup management should be per store
Not having to write the WHERE storeID=x every time and for every table
Performance: each store could run on its own database server if the need arises
The goal
I need my REST API service to somehow get all orders from all stores in one query.
Will you help me figure out a way to do this without hardcoding all the store DB names? I was thinking about a stored procedure on the CentralDB, but I was hoping there would be other solutions. In any case, it has to be very efficient.
One option would be to have a list of databases stored in a "system" table in CentralDB.
Then you could create a stored procedure that reads the database names from that table, loops through them with a cursor, and generates dynamic SQL that UNIONs the results from all the databases. This way you get a single recordset of results.
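A minimal sketch of that idea, assuming the Stores table has a DatabaseName column:

    -- Build one UNION ALL query across every store database listed in CentralDB
    DECLARE @DbName SYSNAME, @Sql NVARCHAR(MAX) = N'';

    DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
        SELECT DatabaseName FROM CentralDB.dbo.Stores;

    OPEN db_cursor;
    FETCH NEXT FROM db_cursor INTO @DbName;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @Sql += CASE WHEN @Sql = N'' THEN N'' ELSE N' UNION ALL ' END
                  + N'SELECT o.*, ' + QUOTENAME(@DbName, '''') + N' AS StoreDb'
                  + N' FROM ' + QUOTENAME(@DbName) + N'.dbo.Orders AS o';
        FETCH NEXT FROM db_cursor INTO @DbName;
    END
    CLOSE db_cursor;
    DEALLOCATE db_cursor;

    EXEC sys.sp_executesql @Sql;  -- returns one combined recordset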
However, this database design is IMHO flawed. There is no reason to use multiple databases to store data that belongs to the same domain. All the reasons you have mentioned can be addressed with a single, properly designed database. Having multiple databases will create multiple problems in the long term:
you will need to change the structure of all the DBs whenever you modify your database model
you will need to create/drop databases when stores are added to/removed from your system
you will need to duplicate items and other entities that are common to all the stores in every DB
what about reporting requirements (e.g. getting sales data for stores 1 and 2 together, etc.)? This will require building complex UNION queries...
etc...
In the long term, managing and maintaining this model will be a big pain.
I'd maintain a set of views that UNION ALL the data across the stores. Every time a store is added or deleted, those views must be updated; this can be automated.
The views provide an illusion to the application that there is only one database.
What I would not do is have each SQL query or procedure read all the database names and build dynamic SQL itself. That would entail lots of code duplication and an unnecessary loss of performance, and the approach is error-prone. Better to generate the code once, in a central place, and have all other SQL code reference that generated code.
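For example, the generated view could look like this (assuming SQL Server 2016 SP1 or later for CREATE OR ALTER; on older versions, drop and recreate the view):

    -- Regenerated centrally whenever a store database is added or removed
    CREATE OR ALTER VIEW dbo.AllOrders
    AS
    SELECT o.*, 'Store1DB' AS StoreDb FROM Store1DB.dbo.Orders AS o
    UNION ALL
    SELECT o.*, 'Store2DB' AS StoreDb FROM Store2DB.dbo.Orders AS o
    UNION ALL
    SELECT o.*, 'Store3DB' AS StoreDb FROM Store3DB.dbo.Orders AS o
    UNION ALL
    SELECT o.*, 'Store4DB' AS StoreDb FROM Store4DB.dbo.Orders AS o;

The REST API then simply queries dbo.AllOrders on CentralDB, optionally filtering by StoreDb.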
I have a table with about 30 fields. I currently have several stored procedures which access either an (aggregated) view of this table or the table itself. For many of these SPs I would like to ensure that the returned records all have the same fields with the same column names. Is there a way to do this where I don't have to change 20 stored procs if I do need to change the output?
My way around it thus far has been to provide clients with lists of IDs, which they then pass to SPs that return the data; however, this seems slow compared with getting the data in one shot. I have also considered calling the formatting stored procs with a cursor from inside the search stored procs, but I was unsure whether that would really buy me a whole lot.
The typical way to define a standardised and consistent data access method across multiple stored procedures in SQL Server is to use views.
Now, your problem description seems to suggest that you are already using views to manage your data access. If you are indeed unable to use views for a specific reason, perhaps you can clarify the nature of your problem further for us.
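As a minimal sketch of the idea (the names here are hypothetical): define the output shape once in a view, and have every procedure select from the view rather than the base table, so a column change happens in one place:

    -- The single place that defines the output column list
    CREATE VIEW dbo.CustomerReport
    AS
    SELECT CustomerId, FullName, Region, TotalSpend
    FROM dbo.Customers;
    GO

    -- Each of the 20 procedures selects from the view, never the base table
    CREATE PROCEDURE dbo.GetCustomersByRegion
        @Region NVARCHAR(50)
    AS
    BEGIN
        SET NOCOUNT ON;
        -- SELECT * here is deliberate: the column list lives only in the view,
        -- so altering the view changes every proc's output after a recompile
        SELECT *
        FROM dbo.CustomerReport
        WHERE Region = @Region;
    END;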
How many types of parameters are there in a stored procedure, and what are they?
Thanks in advance.
And can we delete a table using a view? I think yes, but in what situation can't we delete it if there is no trigger associated with that table? I mean to say: if I need to delete a table which has no trigger associated with it, using a view, in which case can't I delete it?
You have basically three types of parameters for stored procedures:
Input
Output
InputOutput
Is that what you're looking for??
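A quick sketch of how the first two look in T-SQL (output parameters are declared with the OUTPUT keyword, and since they can also carry a value in, they double as the input/output case):

    CREATE PROCEDURE dbo.AddNumbers
        @A   INT,          -- input parameter
        @B   INT,          -- input parameter
        @Sum INT OUTPUT    -- output parameter (can also receive an initial value)
    AS
    BEGIN
        SET @Sum = @A + @B;
    END;
    GO

    -- The caller must repeat OUTPUT on the argument
    DECLARE @Result INT;
    EXEC dbo.AddNumbers @A = 2, @B = 3, @Sum = @Result OUTPUT;
    SELECT @Result AS SumResult;  -- 5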
Also, I don't totally understand what you're asking with your second question. You want to "delete a table"? You don't delete tables - you DROP tables. And you can't use a view to drop a table... or do you mean: can you delete rows from a table through a view?
SQL Server views can indeed be used to modify data - to a certain degree, and by obeying a certain set of rules. Read more about that on MSDN under Modifying Data Through a View.
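For example, deleting rows through a simple single-table view works like this (the names are made up; for a DELETE, the view must reference exactly one base table, with no aggregates or DISTINCT):

    -- A simple updatable view over one base table
    CREATE VIEW dbo.ActiveEmployees
    AS
    SELECT EmployeeId, FullName, IsActive
    FROM dbo.Employees
    WHERE IsActive = 1;
    GO

    -- This deletes the underlying rows in dbo.Employees - it does not drop the table
    DELETE FROM dbo.ActiveEmployees
    WHERE EmployeeId = 42;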