Could someone explain to me why there is such a difference between filling and updating a DataSet?
I am currently working on a program that takes data from one DB and puts it into a second DB. The original DB contains quite a lot of data. I wonder why the program fills the DataSet so quickly but takes so long to update.
The code schema looks like this:
Create Connections;
OpenConnections();
Create DataAdapter1, DataAdapter2;
Create CommandBuilder1, CommandBuilder2;
Create DataSet;
DataAdapter1.Fill(DataSet, Connection1);
DataAdapter2.Update(DataSet, Connection2);
CloseConnections();
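For reference, the flow above in concrete VB.NET might look roughly like the sketch below. This is only a sketch, assuming both databases are SQL Server; the connection strings and the table name SourceTable are placeholders, and only one CommandBuilder (for the destination adapter) is actually needed.

Imports System.Data
Imports System.Data.SqlClient

Module CopyTableSketch
    Sub CopyTable()
        ' Sketch only: connection strings and the table name are placeholders.
        Dim sourceConnection As New SqlConnection("source connection string here")
        Dim targetConnection As New SqlConnection("target connection string here")

        Dim sourceAdapter As New SqlDataAdapter("SELECT * FROM SourceTable", sourceConnection)
        Dim targetAdapter As New SqlDataAdapter("SELECT * FROM SourceTable", targetConnection)
        Dim builder As New SqlCommandBuilder(targetAdapter)   ' generates the INSERT command for Update()

        Dim ds As New DataSet()
        sourceAdapter.Fill(ds, "SourceTable")                 ' one streaming SELECT: fast

        ' Mark every row as newly added so Update() issues INSERTs against the target.
        For Each row As DataRow In ds.Tables("SourceTable").Rows
            row.SetAdded()
        Next

        targetAdapter.Update(ds, "SourceTable")               ' one INSERT round-trip per row: slow
    End Sub
End Module

By default the adapter sends a separate INSERT for each row, which is the main reason Update takes so much longer than the single SELECT behind Fill; raising UpdateBatchSize on the destination adapter, or switching to SqlBulkCopy, cuts down the round-trips considerably.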
I would like to copy all data from one DB table to another DB.
I have created two connections but am not sure what I need to do regarding the actual SQL to make this happen.
$one = DB::connection('mysql');
$two = DB::connection('mysql_2');
DB::statement("CREATE TABLE {$one}.products LIKE {$two}.products;");
Not completely unexpectedly, the response to this is
Object of class Illuminate\Database\MySqlConnection could not be converted to string
Is this even possible between two DBs?
Besides inserting each row into the new table, what are some other options for copying a large set of data from one DB to another?
I have been developing programs in VB.NET for a few years and am familiar with it. The area where I do not have a lot of exposure is databases.
I am writing a program (for my personal use) called Movie Manager. It will store information on movies I have. I have chosen a SQL Server Compact Edition database. Assume I have a database with two tables, Info and Cast. The Info table has a few columns such as movie_name, release_date and so on. The Cast table has a few columns such as first_name, last_name, etc.
Right now I have created a DataSet which reads all the table data from the database (opens a connection, fills the tables, closes the connection). This way I have a snapshot of the database in a global variable.
My questions:
Once I have the data with me, every time I need to add, edit or delete a record I have to open a connection, fire an SQL statement and close the connection. Right? Is there a way to do this without using SQL? Also, is this approach okay?
Since I am not using structures, I need to create empty DataSets to store temporary information. Is this convenient?
If I have to search for a specific item in a DataSet table, do I have to loop through all items, or can I use SQL on the DataSet, or is there an alternative?
1) Once I have the data with me, every time I need to add, edit or delete a record I have to open a connection, fire an SQL statement and close the connection. Right? Is there a way to do this without using SQL? Also, is this approach okay?
No. To update a database, you have to use the database. Create a stored procedure in the database to handle your functionality, then call it from the code and pass in whatever data needs to be saved. DO NOT USE INLINE SQL. Parameterized stored procedures are the way to go.
2) Since I am not using structures, I need to create empty DataSets to store temporary information. Is this convenient?
It depends on what you're doing. I would create an object model to retain my updated data and then I'd pass the properties into the stored procedure when it was time to save my changes.
3) If I have to search for a specific item in a DataSet table, do I have to loop through all items, or can I use SQL on the DataSet, or is there an alternative?
You can loop through the rows, or you can use LINQ to pull out what you need. LINQ is really nice, as it is basically .NET-coded queries against a collection.
There are plenty of tutorials/guides out there that show you how to update via a stored procedure call from your code. There are a ton of LINQ tutorials as well. Basically, a LINQ query against your table will look something like:
Dim result As List(Of DataRow) =
    (From r In table.AsEnumerable()
     Where r.Field(Of String)("columnName") = "the value"
     Select r).ToList()
This syntax is probably a bit off, but it looks something like that.
Edit
Your Model:
Public Class EmployeeModel
    Public Property Id As Integer
    Public Property FirstName As String
    Public Property LastName As String
    Public Property JobCode As String

    Public Sub New(ByVal id As Integer, ByVal firstName As String,
                   ByVal lastName As String, ByVal jobCode As String)
        ' set the properties
        Me.Id = id
        Me.FirstName = firstName
        Me.LastName = lastName
        Me.JobCode = jobCode
    End Sub
End Class
Your DAL:
Public Class EmployeeDAL
    Public Shared Sub SaveEmployee(ByRef model As EmployeeModel)
        ' call your sp_SaveEmployee stored procedure and set its parameters
        ' from the model properties, e.g.
        ' @Id = model.Id
        ' @JobCode = model.JobCode
        ' ...
    End Sub
End Class
I use VB every few months, so there are probably some small syntax errors in there. But that's basically all you need to do. The function to save your data is in the DAL, not in the class itself. If you don't want to use a DAL, you can put the save functionality in your class, though. It'll work the same way, it's just not as clearly separated.
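To make those comments concrete, here is a rough sketch of the same DAL with the stored procedure call filled in using a parameterized SqlCommand. The connection string and the sp_SaveEmployee parameter names are assumptions, and it presumes a full SQL Server database (SQL Server Compact Edition does not support stored procedures).

Imports System.Data
Imports System.Data.SqlClient

Public Class EmployeeDAL
    ' Sketch only: the connection string and the procedure's parameter names are placeholders.
    Public Shared Sub SaveEmployee(ByRef model As EmployeeModel)
        Using connection As New SqlConnection("your connection string here")
            Using command As New SqlCommand("sp_SaveEmployee", connection)
                command.CommandType = CommandType.StoredProcedure
                command.Parameters.AddWithValue("@Id", model.Id)
                command.Parameters.AddWithValue("@FirstName", model.FirstName)
                command.Parameters.AddWithValue("@LastName", model.LastName)
                command.Parameters.AddWithValue("@JobCode", model.JobCode)
                connection.Open()
                command.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Class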
On your questions:
Number 1: You have to connect to the database in order to store and retrieve data. There are lots of ways to deal with this; one is to keep the connection details in app.config, or you can simply create a function that opens the connection every time you need it.
Number 2: Since you are dealing with a DataSet, here are some tips you might want to look at: DataSet
Number 3: You can also try using a DataAdapter and a DataTable, for example as sketched below. I am not sure what you meant by your question number 3.
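A minimal sketch of the DataAdapter/DataTable approach, assuming the SQL Server Compact database from the question (the .sdf file name is a placeholder, and DataTable.Select is shown as one way to search the in-memory table without a manual loop):

Imports System.Data
Imports System.Data.SqlServerCe

Module DataTableExample
    Sub LoadAndSearchInfo()
        ' Sketch only: "movies.sdf" is a placeholder database file name.
        Using connection As New SqlCeConnection("Data Source=movies.sdf")
            Dim adapter As New SqlCeDataAdapter("SELECT * FROM Info", connection)
            Dim infoTable As New DataTable("Info")
            adapter.Fill(infoTable)   ' Fill opens and closes the connection itself

            ' Search the in-memory table without looping by hand.
            Dim matches() As DataRow = infoTable.Select("movie_name = 'Some Movie'")
        End Using
    End Sub
End Module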
Cheers
I have a problem with the way you are using your database and your computer's RAM.
Problem 1: since you already have a database holding the movie information, why are you holding the same information in memory as well, creating extra overhead? If your answer is performance, or cheap memory, then why not use XML or a flat file instead? A database is not needed in that scenario.
Problem 2: You are like a soldier who does not know the weapon he uses, which is why you are asking basic questions like your first one about opening connections. The answer is yes: you have to open the connection every time you save or read data, and close it as soon as possible.
On your second question, about convenience, the answer is no. Instead, create a class with all the fields as properties and some methods for initialization, saving and deleting; that way you have to write less code. Also, suppose you have a movie named xyz and there is another movie also named xyz: how will you distinguish them? If you have the whole record in front of you, you can do it via release date, cast, etc., but it will still be hard, so create a primary key for both of your tables.
And finally, your third question: it will be easier to use SQL queries than to loop through the DataSet (get rid of the DataSet as soon as possible); see the sketch below.
I wish you good luck on the road to RDBMS.
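As a rough illustration of letting the database do the filtering instead of looping through a DataSet (a sketch only: the file name is a placeholder, and it assumes the SQL Server Compact database and the movie_name column from the question):

Imports System.Data
Imports System.Data.SqlServerCe

Module DirectQueryExample
    Function FindMovies(ByVal name As String) As DataTable
        ' Sketch only: "movies.sdf" is a placeholder database file name.
        Using connection As New SqlCeConnection("Data Source=movies.sdf")
            Using command As New SqlCeCommand("SELECT * FROM Info WHERE movie_name = @name", connection)
                command.Parameters.AddWithValue("@name", name)
                connection.Open()
                Dim result As New DataTable("Info")
                result.Load(command.ExecuteReader())   ' the WHERE clause runs in the database
                Return result
            End Using
        End Using
    End Function
End Module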
I'm using OleDb and DataSets in VB.NET to fill an Access database (.mdb).
It works in the following process:
I have an existing .mdb file with data in it
I create an OleDb DataAdapter for the existing .mdb file
I fill a DataSet/DataTable with the data from the file (adapter.Fill())
I add a new row to the DataSet
I fill the row with data
I update the DataSet/DataTable through the DataAdapter back to the .mdb file
This works, but the problem is that I'm doing this process a few thousand times, with a few thousand DataSets, and over time it takes longer and longer. I think it's because the DataAdapter has to go through the whole database every time, and because I keep pulling the whole DataSet out of the database and updating it back.
So my question:
Is there a way to do this differently, without pulling all the data out of the database and writing it back, and without going through the whole database? Maybe with a SQL connection and then just appending a row to the end of the table?
Thanks for your help!
If you are only adding rows, why not use an OleDbCommand? It has the .ExecuteNonQuery() method for running a plain INSERT.
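A minimal sketch of that approach, assuming a hypothetical table MyTable with a single column SomeColumn and a placeholder .mdb path (adjust the names to your actual schema):

Imports System.Data.OleDb

Module AppendRowExample
    Sub AppendRow(ByVal someValue As String)
        ' Sketch only: the file path, table name and column name are placeholders.
        Dim connectionString As String =
            "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\mydatabase.mdb"
        Using connection As New OleDbConnection(connectionString)
            Using command As New OleDbCommand("INSERT INTO MyTable (SomeColumn) VALUES (?)", connection)
                command.Parameters.AddWithValue("?", someValue)   ' OleDb uses positional parameters
                connection.Open()
                command.ExecuteNonQuery()   ' appends one row without loading the rest of the table
            End Using
        End Using
    End Sub
End Module

Because this never loads the existing rows, it should not slow down the way the repeated Fill/Update round trip does.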
I am fairly new to visual basic and know only a little about LINQ and SQL. I know how to select items from an array with LINQ, but what I can't figure out how to do is access a database using an IQueryable. I connected the database to my project, added two classes from the database to the "LINQ to SQL" .dbml file and saved it. In my programming assignment, I am not supposed to create a data source (table thing) to display the data, but update my own interface manually. In order to do it, I was told to instantiate this:
Private dogs As System.Linq.IQueryable(Of Dog)
(Dog is a class that I added to my .dbml file from the database file)
Also, I am told it involves using a method
.AsEnumerable.ElementAt(index As Integer)
And somehow I am supposed to load database data from/using this. Help please if you can. I got screwed over by my professors, as our online assignment program was down the whole Thanksgiving break, so I'm here doing this at the last minute. Thanks.
You have to make an instance of your data context class (which has the same name as your dbml + "DataContext"). Let's say it is AnimalsDataContext:
using (var context = new AnimalsDataContext())
{
    var dogs = context.Dogs;
    ....
}
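Since the question is in VB.NET, a rough equivalent that fills the IQueryable(Of Dog) field from the question might look like the following. AnimalsDataContext is just the assumed name from above (use whatever name your .dbml generated), and MainForm is a placeholder class.

Imports System.Linq

Public Class MainForm
    ' Sketch only: AnimalsDataContext is the assumed DataContext name from the answer above.
    Private dogs As System.Linq.IQueryable(Of Dog)

    Private Sub LoadDogs()
        Dim context As New AnimalsDataContext()
        dogs = context.Dogs                              ' deferred query over the Dogs table

        ' Pull out a single row by position, as the assignment hints:
        Dim firstDog As Dog = dogs.AsEnumerable().ElementAt(0)
    End Sub
End Class

The context is deliberately not wrapped in a Using block here: the query in dogs is evaluated lazily, so disposing the context before calling AsEnumerable().ElementAt(...) would fail.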
Currently, I would like to provide this as an option to the user when storing data to the database:
save the data to a file and use a background thread to read the data from the text file into SQL Server.
Flow of my program:
- A stream of data comes in from a server constantly (100 items per second).
- As another user option, I want to store the data in a text file and use a background thread to constantly copy the data from the text file into the SQL database.
Has this been done before?
Cheers.
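For what it's worth, a minimal sketch of the text-file-plus-background-thread flow described above. The file path, connection string, table IncomingData and column Payload are all placeholders, and the sketch ignores the locking/file-rotation a real implementation would need so the writer and the reader do not collide.

Imports System.Data
Imports System.Data.SqlClient
Imports System.IO
Imports System.Threading

Module TextFileCacheSketch
    ' Sketch only: file path, connection string, table and column names are placeholders.
    Private ReadOnly cacheFile As String = "C:\data\incoming.txt"

    Sub StartBackgroundCopy()
        Dim worker As New Thread(AddressOf CopyLoop)
        worker.IsBackground = True
        worker.Start()
    End Sub

    Private Sub CopyLoop()
        Do
            If File.Exists(cacheFile) Then
                Dim lines() As String = File.ReadAllLines(cacheFile)
                Using connection As New SqlConnection("your connection string here")
                    connection.Open()
                    Using command As New SqlCommand(
                            "INSERT INTO IncomingData (Payload) VALUES (@payload)", connection)
                        Dim payloadParam As SqlParameter =
                            command.Parameters.Add("@payload", SqlDbType.NVarChar, 4000)
                        For Each line As String In lines
                            payloadParam.Value = line
                            command.ExecuteNonQuery()
                        Next
                    End Using
                End Using
                File.Delete(cacheFile)   ' crude: assumes the writer is not appending right now
            End If
            Thread.Sleep(1000)           ' poll roughly once a second
        Loop
    End Sub
End Module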
Your question is indeed a bit confusing.
I'm guessing you mean that:
100 rows per second come from a certain source or server (eg. log entries)
One option for the user is text file caching: the rows are stored in a text file and, periodically, the contents of the text file are copied incrementally into one or more SQL Server tables.
Another option for the user is direct insert: the data is stored directly in the database as it comes in, with no textfile in between.
Am I right?
If yes, then you should do something along the lines of:
Create a trigger on an INSERT action to the table
In that trigger, check which user is inserting. If the user has textfile caching disabled, then the insert can go on. Otherwise, the data is redirected to a textfile (or a caching table)
Create a stored procedure that checks the caching table or text file for new data, copies the new data into the real table, and deletes the cached data.
Create an SQL Server Agent job that runs above stored procedure every minute, hour, day...
Since the interface from T-SQL to text files is not very flexible, I would recommend using a caching table instead. Why a text file?
And for that matter, why cache the data before inserting it into the table? Perhaps we can suggest a better solution, if you explain the context of your question.