I created a local database and a Windows Forms application. In Server Explorer, I added a connection to this database. Now, how can I begin querying it using LINQ-to-SQL?
Have you set up a DataContext? That is the typical object standing between your LINQ queries and the database connection.
Sample code from the DataContext documentation:
// DataContext takes a connection string (or a path to a local .mdf file).
DataContext db = new DataContext(@"c:\Northwnd.mdf");

// Get a typed table to run queries. Customer is a class you map to the
// table with [Table]/[Column] attributes (or via the O/R designer).
Table&lt;Customer&gt; Customers = db.GetTable&lt;Customer&gt;();

// Query for customers from London.
var query =
    from cust in Customers
    where cust.City == "London"
    select cust;

foreach (var cust in query)
    Console.WriteLine("id = {0}, City = {1}", cust.CustomerID, cust.City);
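If you don't yet have mapped entity classes, you can sanity-check the query shape against an in-memory list first; LINQ-to-SQL translates the same expression into T-SQL once it runs against a Table&lt;Customer&gt;. A minimal sketch; the Customer class and sample rows below are made up for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Customer
{
    public string CustomerID { get; set; }
    public string City { get; set; }
}

class Program
{
    // Same filter as the LINQ-to-SQL sample, expressed over IEnumerable.
    public static List<Customer> FromLondon(IEnumerable<Customer> customers)
    {
        var query =
            from cust in customers
            where cust.City == "London"
            select cust;
        return query.ToList();
    }

    static void Main()
    {
        // Hypothetical in-memory data standing in for the Customers table.
        var customers = new List<Customer>
        {
            new Customer { CustomerID = "ALFKI", City = "Berlin" },
            new Customer { CustomerID = "AROUT", City = "London" },
        };

        foreach (var cust in FromLondon(customers))
            Console.WriteLine("id = {0}, City = {1}", cust.CustomerID, cust.City);
        // prints: id = AROUT, City = London
    }
}
```

The where/select expression is identical in both cases; only the data source changes.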
I have many free tables from an old piece of software and need to connect some of these tables to SQL Server to transfer the data and use it in other software.
While your question is about how to do that using OpenRowSet, I would say forget about getting VFP data using OpenRowset or OpenQuery (the better of the two). The 32-bit and 64-bit import/export wizards probably won't work either. All of those worked successfully in the good old days, before Windows 10 at least. I don't know what has changed, but now they all fail (I had tons of working OpenRowset and OpenQuery samples under Windows 7).
You might try the Sybase ADS driver with OpenRowset and OpenQuery.
Fortunately there are many workarounds. Some use intermediates like Access and Excel, but IMHO they are overkill.
If you are using VFP itself, the simplest approach would be to do this from within VFP, as a series of "auto-generated on the fly" commands executing against SQL Server.
One other option (which I favor) would be to use some C# code that does the transfer using the SqlBulkCopy class. For a generic solution, you could:
Create a database on the server,
Read a VFP table's schema information,
Create a compatible table on the server (meaning you might want to change data types),
Create the column mapping info,
Bulk load the data into the table,
Repeat the process in a loop for all tables.
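The "create a compatible table" step implies translating VFP field types to SQL Server types. Below is a minimal sketch of such a translation; the target types chosen (for example, Memo to varchar(max)) are assumptions you would adjust to your own data, not an official mapping table:

```csharp
using System;

// Hypothetical helper mapping common VFP field types to SQL Server
// column types. The target types are assumptions - adjust as needed.
static class VfpTypeMap
{
    public static string ToSqlType(char vfpType, int width, int decimals)
    {
        switch (vfpType)
        {
            case 'C': return "varchar(" + width + ")";                  // Character
            case 'N': return "numeric(" + width + "," + decimals + ")"; // Numeric
            case 'I': return "int";                                     // Integer
            case 'D': return "date";                                    // Date
            case 'T': return "datetime";                                // DateTime
            case 'L': return "bit";                                     // Logical
            case 'M': return "varchar(max)";                            // Memo
            default: throw new ArgumentException("Unmapped VFP type: " + vfpType);
        }
    }
}

class Program
{
    static void Main()
    {
        // Build a CREATE TABLE statement for a hypothetical VFP table.
        string ddl = "create table Customers ("
            + "[CustomerId] " + VfpTypeMap.ToSqlType('C', 6, 0) + ", "
            + "[Company] " + VfpTypeMap.ToSqlType('C', 50, 0) + ", "
            + "[Discount] " + VfpTypeMap.ToSqlType('N', 5, 2) + ")";
        Console.WriteLine(ddl);
        // prints: create table Customers ([CustomerId] varchar(6), [Company] varchar(50), [Discount] numeric(5,2))
    }
}
```

A real implementation would read the field list from the VFP table's schema (for example via GetSchemaTable on an OleDbDataReader) and feed each field through a mapping like this to build the CREATE TABLE statement.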
Here is one sample that doesn't read the schema, but instead creates a temp table on the server with a known schema and a sample mapping, just for demo purposes (considering that DBF-to-SQL-Server tools on the market cost over $100, this is not a bad starting point):
// Requires: System, System.Data, System.Data.OleDb, System.Data.SqlClient,
// System.Drawing, System.Windows.Forms (LINQPad-style Main).
void Main()
{
    string sqlConnectionString = @"server=.\SQLExpress;Trusted_Connection=yes;Database=Test";
    string path = @"C:\PROGRAM FILES (X86)\MICROSOFT VISUAL FOXPRO 9\SAMPLES\Northwind";
    DataTable tbl = new DataTable(); // just to show the results
    using (OleDbConnection cn = new OleDbConnection("Provider=VFPOLEDB;Data Source=" + path))
    using (SqlConnection scn = new SqlConnection(sqlConnectionString))
    {
        // Creating a temp SQL Server table for sampling.
        // If the table already existed, this part wouldn't exist;
        // we would simply insert.
        SqlCommand createTemp = new SqlCommand();
        createTemp.CommandText = @"create table ##SqlBulkSample
            (
              [CustomerId] char(6),
              [Company] varchar(50),
              [Contact] varchar(50),
              [Country] varchar(20)
            )";
        createTemp.Connection = scn;
        scn.Open();
        createTemp.ExecuteNonQuery();

        // Get the data from VFP and write it to the server using SqlBulkCopy.
        OleDbCommand cmd = new OleDbCommand("select CustomerId, CompanyName, ContactName, Country from Customers", cn);
        SqlBulkCopy sbc = new SqlBulkCopy(scn, SqlBulkCopyOptions.TableLock, null);

        // To demonstrate column mapping, source and target have different
        // field names (they could also differ in count and order).
        // Without mapping, the data would be copied as the same structure.
        sbc.ColumnMappings.Add(0, "[CustomerId]");
        sbc.ColumnMappings.Add(1, "[Company]");
        sbc.ColumnMappings.Add(2, "[Contact]");
        sbc.ColumnMappings.Add(3, "[Country]");

        cn.Open();
        OleDbDataReader rdr = cmd.ExecuteReader();

        // SqlBulkCopy properties: with the defaults (or high values) we
        // wouldn't see any notifications, so for demo purposes they are
        // set extremely low.
        sbc.NotifyAfter = 20;
        sbc.BatchSize = 10;
        //sbc.BulkCopyTimeout = 10000;
        sbc.DestinationTableName = "##SqlBulkSample";

        // Progress notification along the way.
        sbc.SqlRowsCopied += (sender, e) =>
        {
            Console.WriteLine("-- Copied {0} rows to {1}.",
                e.RowsCopied,
                ((SqlBulkCopy)sender).DestinationTableName);
        };

        // Write to the server.
        sbc.WriteToServer(rdr);
        if (!rdr.IsClosed) { rdr.Close(); }
        cn.Close();

        // Check that the data really was written to the server.
        // This is just for testing the sample.
        SqlCommand cmdRead = new SqlCommand("select * from ##SqlBulkSample", scn);
        tbl.Load(cmdRead.ExecuteReader());
        scn.Close();
    }

    // Show the data read from SQL Server on a form.
    Form f = new Form();
    DataGridView dgv = new DataGridView();
    dgv.Location = new Point(0, 0);
    dgv.Dock = DockStyle.Fill;
    dgv.DataSource = tbl;
    f.Controls.Add(dgv);
    f.ClientSize = new Size(1024, 768);
    f.ShowDialog();
}
Also, your software might be able to use the VFP data directly, without getting it into MS SQL Server (I don't think you would want to do that, but anyway).
HTH
Say you have your tables stored in an SQL Server DB, and you want to perform multi-table actions, i.e. join several tables from that same database.
The following code can connect to and receive data from SQL Server:
library(dplyr)
library(odbc)
con <- dbConnect(odbc::odbc(),
                 .connection_string = "Driver={SQL Server};Server=.;Database=My_DB;")
Table1 <- tbl(con, "Table1")
Table1 # View glimpse of Table1
Table2 <- tbl(con, "Table2")
Table2 # View glimpse of Table2
Table3 <- tbl(con, "Table3")
However, after a few result sets have been retrieved over the same connection, the following error eventually occurs:
Error: [Microsoft][ODBC SQL Server Driver]Connection is busy with results for another hstmt
My googling so far has taken me to the answer that the backend does not support multiple active result sets (MARS) - I guess two active result sets is the maximum? (The backend is DBI and odbc.)
So, my question is: what is best practice if I want to collect data from several tables from an SQL DB?
Open a connection for each table?
Actively close the connection and open it again for the next table?
Does the backend support MARS being passed in the connection string?
To make a connection that can hold multiple result sets, I've had luck with the following connection code:
con <- DBI::dbConnect(odbc::odbc(),
                      Driver = "SQL Server Native Client 11.0",
                      Server = "my_host",
                      UID = rstudioapi::askForPassword("Database UID"),
                      PWD = rstudioapi::askForPassword("Database PWD"),
                      Port = 1433,
                      MultipleActiveResultSets = "True",
                      Database = my_db)
On top of that, I found that the new pool package can do the job:
library(pool)
pool <- dbPool(odbc::odbc(),
               Driver = "SQL Server Native Client 11.0",
               Server = "my_host",
               UID = rstudioapi::askForPassword("Database UID"),
               PWD = rstudioapi::askForPassword("Database PWD"),
               Port = 1433,
               MultipleActiveResultSets = "True",
               Database = my_db)
It is quicker and more stable than the DBI connection; however, one minor drawback is that the database doesn't pop up in the Connections tab for easy reference.
For both methods, remember to close the connection/pool when done. For the DBI method it's:
dbDisconnect(con)
whereas the pool is closed by calling:
poolClose(pool)
I'm creating a simple Master-Detail relationship with ClientDataSets on Delphi XE3 + SQL Server. I have configured the Master-Detail through a DataSetField in the client application and with the MasterSource property of the detail TUniQuery in the server application. I'm using one DataSetProvider and one DataSource.
Server Application
**Master table**
object qyREMISION_COMPRA: TUniQuery
  Connection = DMBase.BD
  SQL.Strings = (
    'SELECT R.ID_REMISION_COMPRA, R.FECHA, R.FACTURA'
    '  FROM REMISION_COMPRA R'
    '  WHERE R.ID_REMISION_COMPRA = :ID_REMISION_COMPRA')
  object qyREMISION_COMPRAID_REMISION_COMPRA: TIntegerField
    AutoGenerateValue = arAutoInc
    FieldName = 'ID_REMISION_COMPRA'
  end
end
**Detail table**
object qyREMISION_COMPRA_PRODUCTO: TUniQuery
  Connection = DMBase.BD
  SQL.Strings = (
    'SELECT RP.ID_REMISION_COMPRA_PRODUCTO, RP.ID_REMISION_COMPRA, RP.ID_PRODUCTO'
    '  FROM REMISION_COMPRA_PRODUCTO RP'
    '  WHERE RP.ID_REMISION_COMPRA = :ID_REMISION_COMPRA'
    '  ORDER BY RP.ID_REMISION_COMPRA_PRODUCTO')
  SQLUpdate.Strings = (
    'UPDATE REMISION_COMPRA_PRODUCTO'
    '  SET ID_REMISION_COMPRA = :ID_REMISION_COMPRA, ID_PRODUCTO = :ID_PRODUCTO'
    '  WHERE ID_REMISION_COMPRA_PRODUCTO = :Old_ID_REMISION_COMPRA_PRODUCTO')
  MasterSource = datasetREMISION_COMPRA
  MasterFields = 'ID_REMISION_COMPRA'
  DetailFields = 'ID_REMISION_COMPRA'
end
**DataSetProvider**
object dspREMISION_COMPRA: TDataSetProvider
  DataSet = qyREMISION_COMPRA
  Options = [poCascadeDeletes, poCascadeUpdates, poPropogateChanges, poUseQuoteChar]
end
Client Application
**Master ClientDataSet**
object cdsREMISION_COMPRA: TClientDataSet
  ProviderName = 'dspREMISION_COMPRA'
  RemoteServer = dmProvs.dspCompra
  object cdsREMISION_COMPRAqyREMISION_COMPRA_PRODUCTO: TDataSetField
    FieldName = 'qyREMISION_COMPRA_PRODUCTO'
  end
end
**Detail ClientDataSet**
object cdsREMISION_COMPRA_PRODUCTO: TClientDataSet
  DataSetField = cdsREMISION_COMPRAqyREMISION_COMPRA_PRODUCTO
end
To save the changes to the database, I only call ApplyUpdates on the master ClientDataSet: cdsREMISION_COMPRA.ApplyUpdates(0).
When I do an insert it works perfectly, but when I do an update I have problems with triggers in the database, because the application executes the detail update first and then the update of the master table. Is this normal? Am I doing something wrong?
I want to copy a database from one SQL Server to another, but I just want to copy the structure (views, stored procedures, tables, fields, indexes, etc.), no rows.
I tried to generate a script from SQL Server Management Studio (Tasks menu > Generate Scripts), but the script is very verbose.
Follow the steps below to generate the script:
Right-click on the database
Select Tasks
Select Generate Scripts from Tasks
Follow the wizard's steps
Finally, click Finish to complete the process
You can either use the SQL Server Management Objects (SMO) API (see the task "creating, altering and removing databases"):
C# code to generate a SQL script:
// Requires references to the SMO assemblies: Microsoft.SqlServer.Smo,
// Microsoft.SqlServer.ConnectionInfo, Microsoft.SqlServer.Management.Sdk.Sfc
// (namespaces Microsoft.SqlServer.Management.Smo and System.Collections.Specialized).
public string GenerateScript()
{
    var sb = new StringBuilder();
    var srv = new Server(@"Your Database Server Name");
    var db = srv.Databases["Your Database name"];
    var scrpt = new Scripter(srv);
    scrpt.Options.ScriptDrops = false;
    scrpt.Options.Indexes = true;  // include indexes
    scrpt.Options.DriAll = true;   // include keys and constraints
    var obj = new Urn[1];
    foreach (Table tbl in db.Tables)
    {
        obj[0] = tbl.Urn;
        if (tbl.IsSystemObject == false)
        {
            StringCollection sc = scrpt.Script(obj);
            foreach (var st in sc)
            {
                sb.Append(st);
            }
        }
    }
    // The same loop works over db.Views and db.StoredProcedures
    // to script the rest of the schema.
    return sb.ToString();
}
Or you can use the Copy Database Wizard.
Some of its limitations are:
1. The Copy Database Wizard is not available in the Express edition.
2. The Copy Database Wizard cannot be used to copy or move databases that:
Are system databases.
Are marked for replication.
Are marked Inaccessible, Loading, Offline, Recovering, Suspect, or in Emergency Mode.
Have data or log files stored in Microsoft Azure storage.
I'm new to NHibernate (but I have used Hibernate for Java before).
I built a session factory for our SQL Server database (SQL Server Enterprise Edition 8):
ISessionFactory factory2 = Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2008
        .ConnectionString(@"user id=xx; password=xxx;server=xxx;initial catalog=xxx")
        .ShowSql()
    )
    .Mappings(m => m.FluentMappings
        .AddFromAssemblyOf&lt;Program&gt;())
    .ExposeConfiguration(cfg => new SchemaValidator(cfg).Validate())
    .BuildSessionFactory();
So I use the ShowSql() method to log the queries to the console.
In my program I load / create two objects, persist them, and then do an update on a column:
using (var session = sf.OpenSession())
{
    session.FlushMode = FlushMode.Always;
    using (var ta = session.BeginTransaction())
    {
        Console.ReadKey();
        PMA pm = session.CreateCriteria&lt;PMA&gt;()
            .Add(Restrictions.Eq("Name", "HANSER")).List&lt;PMA&gt;().FirstOrDefault();
        if (pm == null)
        {
            pm = new PMA();
            pm.Prio = "1";
            pm.Name = "HANSER";
            pm.Datum = DateTime.Now;
            session.Save(pm);
        }
        Clip clip = new Clip();
        clip.PMA = pm;
        clip.sys_created = DateTime.Now;
        clip.sys_name = "system name";
        clip.Title = "Test";
        session.Save(clip);
        Console.ReadKey();
        clip.Title = "PETERSEN";
        session.SaveOrUpdate(clip);
        session.Transaction.Commit();
        session.Flush();
        session.Dispose();
        Console.ReadKey();
    }
}
The first insert, for the pm object, is logged to the console, but the second insert and the update for the clip object don't appear. When I look in the database, everything is right: everything gets inserted and updated. But I want to see the queries. I tried setting the flush mode to Always, calling session.Flush() at the end of the session, and then session.Dispose(), but nothing changed.
When I use Postgres (changing only the session factory), I see all the query logs.
How can I make NHibernate log all queries for SQL Server?
When ADO.NET batching is in use (it is on by default for SQL Server, which supports it), DML queries are not logged to the console.
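If you want every statement to show up in the log, one workaround is to turn batching off. This is a sketch based on the session-factory code above; AdoNetBatchSize is Fluent NHibernate's wrapper for the adonet.batch_size property, and setting it to 0 should disable batching (the connection details are placeholders, as before):

```csharp
// Sketch: disabling ADO.NET batching so each DML statement is sent
// (and therefore logged by ShowSql) individually.
ISessionFactory factory = Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2008
        .ConnectionString(@"user id=xx; password=xxx;server=xxx;initial catalog=xxx")
        .AdoNetBatchSize(0)   // adonet.batch_size = 0 -> no batching, every query logged
        .ShowSql()
    )
    .Mappings(m => m.FluentMappings.AddFromAssemblyOf<Program>())
    .BuildSessionFactory();
```

Note this trades performance for visibility, so it is best kept to development; alternatively, a profiler such as SQL Server Profiler shows the batched statements without changing the application.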