I have a table in the database.
id : Name : Category
--------------------
1 : jake : admin
I have used SqlBulkCopy for inserting new rows into the database table from an Excel sheet containing two columns, 'Name' and 'Category'.
Now I have another requirement: updating the values of the Name and Category columns from the same Excel sheet, but with newly updated values. Is this possible? I just need a way to update the values of those columns. I went through this reference:
update SQl table from values in excel
but I don't know how to implement it.
Any help or suggestion will be greatly appreciated.
Thanks.
The code for uploading and importing the Excel file using SqlBulkCopy is below.
protected void Button1_Click(object sender, EventArgs e)
{
    if (FileUpload1.HasFile)
    {
        FileUpload1.SaveAs(Server.MapPath("~/" + Path.GetFileName(FileUpload1.FileName)));
        string file = Server.MapPath("~/" + Path.GetFileName(FileUpload1.FileName));

        string constr = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + file +
                        @";Extended Properties=""Excel 8.0;HDR=Yes;IMEX=1"";";

        using (OleDbConnection olecon = new OleDbConnection(constr))
        {
            OleDbCommand olecmd = new OleDbCommand("SELECT Name, Category FROM [Sheet1$]", olecon);
            olecon.Open();
            using (DbDataReader dbrdr = olecmd.ExecuteReader())
            {
                string sqlcon = "Data Source=matty2011-PC\\SQLEXPRESS;Initial Catalog=mydb;Integrated Security=True";
                using (SqlBulkCopy sqlbulkcopy = new SqlBulkCopy(sqlcon))
                {
                    // map source column 0 (Name) and 1 (Category) to destination ordinals 1 and 2
                    sqlbulkcopy.ColumnMappings.Add(0, 1);
                    sqlbulkcopy.ColumnMappings.Add(1, 2);
                    sqlbulkcopy.DestinationTableName = "exceldata";
                    sqlbulkcopy.WriteToServer(dbrdr);
                }
            }
        }
    }
    else
    {
        Response.Write("Please select a file for upload!");
        return;
    }
}
Hi,
If possible, use SSIS to solve this, simply by a DTS-style import.
Note: SSIS/DTS will append to the existing data.
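Since appending is not the same as updating, another option (just a sketch, not the only way) is to keep the SqlBulkCopy load but point it at a staging table and then run an UPDATE that joins the staging rows to the real table. The staging table name (exceldata_staging) and the use of Name as the match key are assumptions; adjust them to whatever uniquely identifies a row for you:

// Sketch: after bulk copying the Excel rows into a staging table (exceldata_staging),
// update the real table from it. Joining on Name is an assumption about your key.
// requires: using System.Data.SqlClient;
string sqlcon = "Data Source=matty2011-PC\\SQLEXPRESS;Initial Catalog=mydb;Integrated Security=True";

using (SqlConnection con = new SqlConnection(sqlcon))
using (SqlCommand cmd = new SqlCommand(
    "UPDATE e SET e.Category = s.Category " +
    "FROM exceldata e INNER JOIN exceldata_staging s ON e.Name = s.Name", con))
{
    con.Open();
    cmd.ExecuteNonQuery();
}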
I have two Excel files and I want to import these files into a SQL temporary table.
First excel file:
T1 T2 T3 T4 Total
1,472 1,364 1,422 – 4,258
-152.6 -152.6 -152.6 –
1,958 1,939 1,942 –
-122.6 -123.7 -122.2 –
Second excel file:
T1 T2 T3 T4 T5 Total
1,472 1,364 1,422 – 12.2 4,258
-152.6 -152.6 -152.6 – 1000.12
1,958 1,939 1,942 – 50.23
-122.6 -123.7 -122.2 – 185.25
Is there any way in SSIS to identify the files on the basis of the number of columns? I need to identify the report based on the column count.
Objects from the Microsoft.Office.Interop.Excel namespace can be used in a C# Script Task to do this, as follows. This example outputs the file name and number of columns into an SSIS object variable (User::SSISObjectVariable) that can be used to apply further logic and processing in the package, such as storing the results in a database table. The full file path is the first column in the object variable and the count of columns is the second. Also be sure to add a reference to Microsoft.CSharp in the script. The object variable will need to be included in the ReadWriteVariables field of the Script Task, and if the source folder is stored in a variable (as done below) then add that variable in the ReadOnlyVariables field.
using System;
using System.Collections.Generic;
using System.Data;
using System.IO;
using Microsoft.Office.Interop.Excel;

List<string> excelFileList = new List<string>();

//get source directory
string filePath = Dts.Variables["User::FilePathVariable"].Value.ToString();
DirectoryInfo di = new DirectoryInfo(filePath);

System.Data.DataTable dt = new System.Data.DataTable();
dt.Columns.Add("FilePath", typeof(System.String));
dt.Columns.Add("ColumnCount", typeof(System.Int32));

foreach (FileInfo fi in di.EnumerateFiles())
{
    //optional- check file extension and prefix
    if (fi.Extension == ".xls" && fi.Name.StartsWith("Prefix"))
    {
        //get full file path
        excelFileList.Add(fi.FullName);
    }
}

foreach (string excelFile in excelFileList)
{
    Microsoft.Office.Interop.Excel.Application xlApp = new Microsoft.Office.Interop.Excel.Application();
    Microsoft.Office.Interop.Excel.Workbook xlWorkbook = xlApp.Workbooks.Open(excelFile);
    Microsoft.Office.Interop.Excel.Worksheet xlWorksheet = xlWorkbook.Sheets[1];

    //get number of columns: find the last used cell, searching backwards by columns
    int columnCount = xlWorksheet.Cells.Find("*", System.Reflection.Missing.Value,
        System.Reflection.Missing.Value, System.Reflection.Missing.Value,
        Microsoft.Office.Interop.Excel.XlSearchOrder.xlByColumns,
        Microsoft.Office.Interop.Excel.XlSearchDirection.xlPrevious,
        false, System.Reflection.Missing.Value, System.Reflection.Missing.Value).Column;

    //build data row to hold file path and column count
    DataRow dr = dt.NewRow();
    dr["FilePath"] = excelFile;
    dr["ColumnCount"] = columnCount;
    dt.Rows.Add(dr);

    xlApp.Workbooks.Close();
    xlApp.Quit();
    xlWorkbook = null;
    xlApp = null;
}

//release the Excel COM objects
GC.Collect();
GC.WaitForPendingFinalizers();

//populate object variable
Dts.Variables["User::SSISObjectVariable"].Value = dt;
If you need to import Excel files with different schemas, you have two approaches:
(1) SSIS approach: Script Task + 2 Data Flow Tasks
In case you only have two structures, you can follow these steps:
Add a variable of type System.Int32, example: #[User::ColumnsCount]
Add a variable of type System.String to store the file path, example: #[User::FilePath]
Add a Script Task and select #[User::FilePath] as a ReadOnly variable and #[User::ColumnsCount] as a ReadWrite variable
Inside the Script Task, write a script similar to the following:
string FilePath = Dts.Variables["User::FilePath"].Value.ToString();

string ExcelConnectionString = "Provider=Microsoft.ACE.OLEDB.12.0;" +
    "Data Source='" + FilePath +
    "';Extended Properties=\"Excel 12.0;HDR=YES;\"";

using (OleDbConnection OleDBCon = new OleDbConnection(ExcelConnectionString))
{
    if (OleDBCon.State != ConnectionState.Open)
        OleDBCon.Open();

    using (OleDbCommand cmd = OleDBCon.CreateCommand())
    {
        DataTable dtTable = new DataTable("Table1");

        cmd.CommandType = CommandType.Text;
        //replace Sheet1$ with the sheet name if it is different
        cmd.CommandText = "SELECT * FROM [Sheet1$]";

        using (OleDbDataAdapter daGetDataFromSheet = new OleDbDataAdapter(cmd))
        {
            //read only the schema to get the column count
            daGetDataFromSheet.FillSchema(dtTable, SchemaType.Source);
            Dts.Variables["User::ColumnsCount"].Value = dtTable.Columns.Count;
        }
    }
}
Add two Data Flow Tasks, one for each Excel structure
Link the Script Task to each one of these Data Flow Tasks
Click on each precedence constraint (the link between tasks), change the evaluation type to Expression and Constraint, and add the appropriate expression for each case:
5 columns:
#[User::ColumnsCount] == 5
6 columns:
#[User::ColumnsCount] == 6
Set the Delay Validation property to True for both Data Flow Tasks
TL;DR: In case you only have two structures, you can add two Data Flow Tasks (one for each structure), then use a Script Task to identify the column count and execute the appropriate Data Flow Task based on that count (using precedence constraint expressions).
(2) C# approach: SchemaMapper class library
Recently I started a new project on GitHub, which is a class library developed using C#. You can use it to import tabular data from Excel, Word, PowerPoint, text, CSV, HTML, JSON and XML into a SQL Server table with a different schema definition, using a schema mapping approach. Check it out at:
SchemaMapper: C# Schema mapping class library
You can follow this Wiki page for a step-by-step guide:
Import data from multiple files into one SQL table step by step guide
I have an Access database for which I need to run a query that is available in Postgres databases, and I was wondering if it is possible to accomplish this:
Insert into Table (Col1, Col2, ...) values (Val1, Val2, ...) returning * (or even just an id identifying the specific set of data that was just inserted)?
I'm using C# to communicate with the DB. Anything would help, thank you.
The code I basically use is the following:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Data.OleDb;

namespace Testquery1
{
    class Program
    {
        static void Main(string[] args)
        {
            string query = "INSERT INTO Table ( Val1, Val2, Val3 ) values(14,2,1)";
            Test1 queryselect = new Test1();
            queryselect.dataconnection(query);
        }
    }

    class Test1
    {
        public OleDbConnection connection = new OleDbConnection();
        string path = System.Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
        string fileloc = @"DataBase.accdb";
        string provider = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=";

        public void dataconnection(string query)
        {
            // combine the folder and file name so the path separator is not lost
            connection.ConnectionString = provider + System.IO.Path.Combine(path, fileloc);
            Console.WriteLine(connection.ConnectionString);

            connection.Open();
            OleDbCommand command = new OleDbCommand();
            command.Connection = connection;
            command.CommandText = query;
            command.ExecuteNonQuery();
            connection.Close();
        }
    }
}
Unfortunately, with .net you cannot do an append or make-table query between two different connections.
However, you CAN in Access.
If you have a working PostgreSQL SQL query that returns records, then you can simply save that query in Access (as a pass-through query).
You can now use that query in Access and send it to a NEW local table in Access. (Access supports this concept, .net does not.)
You can thus either append that data or “make table” from it.
And more interesting is that you can even append between different connections in Access. So you can append from a PostgreSQL table to, say, a MySQL, Oracle or FoxPro table inside of Access.
Again, you can’t do this in .net.
So, assume a raw SQL (PostgreSQL format) query that works in PostgreSQL. Take that SAME working query and save it in Access as a pass-through query.
Then in Access you can append to a table in Access (assuming the same table structure) with:
VBA (Access code)
Dim strSQL as string
strSQL = "INSERT INTO LocalTable SELECT * from QryPassR"
Currentdb.Execute strSQL
And if you want to MAKE a new table in Access with the SAME structure, so make table (not append), you can go:
Dim strSQL as string
strSQL = " SELECT * INTO LocalTable FROM qryPassR"
Currentdb.Execute strSQL
You can also, in VBA code, change the PostgreSQL query to include criteria for that select.
(Air code - it does not take into account the SQL injection issue.)
Dim strCity as string
strCity = InputBox("What city to pull from Postgres?")
Dim strSQL as string
strSQL = "select * from tbleHotels where City = '" & strCity & "'"
With Currentdb.QueryDefs("QryPassR")
    .SQL = strSQL
End With
strSQL = "INSERT INTO LocalTable SELECT * from QryPassR"
Currentdb.Execute strSQL
' the above will copy all the records from PostgreSQL for the chosen city (e.g. Edmonton) into the Access table (called LocalTable in this example).
And as noted, you're not limited to “LocalTable” being an Access table; it could be a FoxPro table, MySQL, SQL Server, etc. So you're not limited to JUST using Access tables in the above with your given SQL. Any linked table you have in Access can be used, including ones that point to other database systems.
If you must use .net, then you have to:
Connect to first database.
Execute query to pull and fill a datatable.
Connect to second database.
Create (open) a data table based on second database.
Loop (iterate) each row from first connection datatable and copy the row into the second datatable (based on 2nd connection).
You have to do a row-by-row copy (but there is the ImportRow method of the .net DataTable, so you don't have to copy column by column within each row; you do have to loop row by row). A rough sketch of this is shown below.
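A minimal sketch of those steps in C# (the Postgres ODBC DSN, table names and columns are placeholders, not from the original post; Npgsql would work just as well for the Postgres side):

// requires: using System.Data; using System.Data.Odbc; using System.Data.OleDb;
DataTable source = new DataTable();

// step 1-2: connect to the first database and fill a DataTable
using (OdbcConnection pgCon = new OdbcConnection("DSN=MyPostgresDsn"))
using (OdbcDataAdapter pgAdapter = new OdbcDataAdapter("SELECT * FROM tblHotels", pgCon))
{
    pgAdapter.Fill(source);
}

// step 3-4: connect to the second database and open the destination table
using (OleDbConnection accCon = new OleDbConnection(
    @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Test2\test44.accdb"))
using (OleDbDataAdapter accAdapter = new OleDbDataAdapter("SELECT * FROM LocalTable", accCon))
using (OleDbCommandBuilder builder = new OleDbCommandBuilder(accAdapter))
{
    DataTable target = new DataTable();
    accAdapter.Fill(target);

    // step 5: copy each row with ImportRow, then mark it Added so Update() inserts it
    foreach (DataRow row in source.Rows)
    {
        target.ImportRow(row);
        target.Rows[target.Rows.Count - 1].SetAdded();
    }

    accAdapter.Update(target);
}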
In Access this looping is not required and in fact you can use SQL commands that operate on both tables, including as per above the make table and append, and you can even do relation joins between such tables - even those based on different connections (data sources).
Edit
Based on comments, it looks like the simple question is:
After I insert a row into Access, how can I get the last ID (identity) of that insert?
The following vb.net code will do this:
Imports System.Data.OleDb
Dim MyCon As New OleDb.OleDbConnection("Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Test2\test44.accdb")
MyCon.Open()
Dim strSQL As String = "insert into tblHotels2 (City) VAlues('Edmonton')"
Dim cmd As New OleDb.OleDbCommand(strSQL, MyCon)
Dim r As Integer
cmd.ExecuteNonQuery()
cmd.CommandText = "select ##IDENTITY"
r = cmd.ExecuteScalar
Debug.Print(r)
Output = last PK id (autonumber)
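And since the original code in the question is C#, here is a rough C# equivalent of the same idea (same connection string as above; the table and column names are just the example ones):

using System;
using System.Data.OleDb;

using (OleDbConnection con = new OleDbConnection(
    @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Test2\test44.accdb"))
{
    con.Open();
    using (OleDbCommand cmd = new OleDbCommand(
        "insert into tblHotels2 (City) values ('Edmonton')", con))
    {
        cmd.ExecuteNonQuery();

        // ask Access/ACE for the autonumber generated on this same connection
        cmd.CommandText = "select @@IDENTITY";
        int lastId = Convert.ToInt32(cmd.ExecuteScalar());
        Console.WriteLine(lastId);
    }
}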
When I click the add button, the data is saved in the database, but after refreshing the page 2-3 times there are 2-4 copies of the same data in the database.
How do I fix this?
String cs = ConfigurationManager.ConnectionStrings["MyDBConnectionString1"].ConnectionString;
using (SqlConnection con = new SqlConnection(cs))
{
SqlCommand cmd = new SqlCommand("Insert into tblBrands values('" + txtBrandName.Text + "')", con);
con.Open();
cmd.ExecuteNonQuery();
txtBrandName.Text = string.Empty;
}
If you are trying to solve this in SQL (assuming from your tags), you could check before inserting using:
if not exists (select * from tblBrands where ...)
Build your where clause based on your criteria: what would you consider a duplicate entry?
More info on exists in Microsoft Docs
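For example, a parameterized version of the insert with that check might look like this (a sketch; it assumes the column is called BrandName and treats an exact name match as a duplicate):

string cs = ConfigurationManager.ConnectionStrings["MyDBConnectionString1"].ConnectionString;
using (SqlConnection con = new SqlConnection(cs))
using (SqlCommand cmd = new SqlCommand(
    "IF NOT EXISTS (SELECT 1 FROM tblBrands WHERE BrandName = @BrandName) " +
    "INSERT INTO tblBrands (BrandName) VALUES (@BrandName)", con))
{
    // parameterizing also avoids the SQL injection risk of the concatenated query
    cmd.Parameters.AddWithValue("@BrandName", txtBrandName.Text);
    con.Open();
    cmd.ExecuteNonQuery();
    txtBrandName.Text = string.Empty;
}

Note this only guards against duplicate rows; the duplicates appearing after a browser refresh come from the POST being resubmitted, so redirecting after the insert (the Post/Redirect/Get pattern) is also worth considering.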
I have a DataSet and a TableAdapter for the table in that DataSet.
How can I change the schema of DataTable at runtime to get all the columns from stored procedure?
In my opinion, drop the idea of using the TableAdapter; it's completely unnecessary. Do this to get the DataTable instead:
using (SqlConnection con = new SqlConnection("ConnectionString"))
{
    con.Open();
    using (SqlCommand command = new SqlCommand("StoredProcName", con))
    {
        command.CommandType = CommandType.StoredProcedure;
        using (SqlDataReader reader = command.ExecuteReader())
        {
            DataTable table = new DataTable();
            table.Load(reader);
            return table;
        }
    }
}
Now it doesn't matter how many columns you return from the proc; the DataTable will contain all of them.
EDIT: Add the following section to your Web.config file below the <configuration> element, replacing the connection string with yours. You should be able to copy this from your Settings file.
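For example, something along these lines (the name and connection details are placeholders; adjust them to your own server and database):

<connectionStrings>
  <!-- placeholder values: point this at your own server and database -->
  <add name="ConnectionString"
       connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=mydb;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>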
And now create your SqlConnection as follows:
SqlConnection con
    = new SqlConnection(ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString);
Notice that "ConnectionString" is the name attribute of the connection string in the Web.Config.
See another example here, in case it is not very clear.
Is it possible (in VB.NET 2005), without manually parsing the dataset table properties, to create the table and add it to the database?
We have old versions of our program on some machines, which obviously have our old database, and we are looking for a way to detect if there is a missing table and then generate the table based on the current status of the table in the dataset. We were re-scripting the table every time we released a new version (if new columns were added), but we would like to avoid this step if possible.
See this MSDN Forum Post: Creating a new Table in SQL Server from ADO.net DataTable.
Here the poster seems to be trying to do the same thing as you, and provides code that generates a Create Table statement using the schema contained in a DataTable.
Assuming this works as it should, you could then take that code, and submit it to the database through SqlCommand.ExecuteNonQuery() in order to create your table.
Here is the code:
SqlConnection con = new SqlConnection("Data Source=.;uid=sa;pwd=sa123;database=Example1");
con.Open();

// build a CREATE TABLE statement from the DataTable schema
// (every column is created as nvarchar(50) here)
string sql = "Create Table abcd (";
foreach (DataColumn column in dt.Columns)
{
    sql += "[" + column.ColumnName + "] " + "nvarchar(50)" + ",";
}
sql = sql.TrimEnd(new char[] { ',' }) + ")";

SqlCommand cmd = new SqlCommand(sql, con);
cmd.ExecuteNonQuery();

// insert the DataTable rows into the newly created table
using (var adapter = new SqlDataAdapter("SELECT * FROM abcd", con))
using (var builder = new SqlCommandBuilder(adapter))
{
    adapter.InsertCommand = builder.GetInsertCommand();
    adapter.Update(dt);
}
con.Close();
I hope this gets the problem solved.
Here dt is the name of the DataTable.
Alternatively you can replace:
adapter.Update(dt);
with
//if you have a DataSet
adapter.Update(ds.Tables[0]);
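If you want the generated columns to keep their .NET types instead of defaulting everything to nvarchar(50), a small helper along these lines could be used when building the Create Table statement (a sketch covering only a few common types; extend it as needed):

// Sketch: map a DataColumn's .NET type to a rough SQL Server type.
// Anything not covered falls back to nvarchar(max).
static string ToSqlType(DataColumn column)
{
    Type t = column.DataType;
    if (t == typeof(int)) return "int";
    if (t == typeof(long)) return "bigint";
    if (t == typeof(decimal)) return "decimal(18, 4)";
    if (t == typeof(double)) return "float";
    if (t == typeof(bool)) return "bit";
    if (t == typeof(DateTime)) return "datetime";
    return "nvarchar(max)";
}

// usage inside the column loop shown above:
// sql += "[" + column.ColumnName + "] " + ToSqlType(column) + ",";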