Update or insert from CSV into SQL Server

I have a CSV file containing user info, say
first_name;last_name;user_id;full_name
The column separator is ; and the row terminator is \n.
What I need to do is insert or update into the users table. The unique key is user_id: if a record with this user_id already exists I need to update it; if it doesn't, I need to insert it.
However, there are some problems that prevent me from using management studio data-import or bulk insert.
First, there are more fields in the users table (not just 4), and the order of columns in the CSV file does not correspond to the order of columns in the table. So I need to be able to specify which column from the file goes to which column in the table.
Second, some additional fields need to be filled. For example, users.email = users.user_id. Here is another obstacle: although users.email = users.user_id for a newly inserted row, users.email may change in the future, so I cannot just insert user_id and then run update [users] set [email] = [user_id].
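One way to satisfy both requirements on the SQL Server side is to bulk-load the file into a staging table whose columns match the file's layout, then upsert with MERGE (available since SQL Server 2008). The sketch below is illustrative only: the staging table, file path, and column widths are assumptions, and FIRSTROW = 2 presumes a header line (drop it if there is none).
-- Staging table matching the file layout (widths are guesses)
CREATE TABLE #users_staging
(
    first_name nvarchar(100),
    last_name  nvarchar(100),
    [user_id]  nvarchar(100),
    full_name  nvarchar(200)
);

-- Load the CSV with the file's own delimiters
BULK INSERT #users_staging
FROM 'c:\import\users.csv'
WITH (FIELDTERMINATOR = ';', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Upsert keyed on user_id
MERGE [users] AS t
USING #users_staging AS s
    ON t.[user_id] = s.[user_id]
WHEN MATCHED THEN
    UPDATE SET t.first_name = s.first_name,
               t.last_name  = s.last_name,
               t.full_name  = s.full_name
WHEN NOT MATCHED THEN
    INSERT (first_name, last_name, [user_id], full_name, email)
    VALUES (s.first_name, s.last_name, s.[user_id], s.full_name, s.[user_id]);
Because the INSERT and UPDATE column lists are written out explicitly, the file's column order no longer has to match the table's, any remaining table columns are left to their defaults, and email = user_id is applied only on insert, so emails edited later are untouched.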

Use fgetcsv.
Example with two columns:
<?php
$row = 0;
$update = "";
// column separator
$separator = ";";
// indexes of the columns to read / modify
$idx_nom = 0;
$idx_rubrique = 1;
// open the file for reading
if (($handle = fopen("test.csv", "r")) !== FALSE)
{
    // walk the file line by line, storing each row's data in an array
    while (($data = fgetcsv($handle, 1000, ";")) !== FALSE)
    {
        // only check from the second line on, since the first contains the column headers
        if ($row != 0)
        {
            // conditions on one column
            if (stristr($data[$idx_nom], 'chemise'))
            {
                $data[$idx_rubrique] = 1;
            }
            else if (stristr($data[$idx_nom], 'costume'))
            {
                $data[$idx_rubrique] = 2;
            }
            else if (stristr($data[$idx_nom], 'cravate'))
            {
                $data[$idx_rubrique] = 3;
            }
        }
        $update .= implode($separator, $data) . "\r\n";
        $row++;
    }
    fclose($handle);
}
// write the updated contents back to the csv
$ouvre = fopen("test.csv", "w+");
fwrite($ouvre, $update);
fclose($ouvre);
?>

Related

PHPSpreadsheet to update an existing file writes only the last record of the query

Hello, I am trying to write to an existing xlsx file using PhpSpreadsheet with setActiveSheetIndexByName(sheetname) and setCellValue with a cell reference and value, but it updates only the last record. I have spent more than 12 hours on this.
I tried foreach instead of while and used a counter to increment, but neither worked.
<?php
include_once('db.php');
$prospect = $_REQUEST['prospect'];
require 'vendor/autoload.php';
use PhpOffice\PhpSpreadsheet\IOFactory;
use PhpOffice\PhpSpreadsheet\Spreadsheet;
use PhpOffice\PhpSpreadsheet\Writer\Xlsx;
$sql1 = mysqli_query($db,"select filename,sheetname, row, responsecol,compliancecol,response, compliance from spreadsheet where `prospect`='$prospect' and response <>'' order by row");
//$row=1;
while($row1 = mysqli_fetch_assoc($sql1))
{
$filename= $row1['filename']; //test.xlsx
$sheetname= $row1['sheetname']; // mysheet
$responsecol= $row1['responsecol'].$row1['row']; //D1
$response= $row1['response']; //response
$compliancecol= $row1['compliancecol'].$row1['row']; //C1
$compliance= $row1['compliance']; //compliance
$spreadsheet = \PhpOffice\PhpSpreadsheet\IOFactory::load($filename);
$spreadsheet->setActiveSheetIndexByName($sheetname)
->setCellValue($compliancecol,$compliance)
->setCellValue($responsecol,$response);
//$row++;
}
$writer = IOFactory::createWriter($spreadsheet, 'Xlsx');
$writer->save("newfile.xlsx");
exit;
?>
I want each row from the mysqli result to update its referenced cell with the corresponding value.
The easiest way is to set a LIMIT of 1 on your MySQL query. That takes only one row from your data. If you want the last one, you should sort DESC.
$sql1 = mysqli_query($db,"select filename,sheetname, row, responsecol,compliancecol,response, compliance from spreadsheet where `prospect`='$prospect' and response <>'' order by row DESC LIMIT 1");

Csv file to a Lua table and access the lines as new table or function()

Currently my code have simple tables containing the data needed for each object like this:
infantry = {class = "army", type = "human", power = 2}
cavalry = {class = "panzer", type = "motorized", power = 12}
battleship = {class = "navy", type = "motorized", power = 256}
I use the table names as identifiers in various functions, which are simply called to access and process the values one by one.
Now I want to have this data stored in a spreadsheet (csv file) instead that looks something like this:
Name class type power
Infantry army human 2
Cavalry panzer motorized 12
Battleship navy motorized 256
The spreadsheet will not have more than 50 lines and I want to be able to increase columns in the future.
I tried a couple of approaches from similar situations I found here, but due to lacking skills I failed to access any values from the nested table. I think this is because I don't fully understand what the table structure looks like after reading each line of the csv file into it, and therefore I fail to print any values at all.
If there is a way to get the name, class, type, and power from the table and use that line just like my old simple tables, I would appreciate an educational example. Another approach could be to declare new tables from the csv that behave exactly like my old simple tables, line by line from the csv file. I don't know if this is doable.
Using Lua 5.1
You can read the csv file in as a string; I will use a multi-line string here to represent the csv.
gmatch with the pattern [^\n]+ will return each row of the csv.
gmatch with the pattern [^,]+ will return the value of each column from a given row.
If more rows or columns are added, or if the columns are moved around, we will still reliably convert the information, as long as the first row has the header information.
The only column that cannot move is the first one, the Name column; if that is moved, it will change the key used to store the row in the table.
Using gmatch and these two patterns, [^,]+ and [^\n]+, you can separate the string into each row and column of the csv. Comments in the following code:
local csv = [[
Name,class,type,power
Infantry,army,human,2
Cavalry,panzer,motorized,12
Battleship,navy,motorized,256
]]
local items = {}   -- store our rows here
local headers = {} -- store the column names here
local first = true
for line in csv:gmatch("[^\n]+") do
    if first then -- handle the first line and capture our headers
        local count = 1
        for header in line:gmatch("[^,]+") do
            headers[count] = header
            count = count + 1
        end
        first = false -- switch off the header block
    else
        local name
        local i = 2 -- start at 2 because we don't increment for the Name column
        for field in line:gmatch("[^,]+") do
            name = name or field -- the first field of the row is its name
            if items[name] then -- if the name is already in items, this is a data field
                items[name][headers[i]] = field -- assign the value under the matching header
                i = i + 1
            else -- if the name is not in the table, create a new entry for it
                items[name] = {}
            end
        end
    end
end
Here is how you can load a csv using the I/O library:
-- Example of how to load the csv.
path = "some\\path\\to\\file.csv"
local f = assert(io.open(path))
local csv = f:read("*all")
f:close()
Alternatively, you can use io.lines(path), which would take the place of csv:gmatch("[^\n]+") in the for loop as well.
Here is an example of using the resulting table:
-- print the table out
print("items = {")
for name, item in pairs(items) do
    print("  " .. name .. " = {")
    for field, value in pairs(item) do
        print("    " .. field .. " = " .. value .. ",")
    end
    print("  },")
end
print("}")
The output:
items = {
  Infantry = {
    type = human,
    class = army,
    power = 2,
  },
  Battleship = {
    type = motorized,
    class = navy,
    power = 256,
  },
  Cavalry = {
    type = motorized,
    class = panzer,
    power = 12,
  },
}

Combine 2 rows of CSV file in SSIS

I have one CSV file where the information is spread over two lines:
Line 1 contains name and age.
Line 2 contains details like address, city, salary, occupation.
I want to combine the two rows to insert them into a database.
CSV file :
Raju, 42
12345 west andheri,Mumbai, 100000, service
In SQL Server I could do this using a cursor, but I have to do it in SSIS.
For a similar case, I would read each line as one column and use a Script Component to fix the structure. You can follow my answer to the following question; it contains a step-by-step guide:
SSIS reading LF as terminator when its set as CRLF
I like using a Script Component in order to be able to store data from a different row, as in this case.
Read the file as a single-column CSV into Column1.
Add a Script Component, add a new output called CorrectedOutput, and define all the columns from both rows. Also, mark Column1 as a read-only input.
Create two variables outside of row processing to 'hold' the first row:
string name = string.Empty;
string age = string.Empty;
Use a split to determine line 1 or line 2
string[] str = Row.Column1.Split(',');
Use an if to determine row 1 or row 2:
if (str.Length == 2)
{
    name = str[0];
    age = str[1];
}
else
{
    CorrectedOutputBuffer.AddRow();
    CorrectedOutputBuffer.Name = name; // uses the value stored from the prior row
    CorrectedOutputBuffer.Age = age;   // uses the value stored from the prior row
    CorrectedOutputBuffer.Address = str[0];
    CorrectedOutputBuffer.City = str[1];
    CorrectedOutputBuffer.Salary = str[2];
    CorrectedOutputBuffer.Occupation = str[3];
}
The overall effect is this:
On row 1, you just hold the data in variables.
On row 2, you write out the data to one new row.

Importing a large (2.5 GB) XML file into SQL Server

Hi, I am trying to import a large XML file into a table on my SQL Server (2014).
I have used the code below for smaller files and thought it would be OK since this is a once-off, but I kicked it off yesterday and the query was still running when I came into work today, so this is obviously the wrong route.
Here is the code:
CREATE TABLE files_index_bulk
(
Id INT IDENTITY PRIMARY KEY,
XMLData XML,
LoadedDateTime DATETIME
)
INSERT INTO files_index_bulk(XMLData, LoadedDateTime)
SELECT CONVERT(XML, BulkColumn, 2) AS BulkColumn, GETDATE()
FROM OPENROWSET(BULK 'c:\scripts\icecat\files.index.xml', SINGLE_BLOB) AS x;
SELECT * FROM files_index_bulk
Can anyone point out another way of doing this, please? I've looked around at importing large files, and it keeps coming back to bulk loading, which I am already using.
Thanks in advance.
Here is the table I want to pull all the data into:
USE [ICECATtesting]
GO
/****** Object: Table [dbo].[files_index] Script Date: 28/04/2017 20:10:44 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[files_index](
[Product_ID] [int] NULL,
[path] [varchar](100) NULL,
[Updated] [varchar](50) NULL,
[Quality] [varchar](50) NULL,
[Supplier_id] [int] NULL,
[Prod_ID] [varchar](1) NULL,
[Catid] [int] NULL,
[On_Market] [int] NULL,
[Model_Name] [varchar](250) NULL,
[Product_View] [int] NULL,
[HighPic] [varchar](1) NULL,
[HighPicSize] [int] NULL,
[HighPicWidth] [int] NULL,
[HighPicHeight] [int] NULL,
[Date_Added] [varchar](150) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
And here is a snippet of the XML file:
<ICECAT-interface xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://data.icecat.biz/xsd/files.index.xsd">
<files.index Generated="20170427010009">
<file path="export/level4/EN/11.xml" Product_ID="11" Updated="20170329110432" Quality="SUPPLIER" Supplier_id="2" Prod_ID="PS300E-03YNL-DU" Catid="151" On_Market="0" Model_Name="Satellite 3000-400" Product_View="587591" HighPic="" HighPicSize="0" HighPicWidth="0" HighPicHeight="0" Date_Added="20050627000000">
</file>
<file path="export/level4/EN/12.xml" Product_ID="12" Updated="20170329110432" Quality="ICECAT" Supplier_id="7" Prod_ID="91.42R01.32H" Catid="151" On_Market="0" Model_Name="TravelMate 740LF" Product_View="40042" HighPic="http://images.icecat.biz/img/norm/high/12-31699.jpg" HighPicSize="19384" HighPicWidth="170" HighPicHeight="192" Date_Added="20050627000000">
</file>
<file path="export/level4/EN/13.xml" Product_ID="13" Updated="20170329110432" Quality="SUPPLIER" Supplier_id="2" Prod_ID="PP722E-H390W-NL" Catid="151" On_Market="0" Model_Name="Portégé 7220CT / NW2" Product_View="37021" HighPic="http://images.icecat.biz/img/norm/high/13-31699.jpg" HighPicSize="27152" HighPicWidth="280" HighPicHeight="280" Date_Added="20050627000000">
</file>
The max size of an XML column value in SQL Server is 2GB. It will not be possible to import a 2.5GB file into a single XML column.
UPDATE
Since your underlying objective is to transform XML elements within the file into table rows, you don't need to stage the entire file contents into a single XML column. You can avoid the 2GB limitation, reduce memory requirements, and improve performance by shredding the XML in client code and using a bulk insert technique to insert batches of multiple rows.
The example PowerShell script below uses an XmlTextReader to avoid reading the entire XML into a DOM, and uses SqlBulkCopy to insert batches of many rows at once. The combination of these techniques should allow you to insert millions of rows in minutes rather than hours. The same techniques can be implemented in a custom app or an SSIS script task.
I noticed a couple of the table columns specify varchar(1), yet the XML attribute values contain many characters. You'll need to either expand the length of the columns or transform the source values.
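For example, a possible widening (a sketch only; the new lengths here are guesses based on the sample data, so size them to your real values):
ALTER TABLE dbo.files_index ALTER COLUMN [Prod_ID] varchar(50) NULL;
ALTER TABLE dbo.files_index ALTER COLUMN [HighPic] varchar(255) NULL;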
[String]$global:connectionString = "Data Source=YourServer;Initial Catalog=YourDatabase;Integrated Security=SSPI";
[System.Data.DataTable]$global:dt = New-Object System.Data.DataTable;
[System.Xml.XmlTextReader]$global:xmlReader = New-Object System.Xml.XmlTextReader("C:\FilesToImport\files.xml");
[Int32]$global:batchSize = 10000;
Function Add-FileRow() {
    $newRow = $dt.NewRow();
    $null = $dt.Rows.Add($newRow);
    $newRow["Product_ID"] = $global:xmlReader.GetAttribute("Product_ID");
    $newRow["path"] = $global:xmlReader.GetAttribute("path");
    $newRow["Updated"] = $global:xmlReader.GetAttribute("Updated");
    $newRow["Quality"] = $global:xmlReader.GetAttribute("Quality");
    $newRow["Supplier_id"] = $global:xmlReader.GetAttribute("Supplier_id");
    $newRow["Prod_ID"] = $global:xmlReader.GetAttribute("Prod_ID");
    $newRow["Catid"] = $global:xmlReader.GetAttribute("Catid");
    $newRow["On_Market"] = $global:xmlReader.GetAttribute("On_Market");
    $newRow["Model_Name"] = $global:xmlReader.GetAttribute("Model_Name");
    $newRow["Product_View"] = $global:xmlReader.GetAttribute("Product_View");
    $newRow["HighPic"] = $global:xmlReader.GetAttribute("HighPic");
    $newRow["HighPicSize"] = $global:xmlReader.GetAttribute("HighPicSize");
    $newRow["HighPicWidth"] = $global:xmlReader.GetAttribute("HighPicWidth");
    $newRow["HighPicHeight"] = $global:xmlReader.GetAttribute("HighPicHeight");
    $newRow["Date_Added"] = $global:xmlReader.GetAttribute("Date_Added");
}
try
{
    # init data table schema
    $da = New-Object System.Data.SqlClient.SqlDataAdapter("SELECT * FROM dbo.files_index WHERE 0 = 1;", $global:connectionString);
    $null = $da.Fill($global:dt);
    $bcp = New-Object System.Data.SqlClient.SqlBulkCopy($global:connectionString);
    $bcp.DestinationTableName = "dbo.files_index";
    $recordCount = 0;
    while($xmlReader.Read() -eq $true)
    {
        if(($xmlReader.NodeType -eq [System.Xml.XmlNodeType]::Element) -and ($xmlReader.Name -eq "file"))
        {
            Add-FileRow -xmlReader $xmlReader;
            $recordCount += 1;
            if(($recordCount % $global:batchSize) -eq 0)
            {
                $bcp.WriteToServer($dt);
                $dt.Rows.Clear();
                Write-Host "$recordCount file elements processed so far";
            }
        }
    }
    if($dt.Rows.Count -gt 0)
    {
        $bcp.WriteToServer($dt);
    }
    $bcp.Close();
    $xmlReader.Close();
    Write-Host "$recordCount file elements imported";
}
catch
{
    throw;
}
Try this. It is just another method that I have used for some time, and it's pretty fast (it could be faster). I pull a huge XML db from a gaming company every night; this is how I get it and import it.
$xml = new XMLReader();
$xml->open($xml_file); // $xml_file is the xml file you want to parse
while($xml->read() && $xml->name != 'game') { ; } // skip past the header to the first record ('game' in my case)
while($xml->name == 'game') { // while we are in this record
    $element = new SimpleXMLElement($xml->readOuterXML());
    $gameRec = $this->createGameRecord($element, $os); // my function to reduce some clutter - I use it elsewhere too
    /* This looks confusing, but it is not. There are over 20 fields, and instead of typing them all out, I just build the string. */
    $sql = "INSERT INTO $table (";
    foreach($gameRec as $field=>$game){
        $sql .= " $field,";
    }
    $sql = rtrim($sql, ",");
    $sql .= ") VALUES (";
    foreach($gameRec as $field=>$game) {
        $sql .= " :$field,";
    }
    $sql = rtrim($sql, ",");
    $sql .= ") ON DUPLICATE KEY UPDATE "; // the online game feed doesn't have a gamerank - not my choice - so I adjust for that here
    switch ($os) {
        case 'pc' : $sql .= "gamerank = ".$gameRec['gamerank']; break;
        case 'mac': $sql .= "gamerank = ".$gameRec['gamerank']; break;
        case 'pl' : $sql .= "playercount = ".$gameRec['playercount']; break;
        case 'og' :
            $playercount = $this->getPlayerCount($gameRec['gameid']);
            $sql .= "playercount = ".$playercount['playercount'];
            break;
    }
    try {
        $stmt = $this->connect()->prepare($sql);
        $stmt->execute($gameRec);
    } catch (PDOException $e) { // kludge
        echo 'os: '.$os.'<br/>table: '.$table.'<br/>XML LINK: '.$comprehensive_xml.'<br/>Current Record:<br/><pre>'.print_r($gameRec).'</pre><br/>'.
             'SQL: '.$sql.'<br/>';
        die('Line:33<br/>Function: pullBFG()<BR/>Cannot add game record <br/>'.$e->getMessage());
    }
    // VERY IMPORTANT: do not forget these two lines, or it will go into an endless loop - I know, I've done it
    $xml->next('game');
    unset($element);
} // while there are games
This should get you started. Obviously, adjust 'game' to match your XML records, and trim out the fat I have here.
Here is createGameRecord($element, $type='pc'):
Basically it turns the element into an array to use elsewhere, which makes it easier to add to the db with the single line seen above, $stmt->execute($gameRec), where $gameRec is the array returned from this function. PDO knows $gameRec is an array and will parse it out as you insert it. delHardReturns() is another of my functions; it strips hard returns (\r, \n, etc.), which seem to mess up the SQL. I think SQL has a function for that, but I have not pursued it.
Hope you find this useful.
private function createGameRecord($element, $type='pc') {
    if( ($type == 'pc') || ($type == 'og') ) { // player count is handled separately
        $game = array(
            'gamename' => strval($element->gamename),
            'gameid' => strval($element->gameid),
            'genreid' => strval($element->genreid),
            'allgenreid' => strval($element->allgenreid),
            'shortdesc' => $this->delHardReturns(strval($element->shortdesc)),
            'meddesc' => $this->delHardReturns(strval($element->meddesc)),
            'bullet1' => $this->delHardReturns(strval($element->bullet1)),
            'bullet2' => $this->delHardReturns(strval($element->bullet2)),
            'bullet3' => $this->delHardReturns(strval($element->bullet3)),
            'bullet4' => $this->delHardReturns(strval($element->bullet4)),
            'bullet5' => $this->delHardReturns(strval($element->bullet5)),
            'longdesc' => $this->delHardReturns(strval($element->longdesc)),
            'foldername' => strval($element->foldername),
            'hasdownload' => strval($element->hasdownload),
            'hasdwfeature' => strval($element->hasdwfeature),
            'releasedate' => strval($element->releasedate)
        );
        if($type === 'pc') {
            $game['hasvideo'] = strval($element->hasvideo);
            $game['hasflash'] = strval($element->hasflash);
            $game['price'] = strval($element->price);
            $game['gamerank'] = strval($element->gamerank);
            $game['gamesize'] = strval($element->gamesize);
            $game['macgameid'] = strval($element->macgameid);
            $game['family'] = strval($element->family);
            $game['familyid'] = strval($element->familyid);
            $game['productid'] = strval($element->productid);
            $game['pc_sysreqos'] = strval($element->systemreq->pc->sysreqos);
            $game['pc_sysreqmhz'] = strval($element->systemreq->pc->sysreqmhz);
            $game['pc_sysreqmem'] = strval($element->systemreq->pc->sysreqmem);
            $game['pc_sysreqhd'] = strval($element->systemreq->pc->sysreqhd);
            if(empty($game['gamerank'])) $game['gamerank'] = 99999;
            $game['gamesize'] = $this->readableBytes((int)$game['gamesize']);
        } // dealing with PC type
        if($type === 'og') {
            $game['onlineiframeheight'] = strval($element->onlineiframeheight);
            $game['onlineiframewidth'] = strval($element->onlineiframewidth);
        }
        $game['releasedate'] = substr($game['releasedate'],0,10);
    } else { // $type == 'pl'
        $game['playercount'] = strval($element->playercount);
        $game['gameid'] = strval($element->gameid);
    }
    return $game;
}
Updated: much faster. I did some research, and while my post above shows one (slow) method, I was able to find one that works much faster, at least for me.
I put this as a new answer due to the complete difference from my previous post.
LOAD XML LOCAL INFILE 'path/to/file.xml' INTO TABLE tablename ROWS IDENTIFIED BY '<xml-identifier>'
Example
<students>
<student>
<name>john doe</name>
<boringfields>bla bla bla......</boringfields>
</student>
</students>
Then the MySQL command would be:
LOAD XML LOCAL INFILE 'path/to/students.xml' INTO TABLE tablename ROWS IDENTIFIED BY '<student>'
The ROWS IDENTIFIED BY value must be in single quotes and include the angle brackets.
When I switched to this method, I went from roughly 12 minutes to roughly 30 seconds!
One tip that worked for me: run
DELETE FROM tablename
first, otherwise the load will just append to your table.
Ref: https://dev.mysql.com/doc/refman/5.5/en/load-xml.html
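Putting the two tips together, a full refresh-and-reload would look like this (a sketch; tablename and the path are the placeholders from the examples above):
DELETE FROM tablename;
LOAD XML LOCAL INFILE 'path/to/students.xml' INTO TABLE tablename ROWS IDENTIFIED BY '<student>';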

Add value into database plus 1

In my database I have:
Row ID - Driver ID - Log ID.
Row ID is unique and auto-increments. What I want is for the Log ID to be unique for each row that has that Driver ID.
For example, say a row is inserted with Driver ID 1: I want that row to have a Log ID of 1, but the next time a row is inserted with Driver ID 1 I want it to have a Log ID of 2.
How can I achieve this?
By the way, for the database I am using phpMyAdmin.
Edit:
This is what I have in my PHP now, but it says:
On the webpage: Incorrect integer value: '' for column 'FinesCost' at row 1
I dump the variables and get this: string(2) "16" string(2) "16" string(2) "16", so I don't understand why it says "incorrect integer value" and why it says they are undefined, because they are very clearly defined.
In the PHP error log: [19-Jul-2013 10:44:18 Europe/Minsk] PHP Notice: Undefined variable: FinesCostP‌ost2 in C:\inetpub\wwwroot\hosting\Dan\JWT\drivers-log-send.php on line 336
[19-Jul-2013 10:44:18 Europe/Minsk] PHP Notice: Undefined variable: TravelExpensesPo‌​st2 in C:\inetpub\wwwroot\hosting\Dan\JWT\drivers-log-send.php on line 336
///PHP TO INSERT DRIVER'S BANK DETAILS INTO BANK DATABASE
session_start();
$host=""; // Host name
$username=""; // Mysql username
$password=""; // Mysql password
$db_name=""; // Database name
$tbl_name="jwtdriversbank"; // Table name
$un = "";
$usrname = "";
$usrpass = "";
$userID = "";
mysql_connect("$host", "$username", "$password")or die("cannot connect");
mysql_select_db("$db_name")or die("cannot select DB");
if(isset ($_SESSION['usrName']))
{
$usrname = $_SESSION['usrName'];
}
else
{
echo "4";
}
//var_dump ($usrname);
if(isset ($_SESSION['usrPass']))
{
$usrpass = $_SESSION['usrPass'];
}
else
{
echo "5";
}
$sql="SELECT * FROM jwtdrivers WHERE username='$usrname' and password='$usrpass'";
$result=mysql_query($sql);
$rows=mysql_fetch_array($result);
$userID = $rows['id'];
//var_dump ($userID);
if($userID == "")
{
echo "3";
}
else
{
$TotalProfitPost = $TotalProfit;
$LateFeePost = $LateFee;
$FinesCostPost2 = $FinesCost;
$TravelExpensesPost2 = $TravelExpenses;
$FuelCostPost = $FuelCost;
$CargoDamagePost = $CargoDamage;
$TruckDamagePost = $TruckDamage;
var_dump ($TotalProfitPost);
var_dump($FinesCostPost2);
var_dump($TravelExpensesPost2);
$sql="INSERT INTO jwtdriversbank2 (DriverID, LogID, TotalProfit, LateFee, FinesCost, TravelExpenses, FuelCost, CargoDamage, TruckDamage) VALUES ('$userID', COALESCE((Select MAX(LogID) from jwtdriversbank2 tab2 where tab2.DriverID = '$userID'),0)+1,'$TotalProfitPost','$LateFeePost', '$FinesCostP‌ost2' , '$TravelExpensesPo‌​st2' ,'$FuelCostPost','$CargoDamagePost','$TruckDamagePost')";
$result = mysql_query($sql);
if($result)
{
}
else
{
die(mysql_error());
}
}
Add a primary key over the two columns; it should do the trick.
Look at this link for help:
ALTER TABLE table_name
ADD CONSTRAINT pk_DriverID PRIMARY KEY (DriverID,LogID)
Do not forget to drop the first primary key, because you will not need it anymore.
EDIT: completed with the other answer below.
Here is the code to insert your data:
INSERT INTO <table_name>
VALUES (p_RowID, p_DriverID, COALESCE((SELECT MAX(Log_id) FROM <table_name> tab2 WHERE tab2.Driver_id = p_DriverID), 0) + 1);
That should close the question.
The variables appear undefined because PHP can't read them: I opened your program inside the VIM editor and found a "<200c>" (zero-width non-joiner) character inside $FinesCostPost2 in the SQL query. You have to remove it to make it work.
A quick solution would be to use a subquery to find the maximum log id (the last one) and then increment it, something like this:
INSERT INTO <table_name>
VALUES (p_RowID, p_DriverID, COALESCE((SELECT MAX(Log_id) FROM <table_name> tab2 WHERE tab2.Driver_id = p_DriverID), 0) + 1);
Here p_RowID and p_DriverID are the values you pass in to insert into your table. The COALESCE function checks the given value and, if it is NULL, replaces it with the second parameter, in this case 0.
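Applied to the table from the question, the pattern looks like this (a sketch that inserts only DriverID and LogID for driver 1, letting the auto-increment Row ID fill itself; extend the column lists to match your table):
INSERT INTO jwtdriversbank2 (DriverID, LogID)
SELECT 1, COALESCE(MAX(LogID), 0) + 1
FROM jwtdriversbank2
WHERE DriverID = 1;
If driver 1 already has logs 1 and 2, MAX(LogID) returns 2 and the new row gets LogID 3; for a driver with no rows yet, COALESCE turns the NULL into 0 and the first LogID becomes 1.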
