I am using Run SQL Command Line to insert my data, with the script below.
INSERT INTO USERMASTER (USERID,USERPWD,USERNAME,USERPOSITION,USERACCESSRIGHTS,USERSTATUS,CREATEUSERID) VALUES ('admin','nVzfJ0sOjj/EFU700exL6A==','Admin','Administrator','Non-Administrator','1', 'admin');
but when I open the database in Toad and log in as the same user, the data has not been inserted into the table. May I know what went wrong?
The image below shows the output in SQL Command Line.
What about COMMIT?
Is autocommit on?
Or add a COMMIT after your INSERT statement.
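A minimal sketch of both options in SQL*Plus / SQL Command Line, using the table and values from the question:

```sql
-- Option 1: commit explicitly after the insert
INSERT INTO USERMASTER (USERID, USERPWD, USERNAME, USERPOSITION,
                        USERACCESSRIGHTS, USERSTATUS, CREATEUSERID)
VALUES ('admin', 'nVzfJ0sOjj/EFU700exL6A==', 'Admin', 'Administrator',
        'Non-Administrator', '1', 'admin');
COMMIT;

-- Option 2: turn autocommit on for this SQL*Plus session,
-- so every statement is committed as soon as it runs
SET AUTOCOMMIT ON
```

Until one of these happens, the row is visible only inside your own session, which is why Toad (a separate session) does not see it.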
I'm trying to run a trigger that inserts the data inserted into my local table into the linked server's table. This is what I did:
USE [medb]
GO
ALTER TRIGGER [dbo].[trigger1] ON [dbo].[tbl1]
AFTER INSERT
AS
BEGIN
INSERT into openquery(DEV, 'tbl_remotetbl') select * from inserted
END
but it is giving this error:
Cannot process the object "tbl_remotetbl". The OLE DB provider
"MSDASQL" for linked server "DEV" indicates that either the object has
no columns or the current user does not have permissions on that
object.
What seems to be my problem?
Note: I am using SQL Server 2008 R2
Did you try running the command outside the trigger? Did it work?
Here is the syntax I'm using in my OPENQUERY:
INSERT INTO OPENQUERY(LinkedServer_Name,
'select remote_field_1,remote_field_2 from remote_Schema.remote_table')
select local_column1,local_column2
FROM local_Table
Now, with that said, making this statement work inside a trigger is something I couldn't do. The statement above works perfectly when executed by itself, but once it is placed in a trigger, the entire transaction related to that trigger fails. Even the INSERT statement that fires the trigger does not go through, and the main table never receives the data that was meant to be inserted into it.
I ran into the same issue and spent many hours trying to figure out how to make an OPENQUERY statement work inside UPDATE/INSERT/DELETE triggers, with no success.
So here's an alternate solution that may fix your issue. This is from a scenario where I needed to pass data from SQL Server to a MySQL database.
First, create a holding table to temporarily store only the data that needs to be passed to MySQL:
create table holding_table (ID int, value2 int)
The trigger will insert data into the holding table instead of sending it directly to MySQL:
ALTER TRIGGER [dbo].[temp_data_to_mysql]
ON [dbo].[source_table]
FOR insert
AS
BEGIN
INSERT INTO holding_table (ID, value2) SELECT a, b FROM inserted
END
GO
After that, create a job in SQL Server Agent that executes a stored procedure every N minutes to move the rows from the holding table over to MySQL.
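A sketch of what that stored procedure could look like, assuming a linked server named MYSQL_LINK and a remote table remote_table (both names are hypothetical):

```sql
-- Hypothetical sketch: drain the holding table to MySQL via the linked
-- server, then remove the rows that were sent.
CREATE PROCEDURE dbo.push_holding_to_mysql
AS
BEGIN
    SET NOCOUNT ON;

    -- OPENQUERY works here because we are outside any trigger context
    INSERT INTO OPENQUERY(MYSQL_LINK, 'SELECT ID, value2 FROM remote_table')
    SELECT ID, value2 FROM holding_table;

    -- clear out what has been pushed
    DELETE FROM holding_table;
END
```

If rows can arrive while the procedure runs, you would want to capture the sent keys first and delete only those, but the outline is the same.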
Hope it helps. I'm aware that this is a workaround, but after some investigation and testing, I was unable to make OPENQUERY work when called from within a trigger.
I am new to SQL Server but I have used Oracle a lot. I see that some functionality works differently in SQL Server. Let's say I want to watch a table for insertions, and copy each row inserted into that table into another table.
here is my code
Create TRIGGER Post_Trigger ON Posts For Insert
AS
INSERT INTO EmailQueueRaw (UserID,CreatedBy,EmailTypeId,EmailTablePrimaryKey) VALUES('','Arif','1','1');
GO
In Oracle, I used the :NEW and :OLD pseudo-records, which are great. But we don't have them in SQL Server and I am not sure what to do here. Please help me copy the same data to another table.
You would use INSERTED (and, if needed, DELETED), but you need to be aware that they are pseudo-tables and can contain 0, 1, or multiple rows:
Create TRIGGER Post_Trigger ON Posts For Insert
AS
INSERT INTO EmailQueueRaw (UserID,CreatedBy,EmailTypeId,EmailTablePrimaryKey)
SELECT '',ColumnA,'1',ColumnB FROM inserted;
GO
We currently have a C# console app that maintains a sync table populated by triggers, which allows the app to send an INSERT, UPDATE, or DELETE statement to another copy of the database at another site. This keeps the tables in sync; however, the system regularly crashes and we do not have the source code.
I am currently trying to recreate this in sql server.
The sync table is populated by a trigger on each table and holds rows like:

TableName  Action  RowNumber  columns_updated
test       Insert  10         0x3
test2      Delete  2
test       Update  15         0x7
From this I can generate an INSERT, UPDATE, or DELETE statement to run on the remote server, but with thousands of rows it would be far too slow:

INSERT INTO server2.test (column1, column2, column3, column4)
SELECT column1, column2, column3, column4
FROM server1.test WHERE row = RowNumber
What I would like to do is generate the INSERT statement on server1, then simply run it on server2:

INSERT INTO table1 (column1, column2, column3, column4) VALUES (110000, 'New Order', 99.00, 'John Smith')
So does anybody have a way to write the INSERT statement for a table row out as a string, ready for processing on server2? The SELECT does not have to happen in the trigger.
In other words: read any row in any table and convert it into an INSERT statement?
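For a table whose columns are known, one way to sketch this is plain string concatenation with dynamic SQL. The table and column names below are hypothetical stand-ins for whatever the trigger logged:

```sql
-- Hypothetical sketch: build an INSERT statement string for one row of
-- dbo.test (int column1, varchar column2), to be stored and replayed later.
DECLARE @sql nvarchar(max);

SELECT @sql = 'INSERT INTO dbo.test (column1, column2) VALUES ('
            + CAST(column1 AS nvarchar(20)) + ', '
            + QUOTENAME(column2, '''')   -- wraps in quotes, doubles embedded quotes
            + ');'
FROM dbo.test
WHERE column1 = 10;   -- the RowNumber captured by the trigger

-- Write @sql to a queue table; on server2 it can be replayed with:
-- EXEC sp_executesql @sql;
```

Handling arbitrary tables generically would mean walking sys.columns and converting each column type to a literal, which is essentially what utilities like sp_generate_inserts (mentioned below) do.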
I am using
exec sp_generate_inserts 'TABLENAME'
to copy all the records of one table from our test server to live server.
Do we have to create the table first before using this command?
When I run the command I get an error: Invalid object name.
I then tried creating the table first and running the command again. But the table has an identity column, and now I get the error: Cannot insert identity value.
I know that identity insert is set to OFF. Do I have to turn it on, and how? Also, is there an alternative way of doing this?
Thanks
Yes, of course you have to create the table first before inserting data into it.
And if you want to insert specific values into an IDENTITY column, you can use
SET IDENTITY_INSERT (your table name) ON
-- do your INSERTs here
SET IDENTITY_INSERT (your table name) OFF
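For example, assuming a table dbo.YourTable with an identity column Id (the names here are illustrative); note that an explicit column list is required while IDENTITY_INSERT is ON:

```sql
SET IDENTITY_INSERT dbo.YourTable ON;

-- the column list is mandatory when inserting explicit identity values
INSERT INTO dbo.YourTable (Id, Name)
VALUES (42, 'example');

SET IDENTITY_INSERT dbo.YourTable OFF;
```

Only one table per session can have IDENTITY_INSERT ON at a time, so turn it off as soon as the load is done.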
I'm trying to set up a scheduled job with one step that inserts the results of a sproc into a table:
INSERT INTO results_table EXEC sproc
The job executes and reports success, yet nothing gets inserted into the table. When I execute the same script from SSMS, the results are inserted. What might cause the problem?
** EDIT: the job is owned by sa and the step is executed as dbo. All the runs in the history are reported as finished successfully. I've tried changing the step to
INSERT INTO results_table(field_names) (SELECT values FROM table GROUP BY column_name)
and it behaves in a similar way
** EDIT: the problem only occurs when I select from the master database. Selecting from other tables works fine.
Check whether you are inserting into the master database or the database you actually want to insert into. Or call the SP with a fully qualified database name inside the job step:
Insert Into Results_Table
EXEC <DBNAME>.<SchemaName>.<ProcedureName>
Have you tried inserting the results of the stored procedure into a temp table first, then inserting them into your results_table? I would suggest that, as well as this article, which reviews the concept in depth: http://www.sommarskog.se/share_data.html
The problem was that, in the scheduled job, the stored procedure was not executed in the context of the master database.