Database is not getting updated - sql-server

The application flow I configured in my SSIS package (.dtsx) is not updating or inserting any records in the database.
When I deploy the same code to DEV and PROD it works fine, but when I execute the same code in the ITG environment it does not update the database. The connection managers point to the correct database, and the .dtsConfig configuration files hold the correct credentials.
I am writing logs for each failure, and they show no trace of any DB connection or query failure. I also executed a few queries against the database to test the permissions of the configured DB user.
I am unable to identify the root cause of why it is not updating/inserting in the ITG database. Is there anything I am missing that I should validate?
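For reference, this is the kind of permission check I ran against ITG; a minimal T-SQL sketch, where the login name 'itg_etl_user' and the table 'dbo.TargetTable' are placeholders rather than the real names:
-- Impersonate the login that the ITG connection manager uses (placeholder name)
EXECUTE AS LOGIN = 'itg_etl_user';
-- List the effective permissions on the destination table (placeholder name)
SELECT * FROM sys.fn_my_permissions('dbo.TargetTable', 'OBJECT');
REVERT;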

Related

Error on job that retrieves data from an external FTP server

I started in a large company as an IT consultant. One of my tasks is to manage an application that has a SQL database.
I have very limited knowledge of SSISDB and SQL Server Management Studio, but I am willing to learn.
The SQL database is updated with external data. This can be done by users directly in the application, but it also happens through a scheduled job. The job is run from SQL Server Agent and has only two steps, one of which is to execute a dtsx package.
The dtsx package is set up to retrieve data from an external FTP server and merge the data into the database. The job was built by my predecessor and has run flawlessly for a very long time.
Now we are in a situation where the FTP server supplying the data has been changed.
I have therefore gone into the Connection Managers and pointed the connection at the new FTP server.
When running the job, however, we still get the following error message:
Failed to configure a connection property that has the following path: \Package.Connections[FTPConnection].Properties[ChunkSize]. An error occurred while setting the value of property “ChunkSize”. The error returned is 0x80020009 “The ValidateDates has been migrated. The package must be saved to retain migration changes.”
I have checked the Connection Managers, and ChunkSize is unchanged from when the job was working correctly. ChunkSize is set to 1000, both in the Connection Manager and in the dtsx package itself.
When I searched for the problem, it was suggested that it may have something to do with the connection to the FTP server. So I checked the connection to the FTP server from the server where the SQL database is located, and there is a connection. In addition, I have made sure that there is a firewall rule that allows traffic between the two servers, across protocols and ports 20-22.
When the job itself is run, however, no traffic leaves the server. So I believe the problem is with the job itself.
Edit: after validating the package, I got the following.
Failed to configure a connection property that has the following path: \Package.Connections[FTPConnection].Properties[ChunkSize]. An error occurred while setting the value of property “ChunkSize”. The error returned is 0x80020009 “The ValidateDates has been migrated. The package must be saved to retain migration changes.”.
at Microsoft.SqlServer.IntegrationServices.Server.ISServerExec.ConnectionParametersManager.ConfigureProperties(String parameterName, Object parameterValue)
at Microsoft.SqlServer.IntegrationServices.Server.ISServerExec.ConnectionParametersManager.ConfigureProperties()
at Microsoft.SqlServer.IntegrationServices.Server.ISServerExec.ProjectOperator.ValidatePackageWithReference(Int64 validationId, Int64 infold, Int64 projectId, String packageName, Int64 versionId, Nullable`1 referenceId, Project isserverProject, Boolean OfflineMode)
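In case it's useful, the full validation output can also be pulled from the SSISDB catalog afterwards; a rough T-SQL sketch, assuming the package is deployed to the SSISDB catalog (which the ISServerExec stack trace suggests):
-- Find the most recent operation (the validation run) and read all of its messages
DECLARE @operation_id bigint =
    (SELECT TOP (1) operation_id
     FROM SSISDB.catalog.operations
     ORDER BY operation_id DESC);
SELECT message_time, message_type, message
FROM SSISDB.catalog.operation_messages
WHERE operation_id = @operation_id
ORDER BY message_time;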
I hope my description is comprehensive enough - otherwise please do write follow-up questions.
PS: English is not my first language; sorry if something didn't turn out too well.

SQL Maintenance Cleanup task not deleting any files, SQL installed on a DC

The generic problem is as described in "SQL Maintenance Cleanup Task Working but Not Deleting", but none of the solutions there are applicable. Environment: Windows Server 2012 R2, AD DS (with policies, of course), RDSH/TS Licensing, 1C server. The primary problem is SQL Server generating an insane number of events per backup plan run, recording a pair of 18456+17052 errors per file to delete. The errors are as follows:
17052: [Microsoft][SQL Server Native Client 11.0][SQL Server]Login failed for user 'DOMAIN\mssql_srv'
18456: Reason: Could not find a login matching the name provided. [CLIENT: 192.168.x.x] (matches localhost)
Given that each pair of errors appears once per file to delete (there are about 6000 files already!), the algorithm looks like this:
First, the backup plan task runs xp_delete_file (roughly the call sketched below), which enumerates all the files in the target folder;
Second, each file is deleted by creating a separate connection to the machine with the service's credentials;
Third, each connection fails due to whatever restrictions the default DC policy applies, generating the pair of events; the file, of course, remains in place.
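For context, the call the Maintenance Cleanup Task issues under the hood is roughly the following; the folder, extension and cutoff date here are placeholders:
-- 0 = backup files, folder to scan, extension, delete files older than this date, 1 = include first-level subfolders
EXECUTE master.dbo.xp_delete_file 0, N'D:\Backups\', N'bak', N'2020-01-01T00:00:00', 1;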
The workaround is, of course, to assign the file deletion to a local script run as SYSTEM, for example, but the real reason why SQL Server fails to delete a file remained unknown. Permissions were checked, and it was verified that both the SQL Server Agent and SQL Server service accounts have full control over the folder.
It turned out that the "missing login" is not a Windows login but a SQL Server login, which did not exist for the service account. So I needed to create a "DOMAIN\mssql_srv" login in SSMS and give it "public" access rights, and voilà, files started to get deleted properly. The reason is explained in this comment:
If it's a T-SQL step and the job owner is a member of the sysadmin server role, the step is executed under the service account.
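In T-SQL terms the fix boils down to something like the following; newly created logins are members of the public role by default:
-- Create a server-level login for the Windows service account from the post
CREATE LOGIN [DOMAIN\mssql_srv] FROM WINDOWS;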

SSRS error: Invalid data source credential setting

I have created a report in SSRS and got it working in design view. When I publish it to the report server, I get the following error.
The current action cannot be completed. The user data source credentials do not meet the requirements to run this report or shared dataset. Either the user data source credentials are not stored in the report server database, or the user data source is configured not to require credentials but the unattended execution account is not specified. (rsInvalidDataSourceCredentialSetting)
I am using a shared data source with a SQL Server database connection as the authentication type. Where else could I have gone wrong? Also, please let me know which settings I might have missed updating.

What state is my SQL Server database in when msdeploy fails on user creation?

I am using msdeploy (version 2) to transfer a database from machine A to machine B.
In the database on machine A there are some users that do not exist on machine B, so the transfer (partially) fails with the message:
Error Code: ERROR_SQL_EXECUTION_FAILURE
More Information: An error occurred during execution of the database script.
The error occurred between the following lines of the script: "3" and "5".
The verbose log might have more information about the error.
The command started with the following: "CREATE USER [someDomain\someUser] FOR LOGIN [someDomain"
Windows NT user or group 'someDomain\someUser' not found.
Check the name again. http://go.microsoft.com/fwlink/?LinkId=178587
The database seems to have been transferred, except for the user creation. Does anyone know what state the database is in after this failure?
Is there any way I can transfer the database without the users (or, better, without specific users) using msdeploy?
Web Deploy uses SMO (SQL Management Objects) to script out and apply the scripts for SQL databases, and exposes most of the SMO settings with the dbfullsql provider (so, most of these options: http://msdn.microsoft.com/en-us/library/microsoft.sqlserver.management.smo.transfer_properties.aspx). If you want to skip the users due to this kind of login-not-exists or user-not-found error, you should be able to do this by adding the scripting option: copyAllUsers=false to the source of the sync. For example:
msdeploy.exe -verb:sync -source:dbfullsql="Data Source=.\SQLExpress;Initial Catalog=MySourceDb;User Id=localUser;Password=LocalPass",copyAllUsers=false -dest:dbfullsql="Data Source=RemoteSQLServer;Initial Catalog=MyDestDb;User Id=remoteUser;Password=RemotePass"
Incidentally, I am surprised you note the db appears to have been sync'd - I would expect this is not actually the case. If you have the permissions for it, Web Deploy will create the database if it did not already exist when it initially tries to make the connection, but your failure occurred very early in the script execution, and I believe Web Deploy dbfullsql syncs are transacted by default (the db creation is separate from the script execution and is not transacted). Thus the db may exist where it did not pre-sync, but I wouldn't expect the data to be present in it.

SQL Agent Job - Connection may not be configured correctly or you may not have the right permissions on this connection?

I'm getting this error when running an SSIS package through SQL Agent
Failed to acquire connection "ORACLE ADO.NET". Connection may not be configured correctly or you may not have the right permissions on this connection.
When I log on as the SQL Agent user and run the SSIS package directly, it is fine. When I then execute it through the SQL Agent job, it fails.
I've read around extensively on this topic, and it seems a lot of the advice concerns how you are logged in, configuring proxy accounts, etc., none of which has been helpful.
I am logging onto an Oracle database with an ADO.NET connection. The connection string is as follows (data source, user id and password have been changed):
Data Source=DATASOURCE;User ID=userid;Password=password;Persist Security Info=True;Unicode=True;
I'm loading this from a registry setting using a package configuration. To check that I am getting the correct string, I am writing it into a temporary log table. I am definitely getting the string I need from the correct registry setting.
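For what it's worth, the temporary log table and the Execute SQL Task statement are nothing fancy; a rough sketch, with table and column names I made up for this post:
-- Table that receives the resolved connection string (illustrative names)
CREATE TABLE dbo.PackageConfigLog
(
    LoggedAt          datetime2      NOT NULL DEFAULT SYSUTCDATETIME(),
    ConnectionString  nvarchar(4000) NOT NULL
);
-- Statement in an Execute SQL Task; parameter 0 is mapped to the package variable
-- holding the connection string (OLE DB-style ? placeholder)
INSERT INTO dbo.PackageConfigLog (ConnectionString) VALUES (?);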
I've tested the Oracle login credentials through PL/SQL Developer, and it lets me log in just fine.
As far as I can tell, since I'm using an explicit user name and password for the Oracle connection, it shouldn't matter who the SSIS package is run as. The only point of failure I can see would be reading the information from the registry, but that seems fine.
I'm really quite baffled, I must confess, and would appreciate any help some of the splendid experts here can offer.
Many thanks,
James
Ok, tracked this one down after quite a lot of pain.
It was working fine in one environment but not in another, so I fired up Process Monitor (http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx) and ran the package through the SQL Agent job, comparing which system entities were hit in each environment.
In the failing environment, at the point of the bulk transfer operation, the package attempted to load the Oracle 11 client DLL and then hung.
I knew that this was installed and, moreover, that the DLL path was a system environment setting. Further investigation revealed that the server had not been rebooted since the Oracle client install, and the SQL Server Agent process had not been recycled.
Yes, can you believe it, the old helpdesk fix "Can you reboot your computer?" worked.
Sigh!
We had issues at a client before with packages stored on our SQL Server instance that connect to Oracle. The workaround we found was to change the package's protection level property to "DontSaveSensitive" and, for security purposes, to encrypt the username and password in the package configuration and decrypt them with a UDF in SQL Server. Of course, before you try the whole encryption part, I would recommend putting the username and password in the package configuration without encrypting the values, to see whether changing the protection level setting solves your specific problem. I hope this helps.
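To illustrate the encryption part, a rough sketch of the kind of UDF we mean; the function name and the hard-coded passphrase are for illustration only, and in practice the passphrase should live somewhere better protected:
-- Illustrative only: the value stored in the package configuration would be produced with, e.g.:
-- SELECT ENCRYPTBYPASSPHRASE('my passphrase', N'oracle_password');
CREATE FUNCTION dbo.fn_DecryptConfigValue (@EncryptedValue varbinary(8000))
RETURNS nvarchar(256)
AS
BEGIN
    -- Reverse the ENCRYPTBYPASSPHRASE call above and cast the result back to nvarchar
    RETURN CONVERT(nvarchar(256), DECRYPTBYPASSPHRASE('my passphrase', @EncryptedValue));
END;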
I was getting this error when the tnsnames.ora file did not have a valid entry for the environment.
