I have a custom module that I have been developing (in DNN 7.1) and then testing with the EVS (Extension Verification System). I only have one error left and I am not sure how to troubleshoot it.
Here is the error:
ExtensionMessageID: 664647
ExtensionID: 60892
MessageTypeID: 1
MessageID: b25d95e3-06d0-4241-9729-96f85cfddcbf
Message: While testing against 07.01.00 01.00.00.sqldataprovider
returned an error: Database 'TestSchema' does not exist.
Rule: PackageVerification.Rules.SQLTestRunner
TestSchema is not part of the SqlDataProvider I created. Also, the SqlDataProvider I created executes fine on my local SQL Server.
Does anyone know where this error is coming from? It appears that the EVS cannot create the test database it needs to execute the SQL scripts. I wish there were better documentation for the errors/warnings the EVS system generates.
Thanks in Advance
In the SQL install scripts DNN requires the use of two tokens, {databaseOwner} and {objectQualifier}. When EVS tests for the correct usage of these tokens, {databaseOwner} is replaced with 'TestSchema' and {objectQualifier} is replaced with 'TestQualifier'. Your install scripts should never reference a database name, as there is no token that can be substituted for the database name. In EVS the database names are auto-generated by base64-encoding a GUID, and they typically look like this: Ll0YaJ7lDkST9pwjmVubuQ.
Do you have a 'USE' statement or possibly a three-part object reference (databasename.databaseowner.objectqualifier_objectname)? In that example, if you removed the first part and then put in the tokens, it would look like this: {databaseOwner}{objectQualifier}objectname.
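As a minimal sketch, a token-friendly install script statement might look like this (the table and column names are hypothetical; EVS substitutes 'TestSchema' and 'TestQualifier' when it runs the script):
-- Hypothetical table; only the {databaseOwner}/{objectQualifier} usage matters here.
CREATE TABLE {databaseOwner}{objectQualifier}MyModuleItems
(
    ItemID   INT IDENTITY(1,1) NOT NULL,
    ItemName NVARCHAR(100) NOT NULL,
    CONSTRAINT PK_{objectQualifier}MyModuleItems PRIMARY KEY (ItemID)
);
Note that there is no USE statement and no database name anywhere in the script.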
The typical error I found when using the EVS test with regard to Azure script compatibility is including the ON [PRIMARY] filegroup directive in your CREATE statements. Make sure to remove these directives, as SQL Azure doesn't accept filegroup references.
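As a hedged illustration (the table is hypothetical), a script generated by SSMS often ends like the first statement below; the Azure-safe form simply drops the filegroup clause:
-- SSMS-generated form; SQL Azure rejects the filegroup reference:
CREATE TABLE {databaseOwner}{objectQualifier}MyModuleItems
(
    ItemID INT NOT NULL
) ON [PRIMARY];
-- Azure-safe form: the same table, with no filegroup directive.
CREATE TABLE {databaseOwner}{objectQualifier}MyModuleItems
(
    ItemID INT NOT NULL
);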
I am trying to copy data from Dynamics to Dynamics; we have a SQL DB for the source Dynamics environment. I have connected to that DB via SSMS, and I am doing all the transformations and creating views in SSMS; after that I link them to the pipeline and push data to the destination environment. The pipelines were working fine until 23 May 2022, when I suddenly started getting this error in one of my pipelines.
ErrorCode=DynamicsOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Dynamics operation failed with error code: -2147220956, error message: Sequence contains no elements.,Source=Microsoft.DataTransfer.ClientLibrary.DynamicsPlugin,''Type=System.ServiceModel.FaultException`1[[Microsoft.Xrm.Sdk.OrganizationServiceFault, Microsoft.Xrm.Sdk, Version=9.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]],Message=The creator of this fault did not specify a Reason.,Source=Microsoft.DataTransfer.ClientLibrary.DynamicsPlugin,'
Not sure what I am missing. I am assuming that while importing data some columns have null values, but I had null values before and it was working fine. Any thoughts on how to resolve this issue would be great.
(The SQL statement for creating the view was attached here.)
If you ever come across this kind of issue in Dynamics, check the workflows and plugins that are related to that entity. In my scenario there was a faulty plugin on the contact entity; I switched that plugin off and the pipeline started working.
Switch off the plugins first and turn them back on one by one, and you can figure out which plugin is faulty.
Wish you all a happy new year!
I periodically import data from CSV files downloaded from a web application. I have created the linked server LS_Text for the local path with the CSV files. I import the file content with INSERT INTO ... SELECT queries in a stored procedure.
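For context, a hedged sketch of one of those import queries (the target table, columns, and file name are placeholders; the ACE text provider exposes a file such as orders.csv as [orders#csv]):
-- Hypothetical example: the four-part name points at a CSV file in the linked folder.
INSERT INTO dbo.ImportedOrders (OrderID, OrderDate, Amount)
SELECT OrderID, OrderDate, Amount
FROM LS_Text...[orders#csv];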
This works fine as long as I log in to SSMS as SA. But if I log in with my Windows authentication, I get the error
Cannot initialize the data source object of OLE DB provider
"Microsoft.ACE.OLEDB.12.0" for linked server "LS_Text".
I plan to import the files from an application that calls the stored procedure, so Windows authentication has to work.
What does the error message mean? The same error results if the file path does not exist, so it looks as if SQL Server or the OLE DB provider can't see the folder with the CSV files. But I saved the files myself with my own credentials.
I have created the linked server with the following batch:
EXEC sp_addlinkedserver @server = N'LS_Text', @srvproduct = N'CSVFLATFILE',
    @provider = N'Microsoft.ACE.OLEDB.12.0', @datasrc = N'C:\Trans', @provstr = N'Text;HDR=Yes';
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'LS_Text', @useself = 'false',
    @locallogin = NULL, @rmtuser = NULL, @rmtpassword = NULL;
EXEC sp_MSset_oledb_prop N'Microsoft.ACE.OLEDB.12.0', N'AllowInProcess', 1;
As far as I understand, @useself = 'false', @rmtuser = NULL, @rmtpassword = NULL means that the linked server can be accessed without a login and password. I have tried all sorts of other combinations, yet without success.
Articles found on Google for this error message deal with OPENROWSET rather than linked servers, or with ACE driver configuration. But that is not the issue, since it works as SA.
So how can I query the CSV files with Windows authentication? Any hint is appreciated.
Perhaps this is not a full answer, but it should hopefully help you debug this kind of issue better yourself. Since, as you mention, it works as sa, the likely problem is related to the user/login mapping. There's an example on the sp_addlinkedsrvlogin doc page describing how to map a specific Windows login. That might be worth trying for your credentials to see if it works.
Second, there are ways to delve into what is happening in the server's code path to load and use the provider. A reasonable blog post can be found here; it is about talking to Oracle, but the important content is about how to set up trace events for linked servers and see what is happening once you start trying to execute your query. (Linked server vs. OPENROWSET to a linked server should not matter, but please note that the term OPENROWSET was overloaded in SQL to allow different code paths, including some that don't go through OLE DB at all or not through this specific OLE DB provider, as David mentions in the comments to your question.) Tracing the actions before the error may point out a spot where things fail differently in your Windows login case vs. the sa/admin path.
Finally, as the Jet (now ACE) provider is fundamentally a DLL that gets loaded into the SQL Server process and then does file system operations to try to load a file, it may be valuable to use procmon to monitor the process and see if some operation is failing (such as reading a registry key or opening a file inside the provider). It doesn't seem to be the most likely problem given that sa works for you, but it may be a useful tool.
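A hedged sketch of that first suggestion, replacing the catch-all NULL mapping with an explicit one for your Windows login (the domain and user name are placeholders):
-- Remove the existing catch-all mapping, then map one Windows login
-- so it connects under its own credentials.
EXEC sp_droplinkedsrvlogin @rmtsrvname = N'LS_Text', @locallogin = NULL;
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'LS_Text', @useself = 'true',
    @locallogin = N'DOMAIN\YourUser';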
You also asked about the error message; I'll try to explain. (I wrote the original Jet OLE DB provider, which was later renamed to ACE after I changed teams.) In OLE DB, there are COM interfaces that conceptually "live" on four main internal classes. You can see this documented in the OLE DB programmer's guide. The Data Source object is the top-most object, and it means somewhat different things to different data sources. The second-level concept is a "session". In Jet/ACE, these two concepts are not really different, as you just have a connection to a file, but in SQL Server and other server-based providers the data source object is a connection to a server and the session is an individual connection to that server. The error you are getting says that the initial connection/authentication to the provider is failing. It means one of N things, but I'd start with examining "the mapping from SQL's login to the login for Jet/ACE is not working properly".
Net-net: if you can load CSVs through the normal paths (OPENROWSET(BULK ...) with the CSV format), your life is probably going to be better in the long run, as David suggests.
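As a sketch of that alternative (assuming SQL Server 2017 or later; the table and file names are placeholders), the built-in CSV parser avoids the ACE provider entirely:
-- BULK INSERT is the simpler sibling of OPENROWSET(BULK ...); no linked server involved.
BULK INSERT dbo.ImportedOrders
FROM 'C:\Trans\orders.csv'
WITH (FORMAT = 'CSV', FIRSTROW = 2);  -- FIRSTROW = 2 skips the header row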
Best of luck debugging your problem whatever path you pick.
I'm trying to use SSIS to load some data from Oracle database to MSSQL database.
I created the project and used the ADO.Net source and was able to create a connection to Oracle and run queries and view results.
However, when I actually run the package, I get the following error:
Error: 0xC0208449 at Data Flow Task, ADO NET Source 2: ADO NET Source has failed to acquire the connection {EECB236A-59EA-475E-AE82-52871D15952D} with the following error message: "Could not create a managed connection manager.".
It seems similar to the issue here.
And I did find that I have two Oracle client versions installed, "11.1" and "12.2".
One is used by PL/SQL and the other by another Entity Framework project.
If this is the issue, I just want a way to tell SSIS to pick up the correct one.
I tried adding an entry in machine.config for the "oracle.manageddataaccess.client" section with the desired version.
I also tried using other types of data sources, but couldn't even create a successful connection.
I tried changing the Run64bitRuntime property in the project to False.
Note: I don't have SSIS installed on my machine.
Eventually, I just had to remove the entries related to 11.1 from the PATH variable and then restart my machine.
I also switched to "dotConnectForOracle" for the connection, and now it seems to be working fine.
I'm expecting issues related to other applications that might still be using the 11.1 version, but that will be a problem for another day.
Always make sure to write the user (the Oracle schema) in uppercase, and note that some special characters in the password (in my case it was $) need an escape character, even if you're using the wizard rather than the command line.
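A hedged sketch of what that can look like in an Oracle connection string (all values are placeholders; enclosing the password in double quotes is one common way to escape special characters):
Data Source=ORCL;User Id=HR;Password="p$ssw0rd";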
I still don't understand the whole issue but I hope this helps someone some day.
I've created a package that gets some information from a SQL database and inserts it into Dynamics CRM.
When testing the package from Visual Studio, everything goes as expected: the task finishes without any errors and the rows get inserted. However, when I publish the package to SSISDB on SQL Server, the package fails with this error:
KingswaySoft.IntegrationToolkit.DynamicsCrm.CrmServiceException: CRM service call returned an error: A password is required in order to establish the connection ...
I tried changing the package protection level to EncryptSensitiveWithUserKey, but it still gives the same message as above. I created the package again from scratch and it still doesn't work. This package was working before; maybe there's something I did in the configuration last time that made it work, but I cannot replicate it anymore.
Also, when I tried Integrated Authentication, it says this:
KingswaySoft.IntegrationToolkit.DynamicsCrm.CrmServiceException: CRM service call returned an error: The caller was not authenticated by the service.
@Drinv, this is a typical SSIS runtime deployment issue. You need to make sure that you have provided a password in your job configuration for the connection manager. What you provided to the package doesn't count as far as sensitive fields are concerned (password being one) when you are using the EncryptSensitiveWithUserKey option, since the user key is not transferable between different systems or different users. An easy workaround is to change your SSIS package/project's ProtectionLevel setting to encrypt using a password instead, although that may not be the best practice. If you still have trouble getting this going, please reach out to us directly; our team can walk you through the issue.
I found out what I was doing wrong.
My SSIS project was on the Project Deployment Model and I was trying to deploy only the package. After making my connections available at the project level and deploying the whole project, everything worked as expected.
I have created an SSIS package and I am trying to run it locally. We use package configurations that point to SQL tables and an XML config file. The package ran successfully for about a week, even when deployed to a SQL Server Agent job in our STAGE environment.
Now, the only way I can get the package to run is by not using the package configurations and choosing EncryptSensitiveWithPassword. If I change the package to DontSaveSensitive, I continuously get the error below:
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80040E4D Description: "Login failed for user 'Test_User'.".
Error: 0xC020801C at AgentCompany, Lookup [37]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Test" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
It is so strange that about a week ago this package ran fine with the configurations and the DontSaveSensitive option.
I have updated the config file to ensure that it is establishing the connection string to the appropriate database. I also test the connectivity on the connection managers and they all test successfully.
I also double checked the SQL Database where the user is trying to connect to ensure that it has permissions there and it does.
I am very confused. Please help!
Things I have already tried:
- Updating the dtsconfig file
- Re-creating the connection managers
- Setting DelayValidation to true on some DFT tasks
- Changing the runtime to 32-bit
- Using EncryptSensitiveWithPassword with the package configs removed (this works, but it is not the standard at my company and it is not how I developed and tested the package before)
When you open/run a package, an OnInformation event is fired that says something like
The package is attempting to configure from the XML file "c:\ssisdata\so_56776576.dtsconfig".
When Visual Studio/SSDT opens/runs a package that says it uses configurations but, for whatever reason, cannot get them, you should then see messages like
Warning loading so_56776576.dtsx: Failure importing configuration file: "c:\ssisdata\so_56776576.dtsconfig"
and
Warning loading so_56776576.dtsx: The configuration file "c:\ssisdata\so_56776576.dtsconfig" cannot be found. Check the directory and file name.
and
Warning loading so_56776576.dtsx: Failed to load at least one of the configuration entries for the package. Check configuration entries for "Configuration 1" and previous warnings to see descriptions of which configuration failed.
If someone has manually edited the config file and broken the XML, you'd see a warning like
Cannot load the XML configuration file. The XML configuration file may be malformed or not valid
The important thing to note with regard to configuration: if a configuration cannot be found, SSIS will continue along with the design-time values. That is why it is crucial to check the warnings emitted when your package runs. If you are running it manually from the command line, ensure that you have /Rep EW specified so you report Errors and Warnings.
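For example, reusing the package path from the messages above (a sketch; adjust the path and options for your own deployment):
dtexec /File "c:\ssisdata\so_56776576.dtsx" /Rep EW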
Guesses as to root cause
The package has the protection level EncryptSensitiveWithUserKey, which means the AD credentials of the package creator are used to encrypt anything that might contain sensitive information. I could be using AD authentication in my connection string and specify that the connection should be trusted, but that entire block is still going to get encrypted against my Active Directory account. When you come along and attempt to maintain the package, it's not going to be able to decrypt the sensitive data, as you are not me.
The two ways around that are to use a shared key (EncryptSensitiveWithPassword/EncryptPackageWithPassword), which is cumbersome to deal with, plus it goes against the whole spirit of secrecy since everyone knows the secret. The other approach, as you've identified, is DontSaveSensitive, and that's my go-to for all of this.
The problem to be overcome with DontSaveSensitive is that every time you save, SSIS is going to wipe out any knowledge of user name and password from the places that might be holding on to it, like a connection manager. The 2005/2008 strategy to hedge against this was to use configuration, or explicit overrides at run time, to supply user name and password.
My typical approach was to use configuration based on a table instead of XML, as I was better at securing sensitive data in a table than I was at mucking with ACLs on the file system. The other challenge we had with multiple developers and file-based configuration was that either everyone had to set their file systems up the same way (and we developers are unique rainbow snowflakes, so that's unlikely) or we needed to use a network-shared file, which is great until someone adds their own values to it, breaks it, removes your changes, or any of a host of other challenges.
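For reference, table-based configuration stores its entries in a table shaped like the one the Package Configuration wizard generates ([SSIS Configurations] is the wizard's default name; yours may differ):
-- Default shape of the SSIS configuration table; the wizard creates it on first use.
CREATE TABLE [dbo].[SSIS Configurations]
(
    ConfigurationFilter NVARCHAR(255) NOT NULL, -- groups related entries
    ConfiguredValue     NVARCHAR(255) NULL,     -- the value applied at load time
    PackagePath         NVARCHAR(255) NOT NULL, -- e.g. \Package.Connections[Test].Properties[ConnectionString]
    ConfiguredValueType NVARCHAR(20)  NOT NULL  -- e.g. String
);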