When trying to load a text file using the Azure SQL DW Upload Task in SSIS, I get the following error:
Error: 0x0 at Azure SQL DW Upload Task, Azure SQL DW Upload Task:
Failed to upload to blob storage. Unable to create Azure Blob
container. Endpoint: https://[removed].blob.core.windows.net/,
Container Name: [myContainer]. The remote server returned an error: (403)
Forbidden. The remote server returned an error: (403) Forbidden
The SSIS task is failing. I also tried the Azure Blob Upload Task, and that fails as well. Any help is appreciated.
Cause
When a client accesses a storage account using a TLS version that does not meet the minimum TLS version configured for the account (here, the minimum TLS version is configured as TLS 1.2), Azure Storage returns error code 400 (Bad Request) and a message indicating that the TLS version that was used is not permitted for making requests against this storage account.
Resolution
The TLS version used by the Azure Feature Pack follows the system .NET Framework settings. To use TLS 1.2, add a REG_DWORD value named SchUseStrongCrypto with data 1 under the following two registry keys (which one applies depends on the .NET Framework version you are using in Visual Studio; you can find the version from the Help menu in Visual Studio):
HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\.NETFramework\v4.0.30319
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319
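As a sketch, both values can be added at once with a .reg file like the following (run it with administrative rights; the WOW6432Node key is only relevant on 64-bit Windows):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319]
"SchUseStrongCrypto"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\Microsoft\.NETFramework\v4.0.30319]
"SchUseStrongCrypto"=dword:00000001
```

Restart Visual Studio (or the SSIS runtime) after applying this so the new .NET Framework setting is picked up.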
More information
To test that the minimum required TLS version for a storage account forbids calls made with an older version, you can configure a client to use an older version of TLS. For more information about configuring a client to use a specific version of TLS, see Configure Transport Layer Security (TLS) for a client application.
When you enforce a minimum TLS version for your storage account (here, the configuration sets the minimum TLS version to TLS 1.2), you risk rejecting requests from clients that send data with an older version of TLS.
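For illustration, here is a minimal Python sketch of the client-side setup for such a test. It builds a TLS context capped at TLS 1.1; wrapping a socket to your storage endpoint with this context should then fail the handshake if the account's minimum is TLS 1.2 (the endpoint host is up to you and is not shown here):

```python
import ssl

def make_context(max_version):
    """Build a client-side TLS context capped at a given protocol version."""
    ctx = ssl.create_default_context()
    # Allow old protocol versions so that the cap below is what decides
    # the handshake outcome, then cap the maximum at the requested version.
    ctx.minimum_version = ssl.TLSVersion.TLSv1
    ctx.maximum_version = max_version
    return ctx

# Cap the client at TLS 1.1; a storage account whose minimum TLS
# version is 1.2 should then refuse the handshake with an SSL error.
ctx = make_context(ssl.TLSVersion.TLSv1_1)
```

Using `ctx.wrap_socket(...)` against the account's blob endpoint on port 443 would then raise an SSL error during the handshake, confirming the minimum is enforced.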
Reference documentation
https://learn.microsoft.com/en-us/sql/integration-services/azure-feature-pack-for-integration-services-ssis?view=sql-server-ver15#use-tls-12
https://learn.microsoft.com/en-us/azure/storage/common/transport-layer-security-configure-minimum-version?tabs=portal#test-the-minimum-tls-version-from-a-client
For anyone who doesn't want to beat their head against the wall, here was the issue/solution:
I had "PackageProtectionLevel" set to "DoNotSaveSensitive", so I used package parameters to configure Password/SecurityToken and then set those as the task values with an expression. SSIS should have thrown a better error, but at least it's solved.
Kindly elaborate. I tried changing "PackageProtectionLevel", but no luck.
Related
I created a SQL Server database in Azure, which is serverless, and tried to access it using SQL Server Management Studio on my local machine, but I couldn't get it to work.
It always gives me this message:
I also tried whitelisting my IP in Azure, but I still get the same result.
Is there a possible way to make it connect?
Is the database currently online or paused?
I'll repeat the text from David Browne's link:
If a serverless database is paused, then the first login will resume the database and return an error stating that the database is unavailable with error code 40613. Once the database is resumed, the login must be retried to establish connectivity. Database clients with connection retry logic should not need to be modified.
So:
Assuming the database is paused, this is normal operation.
Please read the docs.
You need to retry after the database starts, OR manually pre-start it using the PowerShell provided in the link below:
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-serverless#connectivity
And yes, you also need to whitelist your IP address as you have already done.
Obviously this flavour of SQL is unsuitable for some types of applications - there is more information in the link - I suggest you read the whole thing.
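The retry behaviour described above can be sketched in Python. This is illustrative only: `DatabaseUnavailable` and `fake_connect` are stand-ins for your driver's exception and real connect call (with pyodbc, for example, you would catch its operational error and check for SQL error code 40613):

```python
import time

class DatabaseUnavailable(Exception):
    """Stand-in for a driver error carrying SQL error code 40613."""

def connect_with_retry(connect, attempts=5, delay=2.0):
    """Retry a connect() callable until the serverless database resumes.

    The first login to a paused serverless database wakes it up but fails
    with error 40613; later retries succeed once the database is online.
    """
    last_error = None
    for _ in range(attempts):
        try:
            return connect()
        except DatabaseUnavailable as exc:
            last_error = exc
            time.sleep(delay)  # give the database time to resume
    raise last_error

# Simulated driver: the first two logins hit 40613, the third succeeds.
calls = {"n": 0}
def fake_connect():
    calls["n"] += 1
    if calls["n"] < 3:
        raise DatabaseUnavailable("Error 40613: database unavailable")
    return "connection"
```

Calling `connect_with_retry(fake_connect)` then succeeds on the third attempt, which mirrors what a client with connection retry logic sees against a paused serverless database.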
Platform: Google Data Studio
Data Source: MySQL
Connection was working before,
meaning no issues with credentials.
All of a sudden, getting the below error:
All IPs from the Google Data Studio list of IPs have been whitelisted.
The only thing that comes to mind is a limit on how much data GDS can process.
The data source table has around 200K+ rows.
Not sure what the limitation is for GDS with MySQL;
there's no indication anywhere.
If anyone out there can help solve this, or maybe provide some info, it would be appreciated.
Thanks
If you use a firewall, be sure to double-check the Google IP addresses. They may have added new IPs (in my case, the last one was missing).
Check them here!
After doing so, I had to change the host name of the connection to the database to a URL alias (www.yourserver.com, a URL pointing at your server), and then change it back to the IP to make it work.
Sounds like the connector cannot establish a new connection.
Cloud SQL Connector:
At the time of writing this, the connector seems unable to establish a new connection once the existing one has timed out and modifying the JDBC url to include query parameters gives you an error when authenticating.
This is probably due to the connector appending its own parameters.
(Seems to be a possible bug here when a connection no longer exists)
MySQL Connector (with IP Address):
This connector allows you to add query parameters to the JDBC url. Enable SSL and append useSSL=true to the url.
e.g. jdbc:mysql://<ip>/<database>?useSSL=true
This worked as expected and establishes new connections when required.
Example Source Setup
I'm suffering from this issue too; my experience is that using the MySQL connector instead of the Cloud SQL Connector provides better stability, in combination with setting wait_timeout to a value above 12 hours.
This issue has been reported on the official Google Data Studio bug tracker. Please vote them up if you are also affected!
🐛 130205306 MySQL connection does not exist Apr 9, 2019 04:36PM
🐛 118470083 Data source password not stored for MySQL sources. Oct 26, 2018 01:24PM
I get the above error while trying to connect to Oracle 12c. I tried using the ojdbc6 and ojdbc7 jar files. I found the comment below:
------------------->
Bug 14575666
In 12.1, the default value for the SQLNET.ALLOWED_LOGON_VERSION parameter has been updated to 11. This means that database clients using pre-11g JDBC thin drivers cannot authenticate to 12.1 database servers unless the SQLNET.ALLOWED_LOGON_VERSION parameter is set to the old default of 8.
This will cause a 10.2.0.5 Oracle RAC database creation using DBCA to fail with the ORA-28040: No matching authentication protocol error in 12.1 Oracle ASM and Oracle Grid Infrastructure environments.
Workaround: Set SQLNET.ALLOWED_LOGON_VERSION=8 in the oracle/network/admin/sqlnet.ora file.
<-------------------
I have one doubt about implementing the above workaround, as we have a shared database.
If I set SQLNET.ALLOWED_LOGON_VERSION=8 in the oracle/network/admin/sqlnet.ora file will it affect other users ?
Will it affect shared applications and its functionality ?
Setting SQLNET.ALLOWED_LOGON_VERSION=8 in sqlnet.ora affects all connections to the server. You're allowing user authentication with older versions of the password verifier and it affects all users. You can't allow it for just one user. But this isn't going to break other applications that can already connect successfully. It will allow older applications (that use old drivers) to connect too. The best solution is to upgrade all clients if possible but this setting is the workaround and it was made available for this exact purpose.
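As a minimal sketch, the workaround is a single line in the server-side sqlnet.ora (the path is the network/admin directory of the database's Oracle home):

```
# Permits authentication with 8.x-and-later password verifiers.
# This is a server-wide setting: it applies to every client that connects.
SQLNET.ALLOWED_LOGON_VERSION=8
```

Note that in later 12c releases this parameter was split into SQLNET.ALLOWED_LOGON_VERSION_SERVER and SQLNET.ALLOWED_LOGON_VERSION_CLIENT; check which one your patch level expects before editing a shared server's configuration.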
I am unable to set up a WebLogic 11g data source to our SQL Server database. Can you please help diagnose the error
weblogic.common.ResourceException: Could not create pool connection.
The DBMS driver exception was: [FMWGEN][SQLServer JDBC
Driver][SQLServer]Login failed for user 'carynt\posapp'
The value carynt\posapp is the value I specify for the user. I have attempted various different "AuthenticationMethod" settings as prescribed in the docs.
My unit tests (from within the Eclipse IDE) run successfully. However those use integratedSecurity settings. Is it possible to somehow use similar settings for the WebLogic datasource?
Answering my own question...
We noticed an error in the WebLogic console logs about an unrelated class not being found. Our WebLogic administrator correctly diagnosed it as owing to the tweaked java.library.path (as suggested for using integratedSecurity).
We could not determine which of the myriad WebLogic's startup scripts was initializing the java.library.path value. Obviously my setting of this variable was clobbering the original value. By undoing my changes we noticed that the default java.library.path was the same as the Windows PATH system variable value. Therefore we
Customized the java.library.path by tweaking the Windows PATH value in the startWebLogic.cmd file
Removed userid and password settings in the WebLogic data source definition
Added the integratedSecurity=true setting in the data source definition.
Restarted the server
Paydirt!
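For illustration, the resulting data source settings look roughly like the following. This uses the Microsoft JDBC driver syntax (the bundled DataDirect driver uses an AuthenticationMethod property instead); host, port, and database name are placeholders:

```
# Hypothetical WebLogic generic data source settings for Windows auth
URL:    jdbc:sqlserver://dbhost:1433;databaseName=pos;integratedSecurity=true
Driver: com.microsoft.sqlserver.jdbc.SQLServerDriver
User/Password: (none -- Windows integrated authentication is used;
                the driver's auth DLL must be on java.library.path)
```

The key point matching the steps above: no user or password in the data source, integratedSecurity=true in the URL, and the native authentication DLL reachable via java.library.path.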
We created an ADO.NET Data Services service on top of our EDMX file as the main entry point for our central application. In the near future a lot of distinct applications will show up and consume our REST service.
So far, so good, but there is one thing I'm missing. I don't want to update all my consumers each time a new version of ADO.NET Data Services is published. How can I achieve such backward compatibility?
Thank you,
Stéphane.
The Data Services client and server do not do version negotiation at connection time; they do it for every request. Each request or response includes a version header that indicates what version of client or server is required to service that request. This means that a down-level client can communicate with an up-level server as long as the server can respond to those requests without doing anything that requires it to raise the version number of the response. Features that require the service to use higher-version responses are all off by default.
What this means is that as new versions of Data Services are published, the client and server will continue to be able to communicate with each other regardless of which version is installed on the client, as long as no new features have been enabled on the server that require a higher-version client to respond.
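Concretely, the per-request negotiation is visible in the HTTP headers. A hedged illustration (service path and host are placeholders): the client states the version it speaks and the highest version it can accept, and the server replies at or below that ceiling:

```
GET /MyService.svc/Customers HTTP/1.1
Host: example.com
DataServiceVersion: 1.0
MaxDataServiceVersion: 2.0

HTTP/1.1 200 OK
DataServiceVersion: 1.0
```

If the server needed a response format above the client's MaxDataServiceVersion, the request would fail; since higher-version features are off by default, old clients keep working against a newer server.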