We have several SSIS projects on Azure DevOps. Every time I pull the project after another user has committed, it corrupts my .dtproj file. I tried putting the file in the .gitignore, but then the project does not show me the new files created by other users. I think the problem is that the credentials are stored encrypted inside the file, and when another user uploads his credentials they overwrite mine.
Any idea how to avoid this error?
Warning loading ETLXXX.dtproj: Warning: Failed to decrypt sensitive data in a project with a user key. You may not be the user who encrypted this project, or you are not using the same machine that was used to save the project. If the sensitive data is a parameter value, the value may be required to run the package on the Integration Services server.
ETLXXX.dtproj 0
This is caused by the package protection level configuration. You should not encrypt the package with the user key in production. This should only be used in the development phase or in some rare cases after being deployed.
I recommend reading more about the SSIS package access control and how it should be changed during the package lifecycle:
Access Control for Sensitive Data in Packages
Securing Your SSIS Packages Using Package Protection Level
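Since the user-key protection level is what makes the .dtproj churn in source control, one practical safeguard is to scan the repository for it before committing. Below is a minimal sketch, assuming the protection level shows up in the .dtsx/.dtproj XML as a ProtectionLevel attribute or element (the exact markup varies between SSIS versions, so treat the regex as an approximation):

```python
import re
from pathlib import Path

# Level 1 (EncryptSensitiveWithUserKey) is the setting that breaks
# shared repositories; it may appear numerically or by name.
USER_KEY_LEVELS = {"1", "EncryptSensitiveWithUserKey"}

# Matches ProtectionLevel="1" (attribute form) and
# <... Name="ProtectionLevel">1</...> (element form).
_PATTERN = re.compile(r'ProtectionLevel"?\s*(?:=\s*"|>)\s*([^"<\s]+)')

def files_using_user_key(project_dir):
    """Return project files that appear to use user-key encryption."""
    flagged = []
    for path in Path(project_dir).rglob("*"):
        if path.suffix.lower() not in (".dtsx", ".dtproj"):
            continue
        text = path.read_text(encoding="utf-8", errors="ignore")
        for match in _PATTERN.finditer(text):
            if match.group(1) in USER_KEY_LEVELS:
                flagged.append(path.name)
                break
    return flagged
```

Run as a pre-commit hook or a manual check, it flags any package still encrypted with a user key so the team can switch it to DontSaveSensitive (or a password-based level) before it reaches the shared repository.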
I have a local GeoServer running on Tomcat which uses a PostGIS store to get layers from the PostgreSQL database. There is a production server that runs the same versions of my local GeoServer and PostgreSQL database. In order to apply changes to layers and layer groups from my local GeoServer, I copied my data directory over the production GeoServer's data directory. After restarting Tomcat on the production server, GeoServer is unable to load the Layers and Layer Preview pages in the web interface. When I try to change the host address of the store, or create a new one, I get this error:
Error creating data store with the provided parameters: Failed to upgrade lock from read to write state, please re-try the configuration operation
You don't say which OS you are using or how you made the copy, but the most likely cause is a permissions or ownership problem.
Make sure that the user which runs GeoServer has permission to read, write and execute on the data dir. On Linux machines I've seen issues with uid and gid differences between machines, depending on how the copy is carried out. On Windows I've seen issues just because Windows and the virus scanner feel like it.
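A quick way to spot-check this is to walk the data directory as the same account that runs Tomcat/GeoServer and list anything that account cannot both read and write. A minimal sketch (the data-dir path is a placeholder):

```python
import os

def unwritable_entries(data_dir):
    """List paths under a GeoServer data directory that the current
    process cannot both read and write.

    Run this as the same user that runs Tomcat/GeoServer, e.g.:
        unwritable_entries("/var/lib/geoserver_data")
    """
    problems = []
    for root, dirs, files in os.walk(data_dir):
        for name in dirs + files:
            path = os.path.join(root, name)
            # os.access checks against the real uid/gid of this process
            if not (os.access(path, os.R_OK) and os.access(path, os.W_OK)):
                problems.append(path)
    return problems
```

Anything it prints is a candidate for a chown/chmod fix; an empty result means permissions are not the culprit and you should look at locking instead.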
I hit the same issue when using the community module JDBCConfig.
It seems the error occurs because file locking is performed on the catalog.
Since the data directory is not used for the catalog when JDBCConfig is enabled, the problem was fixed by setting File Locking to disabled:
https://docs.geoserver.org/stable/en/user/configuration/globalsettings.html#file-locking
I have implemented a custom SSIS log provider where the core SSIS log properties are conveyed to a web api endpoint to log to an external system by overriding the Log method. Where I am having difficulty is in understanding how to access the DTS Runtime Parameters to do different things based on the build mode.
In particular I have a Project Parameter in the SSIS project for "FilePath" (ex. "C:\debug.txt") that gets transformed based on the build to "C:\release.txt" when in release mode.
I am unable to access the DTS runtime from the custom log provider (which is installed in the GAC) the way script tasks can, so I am looking for suggestions.
You probably won't like this answer, but it is not possible to access the package variables from a custom log provider the way script tasks can.
Reference: social.msdn.microsoft.com
I am trying to configure CI using Visual Studio Online. I've configured Git and made the build. It builds automatically with every push I make. I want to automatically publish my changes to my database (stored on my server). I've created a publish config for that, put the connection string there, and put the password in that config. It works great; however, is there any way to store the password other than in plain text?
This is a problem I have tried to fix myself (how do I run an open-source project and still have builds that use credentials?). The reality is that you can't store credentials in a public place, to be used by a public server, without making them public.
You need to decide whether you trust VSO or not: if you do, you can give it your credentials; if you don't, you can't.
Normal things that you would do such as running the CI process under a service account or giving the account a certificate won't work for VSO because each build happens (seemingly from my testing) on a clean machine each time so you can't pre-configure security settings.
The best that you can really do is to only allow access to your database from known locations, i.e. VSO and whatever else accesses the database, rather than from everywhere.
I developed an SSIS package which runs fine in VS. To deploy the package, I need to send it to the DBA to deploy on the server, but I am getting login errors. I've narrowed it down to what I believe is causing the issue: the "ProtectionLevel: EncryptSensitiveWithUserKey" setting.
When the DBA opens the Visual Studio project, he doesn't get the passwords because of this setting, and running the .dtsx files then fails on the server.
How do I properly send the project so he can deploy it without re-typing the passwords?
http://sqlblog.com/blogs/eric_johnson/archive/2010/01/12/understanding-the-ssis-package-protection-level.aspx
It sounds like ServerStorage, or DontSaveSensitive combined with a package configuration, is the way to go for your scenario.
Another option is to store the connection strings that require the passwords as string variables and hard-code the values. This is a pain when you need to change the password, however.
You need to set the SSIS ProtectionLevel (property of the package) to EncryptSensitiveWithPassword. This will force you to add a new password to the package. Your dba will be prompted for this package password when the package is opened. Without the password, access to the sensitive data (i.e. connection passwords within package) will not be possible.
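If you'd rather not re-save every package by hand in Visual Studio, Microsoft's dtutil tool can also re-encrypt a package file from the command line via its /Encrypt option. The sketch below only assembles the dtutil argument list (level 2 corresponds to EncryptSensitiveWithPassword per the dtutil documentation); the file names and password are placeholders, and actually running it requires a machine with SSIS installed:

```python
import subprocess

# dtutil /Encrypt protection-level codes (per the dtutil documentation):
# 0 = DontSaveSensitive, 1 = EncryptSensitiveWithUserKey,
# 2 = EncryptSensitiveWithPassword, 3 = EncryptAllWithPassword
ENCRYPT_SENSITIVE_WITH_PASSWORD = 2

def dtutil_encrypt_args(src_dtsx, dest_dtsx, password,
                        level=ENCRYPT_SENSITIVE_WITH_PASSWORD):
    """Build the dtutil command line to re-encrypt a package file."""
    return [
        "dtutil",
        "/File", src_dtsx,
        "/Encrypt", f"File;{dest_dtsx};{level};{password}",
    ]

# On a machine with SSIS installed you would then run, for example:
# subprocess.run(dtutil_encrypt_args("ETLXXX.dtsx", "ETLXXX_shared.dtsx",
#                                    "MyPackagePassword"), check=True)
```

The DBA would then open the re-encrypted copy and supply only the one package password, instead of re-entering every connection password.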
I have a simple WPF application that uses ClickOnce to handle installing. Within this application is a compact database. Through testing I have found that when I publish a new build this database gets overwritten, which is not what I want. Is there any way I can have fine-grained control over which files are updated? I assume ClickOnce simply checks the hash of the database file, decides that it has changed, and pulls the update.
As a workaround I have since removed the database from the files that are included with the published application so the original remains on the client machine after an update, untouched.
Not a great solution, I know.
Thanks,
ClickOnce deployments segregate the Application Files into "Include" or "Data file". You can specify which each file is in Visual Studio by going to the project Properties page, Publish tab, then clicking the "Application Files..." button. You can then set your .sdf file to "Data File" under the Publish Status column.
Data Files that are downloaded with a ClickOnce application are then placed in a separate directory for each new version.
The idea is that on the first run of the new application version, you go retrieve all the user's private data from their old-version data files and incorporate that data into the new data files which have just been downloaded with your new version.
I think you'll find the information you need at Accessing Local and Remote Data in ClickOnce Applications. Specifically, look at the sections "ClickOnce Data Directory" and "Data Directory and Application Versions."
To access a SQL Server CE database located in your Data directory, use a connection string similar to the following:
<add
name="MyApplication.Properties.Settings.LocalCacheConnectionString"
connectionString="Data Source=|DataDirectory|\LocalCache.sdf"
providerName="Microsoft.SqlServerCe.Client.3.5" />
The "|DataDirectory|" is a special syntax supported by SQL CE and SQL Express and resolves at runtime to the proper directory.
If you so much as open that SQL CE database included in your project, it will change the time stamp on the database, and ClickOnce will deploy it and put the old version under the \pre subfolder.
You might want to consider this method for handling this. Then if you accidentally deploy a new version of the database and don't realize it, you're not hosed. If you intentionally make changes, you can change the database structure of your current database with SQL queries, and pull data from the new copy deployed to the Data Directory (that you're otherwise ignoring) when you need to.
RobinDotNet