I already have an offline updater available as a download, but I want to make the update process automatic. It's a client/server application with an MS SQL database.
Three parts have to be updated:
An update of the proprietary software (a setup.exe: Next, Next, type the login/password for the server service, Finish) which delivers a WinClient and a service for the server. Auto-update is only needed for the server; the auto-update of the clients by the server already works.
One .hug file which contains the new customized program version. Its target path is a directory inside the above software's installation and can be determined from the registry.
An MS SQL database update with new content (in a .bak file) and structural changes. The database server is sometimes a different server from the application server. At the moment you have to type in the SQL instance name when running the updater.
My ideas are these:
An updater program in the tray on the application server and on the database server.
On a timer, the updater checks my FTP server for newer versions and downloads the files if any exist.
On the application server it runs the setup.exe and copies the .hug file. Maybe I could make the setup silent with command-line switches (see the sketch after this list), but I would still need to fill in the login/password for the service user.
On the database server I configure the auto-updater once with the instance name. The updater downloads the .bak file and runs a SQL script with INSERT, UPDATE and ALTER statements.
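To make this concrete, here is a minimal PowerShell sketch of one check/download/install cycle on the application server. It is only an illustration under assumptions: the FTP URL, version file, registry key and, above all, the silent-install switches for the proprietary setup.exe are placeholders (the vendor's installer may not offer silent switches at all).

# Sketch of one updater cycle (assumed names throughout; run on a timer or as a service).
$ftpBase = 'ftp://updates.example.com/myapp'   # placeholder FTP location
$workDir = 'C:\AutoUpdate'
$client  = New-Object System.Net.WebClient     # WebClient also handles ftp:// URIs

# 1. Compare the version published on the FTP server with the locally recorded one.
$remoteVersion = [version]($client.DownloadString("$ftpBase/version.txt").Trim())
$localVersion  = [version](Get-Content "$workDir\installed-version.txt")

if ($remoteVersion -gt $localVersion) {
    # 2. Download the new setup and the customized .hug file.
    $client.DownloadFile("$ftpBase/setup.exe",  "$workDir\setup.exe")
    $client.DownloadFile("$ftpBase/custom.hug", "$workDir\custom.hug")

    # 3. Run the setup silently. These switches are purely hypothetical; the real silent
    #    parameters (if any), including how to pass the service login/password, have to
    #    come from the vendor.
    Start-Process "$workDir\setup.exe" -ArgumentList '/S', '/USER=svcUser', '/PWD=secret' -Wait

    # 4. Find the .hug target directory from the registry (key and value name are placeholders).
    $hugDir = (Get-ItemProperty 'HKLM:\SOFTWARE\Vendor\MyApp').InstallPath
    Copy-Item "$workDir\custom.hug" -Destination $hugDir -Force

    Set-Content "$workDir\installed-version.txt" $remoteVersion
}

The database-server variant would skip steps 3 and 4 and instead restore the downloaded .bak file and run the SQL script against the configured instance.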
One big remaining problem is when users are still logged in during the update. I don't know yet how I could solve that.
So, how would you realise such a project? Have you already done something similar? Is it generally possible?
I am attempting to use a Packaged Solution for my Access 2010 application that has its backend linked to SQL Server. At the moment I'm using the .accdb file as the frontend, and I would like to distribute my application to some other Windows computers, but the Packaged Solution does not work. I had the package include the Access Runtime, so their version of the frontend runs on the Runtime and not full Access. However, once the application makes a request to the backend, nothing happens; I am not even prompted for the SQL password as I usually am with the full version. I've read that including a .dsn file in the package can secure the SQL connection (see here), but going through the steps of other tutorials to create .dsn files hasn't led to any results. Would anyone know how to correctly generate the .dsn file, or whether I've done something else wrong at this point?
(And yes, I understand that using Access 2010 in the year 2019 is almost a joke at this point, but I'm doing this for testing purposes. I plan to completely remake the frontend in Angular in the future.)
One other unrelated note... would it be a better idea to have the frontend hosted as an .html file, for example through the "Publish to Access Services" process? I did read that Access Services was discontinued last year, so would that no longer be possible?
Edit: This is not a duplicate of "DSN Less Connection (MS Access to SQL2016)" because A) I want to utilize a DSN Connection, not DSN-less and B) I am not using connection strings in my code to hook up with SQL.
You should be able to just create a FILE DSN, link your tables, and then distribute the compiled accDE to each desktop.
However, which SQL ODBC driver did you use? If you used the standard SQL Server ODBC driver, then that is installed on each computer by default.
However, if you linked using Native Client 11 (or later), then that driver is NOT installed on each workstation by default. So I HIGHLY recommend you create a FILE DSN (not a user or system DSN), and link the tables using that. (Access will create DSN-less links for you.)
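To give you an idea of what lands in such a FILE DSN, here is a minimal sketch. PowerShell is used here only to write the file, the server, database and path are placeholders, and the DRIVER line assumes the legacy "SQL Server" driver (swap it for whatever driver you actually link with). Normally you would let the ODBC Data Source Administrator or Access's linking dialog create this file for you.

# Purely illustrative: write a minimal FILE DSN for a trusted (Windows auth) connection.
# All names are placeholders.
$dsn = @'
[ODBC]
DRIVER=SQL Server
SERVER=MySqlServer
DATABASE=MyDatabase
Trusted_Connection=Yes
'@
Set-Content -Path 'C:\DSN\MyAppSQL.dsn' -Value $dsn

If the back end uses SQL logins instead of Windows authentication, you would add a UID= line and rely on the save password check box during linking, since the DSN file itself should not carry the password.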
And you should NOT be seeing a logon prompt with your application. This suggests you forgot or missed the save password option.
So, I would re-link your tables, creating a new FILE DSN. And if you are using the Linked Table Manager, then make sure you check the prompt-for-new-location option to force creation of a NEW DSN. If you just refresh, then you DO NOT get a chance to tick the save password option during the linking process.
So, which ODBC driver are you using? Native Client 11 or later is better, but it is not installed by default on each workstation. However, CAUTION is required here, since the older SQL Server driver does NOT support the newer datetime2 format. If you used these newer SQL column types, they will be returned as string data types in Access and create a mess of issues.
So, first, I would re-link using a FILE DSN.
Make sure you check the save password option during the re-link.
You then compile your accDB into an accDE, and distribute that. You don't really need to use the package wizard, since once each workstation has the runtime installed, a simple copy of the accDE to each person's computer will work just fine. There is NO special connection between your accDE and the package wizard. Once the runtime is installed, any and all mdb, accDB and accDE files can simply be clicked on to launch and run. So for testing, you can skip the package wizard, just copy the accDE to the target machine, click on it, and see if it works.
Edit
The prompt and check box during this process is this:
So you have to check that box to save the password. Note that you ONLY get this dialog WHEN you create a new FILE dsn.
When I extracted a data-tier application from a Microsoft Azure SQL database that has a Master Key, I was unable to import it into SQL Server on my local PC.
You will find others had this issue here: SSMS 2016 Error Importing Azure SQL v12 bacpac: master keys without password not supported
However, the steps provided in that answer did not work on my installation.
The steps are:
1. Disable auditing on the server (or database)
2. Drop the database master key with DROP MASTER KEY command.
Microsoft Tech Support verified that this solution did not work on my installation of SQL Server, and even after taking remote control of my PC and troubleshooting, they were unable to determine why this was occurring.
I needed to find a way to remove the Master Key from the bacpac file. I have a PowerShell script that removes the Master Key from the bacpac file, but it requires extracting and renaming files and running scripts from Windows PowerShell to get the DB imported.
Does anyone have a program or set of scripts which would automate the process of removing the Master Key and importing a SQL DB from Azure with a single command?
I am new to this forum. Please do not be harsh with this post. I am trying to do the best I can to help others to save the many hours I spent coming up with this.
I have cobbled together a T-SQL script which calls a Windows PowerShell script (also cobbled together from multiple sources) to extract a data-tier application (database) from a Microsoft Azure SQL database and import it into a database on my local SQL Server by running ONE command. Over the months I picked up some of the code in these scripts from other blogs and similar sources. I am not able to give credit to those folks because I didn't keep track of where I got the info. If you are reading this and you see your code, please take credit; I apologize for not being able to credit your work.
There may be configuration settings on your PC and your local SQL Server that need adjusting, as this entire solution requires pretty much full access to your computer. If you run into compatibility trouble, let me know and I will do the best I can to describe how my system is configured, in case that helps.
I am using Windows 10 Pro and Microsoft SQL Server Developer (64-bit) v12.0.5207.0
I have placed the two files that do all the work on GitHub here: https://github.com/Wingloader/Auto-Azure-BACPAC-Download.git
GetNewBacpac-forGitHub.sql
GetAzureDB-forGitHub.ps1
WARNING: The PowerShell script file will store your SQL sa password and your Azure SQL login in clear text!
If you don't want to do this, don't use this solution.
My computer is owned and controlled solely by me so I am able to open up the security in my system and I am willing to assume the responsibility of safeguarding it.
The basic steps of my solution are accomplished as follows (steps 1 and 2 are optional; I like to keep a version of the DB I am working with as of the point in time I pull down a clean production copy of my Azure DB):
1. Back up the current DB as MyLocalDB.bak.
2. Restore that backup from step 1 to a new DB with the previous day stamped at the end of the DB name (e.g., MyLocalDB20171231).
3. Delete the original MyLocalDB database (needed so we can recreate the DB with the original name later on).
4. Pull down the production database from Azure and create a new database with the name MyLocalDB.
The original DB is deleted in step 3 so that the restored DB can use the original name (important when you have data connections referring to that DB name). A rough sketch of steps 1 to 3 follows.
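Steps 1 to 3 boil down to T-SQL along these lines, run here through Invoke-Sqlcmd. This is only a sketch under assumptions, not the code from the GitHub script: the backup path, the data directory and the logical file names ('MyLocalDB', 'MyLocalDB_log') are placeholders you would need to adjust, and Invoke-Sqlcmd requires the SqlServer (or older SQLPS) module.

# Sketch only: back up MyLocalDB, restore it under a dated name, then drop the original.
$stamp = (Get-Date).AddDays(-1).ToString('yyyyMMdd')
Invoke-Sqlcmd -ServerInstance 'MyLocalSQLServer' -Query @"
BACKUP DATABASE MyLocalDB TO DISK = N'C:\Backups\MyLocalDB.bak' WITH INIT;

RESTORE DATABASE MyLocalDB$stamp
FROM DISK = N'C:\Backups\MyLocalDB.bak'
WITH MOVE 'MyLocalDB'     TO N'C:\Data\MyLocalDB$stamp.mdf',
     MOVE 'MyLocalDB_log' TO N'C:\Data\MyLocalDB$stamp.ldf';

ALTER DATABASE MyLocalDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DROP DATABASE MyLocalDB;
"@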
In Step 4, the work of extracting the data-tier application DB from Azure is initiated by this line in the T-SQL:
EXEC MASTER..xp_cmdshell '%SystemRoot%\System32\WindowsPowerShell\v1.0\powershell.exe -File "C:\Git\GetUpdatedAzureDB\GetAzureDB.ps1"'
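One of the configuration adjustments mentioned earlier is that xp_cmdshell is disabled by default, so the line above assumes it has already been enabled on the local instance. A sketch of enabling it (requires sysadmin rights; the instance name is a placeholder):

# Sketch: enable xp_cmdshell so the T-SQL script can shell out to PowerShell.
Invoke-Sqlcmd -ServerInstance 'MyLocalSQLServer' -Query @"
EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1; RECONFIGURE;
"@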
The Powershell script does the following:
The target for the extract is a file named today.bacpac (hardcoded). The first thing to do is delete that file if it already exists.
Extract the DB from Azure into the today.bacpac file.
Note: my DB on Azure has a Master Key for encryption. This will need to be removed from the files prior to importing the bacpac file into your local DB or it will fail (this may not be required in SQL 2017 according to my previous conversations with MS Support). If you do not use a Master Key, you can either strip out the code that does this step or just leave it alone. It won't remove anything if it isn't there. It would just add a little overhead to the program.
Open the today.bacpac file (zip file) and remove the MasterKey node from the Origin.xml file.
Modify the Model.xml file to update the SHA hash length. This is required so that the file does not appear to have been tampered with when SQL Server opens the bacpac file.
Re-zip the files into a new file, today-patched.bacpac (a rough sketch of this unzip/edit/re-zip flow follows the SqlPackage line below).
Run this line of code (from PowerShell) to import the bacpac file into SQL Server:
& "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" /Action:Import /SourceFile:"C:\Git\GetUpdatedAzureDB\today-patched.bacpac" /TargetConnectionString:"Data Source=MyLocalSQLServer;User ID=sa; Password=MySAPassword; Initial Catalog=MyLocalDB; Integrated Security=false;"
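For orientation, the unzip/edit/re-zip part described above might look roughly like the following. This is a hedged sketch, not the GetAzureDB-forGitHub.ps1 from the repository: the paths are placeholders, the XML edits themselves are only marked with a comment, and it assumes Compress-Archive produces an archive that SqlPackage will accept.

# Sketch of the unzip / edit / re-zip flow; not the author's actual script.
$srcBacpac = 'C:\Git\GetUpdatedAzureDB\today.bacpac'
$patched   = 'C:\Git\GetUpdatedAzureDB\today-patched.bacpac'
$work      = 'C:\Git\GetUpdatedAzureDB\bacpac-work'

# A .bacpac is an ordinary zip, but Expand-Archive expects a .zip extension,
# so copy it under a temporary .zip name first (the renaming step mentioned above).
Copy-Item $srcBacpac "$env:TEMP\today.zip" -Force
if (Test-Path $work) { Remove-Item $work -Recurse -Force }
Expand-Archive -Path "$env:TEMP\today.zip" -DestinationPath $work

# ... edit Origin.xml / Model.xml under $work here, as described above ...

# Re-zip everything and give the result the .bacpac name that SqlPackage expects.
if (Test-Path "$env:TEMP\today-patched.zip") { Remove-Item "$env:TEMP\today-patched.zip" -Force }
Compress-Archive -Path "$work\*" -DestinationPath "$env:TEMP\today-patched.zip"
Copy-Item "$env:TEMP\today-patched.zip" $patched -Force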
After editing the two files to provide updated paths, usernames and passwords, run the SQL script. You do not need to edit the scripts again. You can run the SQL script again without modification and it will create a new copy of your Azure DB.
Done!
I recently developed a WinForms application with C# and SQL Server 2008 data access. I want to create an "InstallShield Express" setup file for it (I don't want to use ClickOnce or the Setup and Deployment project available in VS). I want to create a DB, or attach one to a SQL Server instance, after installing SQL Server Express 2008 SP3 (not LocalDB). What is the best way to do this?
Your question is quite vague as you do not explain what kind of “app”, “setup file” or “db” you are using, nor how you “attach it to sql”. In the future, please include these details. However, I can give a general answer.
Create a seed database that contains the starting data for your application, in your source project.
Add the seed database file to your project/solution file and set its Build Action to “Content”.
Ensure your installer includes project content in the deployment folder (the application folder for WinForms apps).
To open the seed database from your app, use a connection string like Data Source=|DataDirectory|seed.sdf. Do not try to search for your seed file or to set DataDirectory yourself; the installer will set DataDirectory to the directory your content was installed to.
Do not try to write to DataDirectory; it may not be writable by the user who installed the app. Repairing the app will also overwrite DataDirectory, destroying anything you saved there.
If you need to save data in the database, copy |DataDirectory|seed.sdf to Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), then read and write all data to the copy.
For more information, read my answer to a poster who wrote to |DataDirectory| and therefore kept destroying his user's data.
I have to say that I hate asking such a general question as "What am I doing wrong?", but I simply have no idea what the problem could be:
I've created an SSIS package that takes the data from flat files (CSV), computes the average of one of the columns grouped by date, writes the result to the database and deletes the original file. All works fine when executed within SSIS, but when I schedule it with SQL Server Agent it simply doesn't work: the log reports success, but there is no new data in the database and the .csv file still exists in its original location.
I know about the protection level issue in SSIS, so I've changed it to "EncryptAllWithPassword" and I use the same password in the Server Agent job.
Here is a link to the Server Agent Job script (created as "script job as DROP and CREATE")
Edit: Just to make things weirder, using
dtexec /f {filepath} /de {password}
executes the package without a problem. I know I can schedule such a command in Windows itself, but I'd like to keep all scheduled jobs in one place: in SQL Server Agent.
EDIT: Solved by changing the path to UNC
There are two important things to remember when setting up packages to run via a SQL Server Agent job.
Use UNC paths for all file locations, no matter how simple. There is a high probability that the server will have a different view of the file structure than your development machine, so UNC paths ensure that both machines are referencing the same paths.
Use a proxy account to execute that package, as described here http://www.mssqltips.com/sqlservertip/2163/running-a-ssis-package-from-sql-server-agent-using-a-proxy-account/.
The proxy account must have access to the physical paths and the server objects.
This also allows for security stratification on your various packages (not all packages need access to everything).
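As a concrete illustration of the first point, the dtexec call from the question would reference the package by a UNC path; the server, share, package name and password below are placeholders:

# Placeholder paths: the same dtexec invocation, but with the package referenced by a
# UNC path so the Agent's execution context resolves the same location as the dev machine.
& dtexec.exe /File "\\fileserver\ssis\AverageByDate.dtsx" /Decrypt "PackagePassword"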
I developed an Access 2003 application that is connected to SQL Server.
My problem is that I developed the software on my server, and the application runs on the client network on a different (identical) server.
As a result, my executable file (i.e. the .ADE) does not open on the client's computer because of a bad SQL Server connection.
My solution so far was to open the application file (.ADP) on the client's computer, change the connection path there, and then create the executable file on that machine.
Now my client has only the Access runtime environment, so I cannot do that any more.
I wonder if there is a way to set the connection in an ADE file under these circumstances.
(I know I can change it through VBA, but when the initial connection is broken, I don't even get to the VBA code stage.)
In the interest of keeping things simple, I'll say you need to set up a testing environment you control that mimics your client's environment. For instance, if they have a SQL Server 2008 server named "SQL1", then you should install SQL Server 2008 Express on your machine and rename your machine to "SQL1" so you can test. You'd also need to copy the schema of their database tables into your own test database and fill it with test data similar to theirs. And you'll want to create duplicate logins as well.
With all that in place, I wouldn't think you'd need to update anything. Just copy the ADE file over to your client when you're done making changes. You could try to code your way through this scenario, but I've been there and done that: having a test environment that apes your client's takes a lot of headaches out of the equation.