Sending zip files and keeping file paths consistent - database

I have created a GUI and database for my company, and I am trying to alpha test the program. I made the program so that everyone would place it on their C drive, keeping the file paths consistent. But when I email the zipped folder to everyone, an extra folder level gets added that breaks the file paths (I believe this happens during extraction?). Does anyone know a good way to prevent this from happening? Thanks!

Although "xcopy deployment" is a valid method to deploy programs, it can come with complications, as you have discovered. Instead, you can create an actual installer program which is much more versatile.
For a lead-in on making an installer you can read Create MSI installer in Visual Studio 2017.
Make sure the program resolves its file locations through the Environment.SpecialFolder enumeration so that it automatically adapts to any (properly configured) Windows installation.
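As a minimal C# sketch (the folder and file names are illustrative, not from the original program), the database path could be resolved like this instead of being hard-coded to C:\:

    using System;
    using System.IO;

    class DataPaths
    {
        // Resolve a per-user application-data folder instead of a
        // hard-coded C:\ path; Windows guarantees the current user
        // can write here on any properly configured installation.
        static string GetDataFolder()
        {
            string appData = Environment.GetFolderPath(
                Environment.SpecialFolder.LocalApplicationData);
            string folder = Path.Combine(appData, "MyCompanyApp"); // illustrative name
            Directory.CreateDirectory(folder); // no-op if it already exists
            return folder;
        }

        static void Main()
        {
            // e.g. C:\Users\<user>\AppData\Local\MyCompanyApp\company.db
            Console.WriteLine(Path.Combine(GetDataFolder(), "company.db"));
        }
    }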
Other installers are available, e.g. Inno Setup, which may offer simpler or more detailed configuration of some options like replacing or keeping older files, or installing prerequisites like a required framework version.

Related

Checking installation integrity with InstallShield

For Linux packages, specifically RPMs with stored checksums, we can always check two things: that the contents of the package are OK, and that the installation made from the package is OK. When someone modifies parts of the installation they shouldn't, we can see it by running rpm -Vp my-precious-package. In our business it is not only recommended but obligatory to ship our packages with tools for this purpose, and on Linux these are just simple bash scripts.
Now I have to do something similar for Windows. Basically, I want to provide a batch file that anyone can run to verify that the installation matches what the package is meant to contain. I'm using InstallShield for packaging, and while it has some great visual tools, I still haven't found a way to verify package checksums from the command line.
Is it even possible, or should I reinvent the wheel and write my own checking utilities?
Take a look at MakeCat and SignTool from Microsoft, both in the Windows SDK:
http://msdn.microsoft.com/en-us/library/windows/desktop/aa386967%28v=vs.85%29.aspx
http://msdn.microsoft.com/en-us/library/windows/desktop/aa387764%28v=vs.85%29.aspx
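As a rough sketch of the workflow (file names and certificate are illustrative): describe the shipped files in a .cdf catalog definition, build and sign a catalog from it, and later verify any installed file against that catalog from a batch file.

    ; package.cdf -- catalog definition (names illustrative)
    [CatalogHeader]
    Name=package.cat
    CatalogVersion=2
    HashAlgorithms=SHA256

    [CatalogFiles]
    <hash>tool=C:\MyApp\tool.exe
    <hash>helper=C:\MyApp\helper.dll

    rem build and sign the catalog, then verify an installed file against it
    makecat -v package.cdf
    signtool sign /f mycert.pfx /fd SHA256 package.cat
    signtool verify /pa /c package.cat C:\MyApp\tool.exe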
Windows Installer has a feature called resiliency that supports auto-repair of products, and there are ways to invoke it for self-checks only. (This is assuming that by InstallShield you mean Windows Installer based projects.)
Here are a couple of links to get you started:
INFO: Description of Resiliency in Windows Installer
Resiliency
Application Resiliency: Unlock the Hidden Features of Windows Installer
MsiProvideComponent function (See dwInstallMode flags)
This also assumes all files are key files; companion files are not managed by the installer. Changes performed by custom actions outside of the installer aren't managed either.
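A minimal C# sketch of calling MsiProvideComponent for such a self-check via P/Invoke (the product, feature and component IDs below are placeholders for your installer's actual values):

    using System;
    using System.Runtime.InteropServices;
    using System.Text;

    class MsiSelfCheck
    {
        [DllImport("msi.dll", CharSet = CharSet.Unicode)]
        static extern uint MsiProvideComponent(
            string product, string feature, string component,
            int installMode, StringBuilder pathBuf, ref uint pathBufLen);

        // -1 = INSTALLMODE_EXISTING: verify the component's key file exists,
        // never install; 0 (INSTALLMODE_DEFAULT) would also repair it.
        const int INSTALLMODE_EXISTING = -1;

        static void Main()
        {
            var path = new StringBuilder(260);
            uint len = (uint)path.Capacity;
            uint rc = MsiProvideComponent(
                "{11111111-2222-3333-4444-555555555555}", // placeholder ProductCode
                "MainFeature",                            // placeholder feature name
                "{66666666-7777-8888-9999-000000000000}", // placeholder component ID
                INSTALLMODE_EXISTING, path, ref len);
            Console.WriteLine(rc == 0 ? "OK: " + path : "Broken, error " + rc);
        }
    }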

Create Software Distribution Packages From Visual Studio

I would like to set up an automated software distribution process, preferably driven from Microsoft Visual Studio, which builds my projects in all the different configurations and platforms and packages all the created objects in a predefined folder tree structure.
The software distribution packages would be for Windows libraries and WDM driver projects written in C/C++. Each library has several different configurations (i.e. Windows 7 Release, Windows XP Release, MT/MD runtime compilation flags) for different platforms (i.e. x86 and x64), and the same goes for the drivers. Without an automated process to create a software distribution package, it is necessary to build all the different configurations for each platform, copy the created objects to a predefined folder structure, and then zip the folder, giving it a release name and version. This process is quite time-consuming and error-prone, so my goal is to automate it with a clean and simple solution.
I've been researching this for a few weeks and have implemented a few different solutions, but none of them is flawless. Since this is a problem many developers must have encountered already, I would like to hear different opinions on what a nice and efficient way to do it would be.
Up until now I've tried the following:
A batch script and a makefile to be used by NMAKE. This is not so good because it makes it difficult to use the same build parameters that are set in the Visual Studio project.
Implemented a "deploy" target task (editing the .vcsproj files) which calls MSBuild of the project for each configuration/platform and copies the generated files to a distribution directory. This has the advantage that I can start the deploy activity from within visual studio but it also produces several environment variables problems, specially when building windows drivers.
Any ideas or suggested solutions will be appreciated.
Thanks in advance.
Zion
If you haven't already, add a post-build step to each lib and driver project which copies the built files into your specific tree and also zips them (a sketch follows after these steps).
If you haven't already, create one Visual Studio solution (.sln file) which builds all these projects at once.
If you haven't already, set up build configurations using the Build | Configuration Manager dialog. Now, from the IDE, you should be able to select a specific configuration, do a Build | Rebuild Solution, and make sure all the projects build successfully.
From the command line, you can now automate step 3 by opening a Visual Studio command prompt (which sets up the environment variables appropriately) and starting devenv.exe with the appropriate command-line parameters.
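For example (solution and configuration names are illustrative), rebuilding the shipping configurations from such a prompt could look like:

    devenv MySolution.sln /Rebuild "Release|Win32"
    devenv MySolution.sln /Rebuild "Release|x64"

The post-build step from the first point is just a command line run after each successful build. A minimal sketch using the standard Visual Studio macros (the dist folder name is illustrative; zipping would be one more command along the same lines):

    if not exist "$(SolutionDir)dist\$(PlatformName)\$(ConfigurationName)" mkdir "$(SolutionDir)dist\$(PlatformName)\$(ConfigurationName)"
    copy /y "$(TargetPath)" "$(SolutionDir)dist\$(PlatformName)\$(ConfigurationName)"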

Transforming MSI files for Active Directory deployment

I'm currently rebuilding all our setup files (they used to be .exe files) into MSI installers so that I can deploy them through Active Directory.
Basically, we have a tool (WinDev, for those who know it) that generates MSI files, but since the tool is limited, I generate .mst files (using Orca) to fine-tune the setup and merge the source installer and the transform with msitran.exe. When I manually install the transformed setup file with /qb or /qn, the install works fine.
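For reference, the transform-and-install sequence looks roughly like this (file names illustrative):

    msitran -a mytweaks.mst setup.msi
    msiexec /i setup.msi /qb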
But when I try to add the setup file in the AD, I get an error message: "Unable to extract deployment information".
I found multiple solutions online, but none of them fit my problem (i.e. I have all admin rights, and my final MSI validates in Orca with a couple of warnings but no errors). I get the same issue with the original MSI that our tool generates (which doesn't validate in Orca, hence the transform to fix that, among other things).
Are there any properties that I need to set to successfully deploy my MSI to AD?
Never mind, the solution is plain stupid, but MS does not mention this anywhere.
Basically, it was an issue with the UNC path. My MSI was located in a path that contained spaces, and Windows Server 2003 apparently doesn't like that (wrapping the path in quotes in the open dialog does not change anything). Moving the files to a path without spaces fixed it.
Good day to all

Database errors in Quantum Grid demos in Delphi XE Professional

Whenever I open one of the Quantum Grid demos in Delphi XE Pro (on Windows 7 32-bit), the following error is displayed for every table (I think) in the project:
(Screenshot of the error: http://www.tranglos.com/img/qgerror.png)
The message is:
Network initialization failed.
File or directory does not exist.
File: C:\PDOXUSRS.NET
Permission denied.
Directory: C:\.
I understand the permission issues with writing to C:\, but the result is that while I can build and run the demo projects, no data is displayed, which makes the demos rather useless. And what kind of database writes its configuration to the C:\ directory in the 21st century anyway? :) (Yes, I know very little about Paradox databases, but I won't ever be using one either. I just want to learn how to use the grid.)
Using BDE Administrator I've tried changing the Paradox "NET DIR" value to a folder with write permissions on the C drive. Result: now the database tables cannot find their data:
Path not found.
File: C:..\..\Data\GENRES.DB.
...and the unhelpfully truncated path gives no indication where the files are expected to be.
Is there a way to work around the problem so that the demos can load their sample data correctly?
Did you install the BDE correctly? It should use the DBDEMOS files. Do you see such an alias in the BDE administration utility? Can you open that database in one of the Delphi demos?
The BDE is not a 21st-century database; it was developed twenty years ago and has not been updated since. It's an obsolete technology, but because it still ships with every release of Delphi together with a known database, it is often used in demos, since nothing new has to be installed.
Anyway, that file is not its configuration file. It's a sharing lock file that allows more than one user to use the database concurrently. Because it is a file-based database without a central server, it has to use this kind of shared file. Usually its location is changed to a network share, but it defaults to C:\ for historical reasons.
And it's not only the BDE still attempting to write to the wrong directories. I still see a whole bunch of applications attempting to write to C:\ (especially logs) or to other read-only locations.
Using BDE Admin to change the location for PDOXUSRS.NET helped, but it wasn't sufficient. DevExpress did the right thing in specifying a relative folder for the data location, and the relative folder seems perfectly all right, but for some reason the DB can't find it.
Solution: under the \Demos\ folder find all the *.dfm files that contain the string
..\..\Data
and replace that string with the absolute path to the demos folder. That done, all the demos open correctly.
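A rough PowerShell sketch of that replacement (the absolute target path is illustrative; back up the demo sources first):

    Get-ChildItem .\Demos -Recurse -Filter *.dfm | ForEach-Object {
        (Get-Content $_.FullName) -replace [regex]::Escape('..\..\Data'), 'C:\QuantumGrid\Demos\Data' |
            Set-Content $_.FullName
    }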
I know this message from our own applications. It has to do with security measures introduced in Windows Vista: the operating system, trying to protect critical files, denies access to them. There is a way to bypass this mechanism without compromising security: run your application in compatibility mode. When an application runs in compatibility mode, read/write operations from/to system folders are redirected to "safe" directories located under C:\Users\[Current User]\AppData\Local\VirtualStore.
More info on http://www.windowsecurity.com/articles/Protecting-System-Files-UAC-Virtualization-Part1.html.

What is a good way to build a lot of small tools in Visual Studio?

Suppose you have some source code that comes from the Unix world. It consists of a few files that build a library, plus a lot of small .c files (say 20 or so), each with its own main() function, that are compiled into command-line tools which use the library.
On unixy systems you can use a makefile to do this easily, but the most naive transformation to the Windows / Visual Studio world involves making a separate project for each tool. Although that works, it is a lot of work to set up and keep synchronized, and it is harder to navigate at both the filesystem and the project/solution level. I've thought about using different configurations in which all but one .c file are excluded from the build, but that would make it impossible to build all the tools at once.
Is there a nice way of building all the tools from a single "thing" (project, msbuild file, etc.)?
I'm really not interested in using Cygwin's gcc/MinGW or NAnt. I'd like to stick with the standard Windows toolchain as much as possible.
You don't HAVE to use Visual Studio to compile code. You can write your own batch file or PowerShell script that simply calls the compiler on your sources, just like a makefile. For example:
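A minimal batch sketch (run from a Visual Studio command prompt so that cl.exe is on the PATH; file and library names are illustrative):

    @echo off
    if not exist bin mkdir bin
    rem compile each small tool against the common library
    for %%f in (tools\*.c) do cl /nologo /W3 /O2 %%f mylib.lib /Febin\%%~nf.exe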
So I've been looking into this for a while now and the solutions all leave much to be desired.
You can...
Create a lot of small projects by hand.
Use MSBuild and deal with its steep learning curve.
Use a build tool that does not integrate well with Visual Studio, like GNU make.
You can't even make a project template like you can with .NET projects! Well, you can make a wizard if you want to wade through the docs on doing that, I suppose. Personally, I have decided to go with the "many small projects" solution and just deal with it. It turns out to be less horrible than I had thought, though it still sucks. Here's what I did in Visual Studio 2008:
Create your first Win32 command line tool project, get all your settings down for all platforms and make sure it works under all circumstances. This is going to be your "template" so you don't want to edit it after you've made 20 copies.
(optional) I set up my paths in the visual studio project files so that everything is built in the project directory, then I have a post-build step copy just the dll/exe/pdb files I need to $(SolutionDir)$(OutDir). That way you can jump into a single directory to test all your tools and/or wrap them up for a binary distribution. VS2008 seems to be insane and drops output folders all over the place, with the default locations of Win32 and x64 output differing. Spending a few minutes to ensure that all platforms are consistent will pay off later.
Clean up your template. Get rid of any user settings files and compiler output.
Copy and paste your project as many times as you need. One project per tool.
Rename each copied project folder and project file to the new tool's name. Open the project file in a text editor like Notepad++. If you have a simple one-file project, you'll need to change the project name in two places at the beginning of the file and the source code file name(s) at the end of the file. You shouldn't need to touch the configuration stuff in the middle.
You will also need to change the GUID for the project. Pop open guidgen.exe (in the SDK bin directory) and use the last radio button setting. Copy and paste a new GUID into each project file at the top. If you have dependencies, there will be one or more GUIDs at the bottom of the file near the source code. Do NOT change them as they are the GUIDs from the dependencies and have to match!
Go into Visual Studio, open up your main solution and add your tool projects.
Go into the configuration manager and make sure that everything is correct for all supported platforms, then test your build.
It's not beautiful, but it works and it's very much worth the setup time to be able to control your builds from the GUI. Hopefully VS2010 will be better about this, but I'm not too hopeful. It looks like MS is giving a lot more love to the .NET community than the C/C++ community these days.
If you have a makefile you can use a 'makefile' project in Visual Studio (which is misnamed: it simply allows you to specify custom build/debug commands), and use it to invoke GNU make.
You will need to change the makefile to use the VC++ command-line tools instead of cc or gcc or whatever it uses, but often these are specified by macros at the top of the makefile.
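For instance, a makefile that follows the common CC/CFLAGS convention might only need its macros and rules switched over; a sketch (flags and names illustrative, recipe lines must stay tab-indented):

    CC = cl
    CFLAGS = /nologo /W3 /O2
    LD = link
    LDFLAGS = /nologo

    mytool.exe: mytool.obj
        $(LD) $(LDFLAGS) mytool.obj /OUT:mytool.exe

    mytool.obj: mytool.c
        $(CC) $(CFLAGS) /c mytool.c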
If the makefile uses other Unix-specific commands (such as rm), you may need to make modifications or create batch files to map those commands to Windows equivalents. Another option is to install the necessary tools from GNUWin32 to make it work.
If the build is very complex or involves configure scripts, then you have a harder task. You could generate the makefile from the configure script using MSYS/MinGW, and then modify it as above to make it work with VC++.
Makefile projects will not be as tightly integrated into Visual Studio, however. All the build management is down to you and the makefile.
If you're really using Visual Studio, I would suggest creating a project for each tool, and adding these projects to a single solution. From Visual Studio, it's easy to build a complete solution all at once, and MSBuild knows how to build .sln files as well.
msbuild myslnfile.sln
or even:
msbuild
... will build your solution.
