I have a file "db-connection.php" that has to be different for each version of my server (localhost, Dev and Production). At first I thought .gitignore was the answer, but after much pain and research, I realized that .gitignore only works on untracked files, i.e. files NOT already in the repo.
For obvious reasons, the localhost version I'm using with xampp requires that the db file be within the repo. Of course, this means that every time I push it to Dev, it ruins the Dev db connection.
Is there a way to tell .git "Yes, I realize this file exists, but leave it alone anyway"?
This is a common problem, and there are two solutions, depending on your needs.
First, if you are always going to have the same configuration files and they will change depending only on the environment (but not the developer machine), then simply create the three versions of the file in your repository (e.g., in a config directory), and copy the appropriate one into place, either with a script or manually. You then remove the db-connection.php file from the repository and ignore it.
If this file actually needs to depend on the user's system (say, it contains personal developer credentials or system-specific paths), then you should ship a template file and copy it into place with a script (which may fill out the relevant details for the user). In this case, too, the db-connection.php would be ignored and removed from the repository.
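In either case, the mechanics are roughly the same. A minimal sketch (the config/ directory and file names here are just placeholders, and cp assumes a Unix-like shell such as Git Bash):
git rm --cached db-connection.php
echo "db-connection.php" >> .gitignore
cp config/db-connection.localhost.php db-connection.php
git rm --cached removes the file from the index while leaving your working copy on disk; after committing the .gitignore change, each environment keeps its own untracked copy of the file.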
There are two things people try to do that don't work. One of them is to try to keep multiple branches each with their own copy of the file. This doesn't work because Git doesn't really provide a way to not merge certain files between branches.
The other thing people try to do is just ignore the changes to a tracked file using some invocation of git update-index. That breaks in various cases because doing that isn't supported, and the Git FAQ entry and the git update-index manual page explain why.
You can use the skip-worktree option with git update-index when you don't want Git to manage the changes to that file.
git update-index --skip-worktree db-connection.php
Reference: Skip worktree bit
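To check which files currently have the bit set, and to clear it again later:
git ls-files -v | grep ^S
git update-index --no-skip-worktree db-connection.php
(git ls-files -v prefixes skip-worktree entries with S; on Windows without grep, pipe it through findstr /B S instead.)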
I created a .env file with params and pushed it to GitHub, and my teammates downloaded the repo. In the next push I added the .env file to .gitignore. Now I need to make changes to the .env file, but how will they get them if .env is ignored? What is the right way of doing this kind of manipulation?
UPDATE:
I used two libraries to manage env variables:
https://www.npmjs.com/package/dotenv
https://www.npmjs.com/package/config
You do not store a configured .env file in the repository; instead, you create .env.dist (or anything named like that) and commit that file. Your .dist file can contain all keys commented out, or all keys and default values. It's all up to you, but you need to ensure your template does not contain any sensitive data either:
DB_HOST=
DB_USER=
The main benefit is that you do not have .env in the repo, so each developer can easily set up their own system as they like/need (e.g. local db, etc.) and there's no risk of having such a file accidentally overwritten on the next pull, which would be frustrating.
Also (again), you do not store any sensitive data in the repository, so while your .env.dist can and maybe even should be pre-configured with your defaults, you must ensure any credentials are left empty, so no one can e.g. run the code against a production machine (or figure out anything sensitive based on that file).
Depending on the development environment you use, you can automate the creation of the .env file, using the provided .env.dist as a template (which is useful e.g. with CI/CD servers). As the dotenv file format is pretty simple, processing it is easy. I wrote such a helper tool for PHP myself; it is pretty simple code and can easily be ported to any other language if needed. See process-dotenv on GitHub for reference.
Finally, if for any reason config setup is complicated in your project, you may want to create e.g. an additional small script that can collect all the data and write an always up-to-date config file (or upgrade the existing one, etc.).
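In its simplest form that setup step is just a guarded copy, for example (assuming the template is named .env.dist; the guard avoids clobbering an existing .env):
test -f .env || cp .env.dist .env
Developers then fill in DB_HOST, DB_USER, etc. with their local values, and a CI job can run the same command and inject real credentials from its secret store.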
I'm using git to version control SSIS packages and I know that SSIS generates some crazy XML that is going to badly confuse any merge algorithms.
I'd like to know if having the following line in my .gitattributes file is the correct thing to do:
*.dtsx -diff
I believe this will stop git from attempting to merge the file, which is what I would like.
Am I correct in thinking that this also stops git from generating deltas and therefore stores every change as a whole file? (and therefore, takes up more storage)
My repository also holds the source for the database schema and any other source files, so I'm thinking that switching the repo to fast forward only is not appropriate.
If you don't want files to be merged in git, you need to use the -merge attribute. That way you will still be able to diff.
We also treat packages as binaries; this does imply you will need to make the same change multiple times if you need a patch in a branch and also need it in your main tree.
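For example, if you want Git to treat the packages as opaque binaries (no text diff and no automatic merge), the .gitattributes entries could look like this; note that the built-in binary macro also unsets the text attribute:
*.dtsx -diff -merge
# or, more bluntly:
*.dtsx binary
With -merge unset, a conflicting merge simply keeps the version from your current branch and flags the file as conflicted, so you choose the correct copy yourself.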
I'm very new to version control and git, and I'm trying to learn how to use SourceTree. I have about a dozen commits already and I'm not sure why or how these .baml files were created, but they were not in any of my previous commits, and I'm wondering if I can stop tracking and ignore them or not.
If I understand what I've read about baml files, they are created at runtime, so I would think they aren't necessary to track, right?
Wikipedia says they are binary files, and it seems as though they are a byproduct of the actual coding work you're doing. In other words, as you say yourself, you didn't create them, so it'd be safe to ignore them (simply add the line *.baml to the .gitignore file in the root of your repository).
In general, you should use git to keep track of text files you're actually changing. There are, of course, project-specific exceptions to this, but your case doesn't seem to be one.
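Since the files already appear in earlier commits, they are tracked, and ignoring them alone is not enough; you also have to remove them from the index. A minimal sketch (the working-copy files stay on disk):
echo "*.baml" >> .gitignore
git rm --cached -- "*.baml"
git commit -m "Stop tracking generated .baml files"
After that commit, Git ignores any .baml files it encounters.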
I'm thinking of "mask" as in a circuit mask (I think) - let me explain with a handy chart:
The common source would be physically in c:\source
Instance A would be physically in c:\instanceA but initially have nothing but symlinks to everything in c:\source
Instance B would be physically in c:\instanceB but initially have nothing but symlinks to everything in c:\source
As you made changes to Instance A and Instance B, you would create a mask that would hide files from the Common Source if they were deleted from the instance folders, and create a new physical file in the instance directory if an existing Common Source file was modified.
New files would live in the instance folders but never make it back to the Common Source.
This type of setup would be very useful for a project where I want to do many different types of small tweaks to multiple instances where distinct threads would work on distinct instances.
I know about symbolic links but they fall short in the case of modifying a file.
Is there anything that can accomplish this? If not, should I try to make this and patent it? Seems like a good idea to me.
I would be on Windows Server 2008 or later.
At the risk of stating the obvious, git is one tool that can be used to achieve this behavior.
Make your "Common Source" a git repository
Clone the repository twice to "InstanceA" and "InstanceB"
In each instance, check out a new, unique branch
As changes are made in "Common Source" you can merge those changes into "InstanceA" and "InstanceB" while maintaining the "MASK" (changes to the branch) you've created for each.
This has the added benefit of allowing changes from "Common Source" to be pulled as you wish instead of having changes to "Common Source" pushed out to each instance (something I imagine would be less desirable and more prone to error).
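A rough sketch of that setup, assuming c:\source is not yet a repository and using placeholder branch names:
cd c:\source
git init
git add .
git commit -m "Common source"
git clone c:\source c:\instanceA
cd c:\instanceA
git checkout -b instanceA-changes
Later, whenever you want new Common Source work in an instance while keeping its own changes:
git pull origin master
(use main if that is your default branch name). New files committed only on instanceA-changes never reach c:\source unless you deliberately push or merge them back.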
You're looking for a union mount. Unfortunately, I'm not aware of any implementations for Windows, but there are several available for Linux, notably UnionFS.
In general they are used for making a read-only filesystem look like it's read-write: typically on live-CDs.
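For what it's worth, on Linux the same effect is also available through overlayfs (a different implementation from UnionFS, but the same idea): the lower directory stays untouched and every modification lands in the upper directory, which is exactly the "mask" behaviour described in the question. A minimal sketch with placeholder paths (workdir must be an empty directory on the same filesystem as upperdir):
mount -t overlay overlay -o lowerdir=/source,upperdir=/instanceA,workdir=/workA /mnt/instanceA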
Since Windows 7 you can use Libraries, which allow you to include files from more than one physical location.
Windows 7 also includes a VirtualStore type of folder (for example, when creating or modifying a file in the Program Files folder, it will actually be created in a user-specific folder: C:\Users\user\AppData\Local\VirtualStore). However, I don't know how you can create this type of folder yourself, and also, as far as I know, you can add and modify files but not delete them that way.
You'll want a version control system that supports per-file checkout and permissions. Then you just need to set up a simple API converter that takes file-system commands and converts them to version control commands.
Delete -> disable permission to access file.
Directory commands should look for local copies and things you have permission to access.
Open -> grab local copy, on fail check-out file from repository.
Save -> disable permission, save local copy. //Avoid duplicates being seen.
Close without saving -> if permission to access from repository, delete local copy.
((By the way, this storage optimization seems somewhat spurious for versioning. Disk space is relatively cheap.
If your interest isn't in versioning, I'd suggest looking into separating out the information you would potentially want as volatile and creating configuration files for each branch. This, of course, requires a predictable pattern to the changes.))
IBM Rational ClearCase is a version control system which provides file-mask-like behaviour. It is known as MVFS (MultiVersion File System) and can be mounted on a workstation like an ordinary network drive.
On a ClearCase server (a.k.a. VOB) you can store several versions of the same file, each on a different code branch. The sets of files visible to a user are called views. Each view has a configuration (a.k.a. configuration specification), which defines what files and versions are visible to the current user. A typical config spec looks like this:
# From wikipedia: http://en.wikipedia.org/wiki/IBM_Rational_ClearCase#Configuration_specifications
# Show all elements that are checked out to this view, regardless any other rules.
element * CHECKEDOUT
# For all files named 'somefile', regardless of location, always show the latest version
# on the main branch.
element .../somefile /main/LATEST
# Use a specific version of a specific file. Note: This rule must appear before
# the next rule to have any effect!
element /vobs/project1/module1/a_header.h /main/proj_dev_branch/my_dev_branch1/14
# For other files in the 'project1/module1' directory, show versions
# labeled 'PROJ1_MOD2_LABEL_1'. Furthermore, don't allow any checkouts in this path.
element /vobs/project1/module1/... PROJ1_MOD2_LABEL_1 -nocheckout
# Show the 'ANOTHER_LABEL' version of all elements under the 'project1/module2' path.
# If an element is checked out, then branch that element from the currently
# visible version, and add it to the 'module2_dev_branch' branch.
element /vobs/project1/module2/... ANOTHER_LABEL -mkbranch module2_dev_branch
While at home for personal projects I use Mercurial, at work we're using ClearCase.
I am attempting to run a few horizontal (touching lots of source files) refactorings in Visual Studio on the code base; however, since each file is locked by ClearCase, each one has to be unlocked and prompts for the actual activity that the checkout is for.
In Mercurial, there's no such concept as far as I'm aware: files are not locked at all at any point in time!
Is there a way of doing such a refactoring, or any other operation that acts on multiple files, without having to check out each and every one manually?
In a DVCS (distributed VCS like Git or Mercurial), you simply cannot "lock" a file, since all the other repos wouldn't be aware of such a "status".
But with ClearCase and its locking mechanism (optimistic with "unreserved checkouts" or pessimistic with "reserved checkouts"), you need to make a checkout to tell ClearCase you will modify some files.
However, you could also, for large refactoring:
make and update a snapshot view
set all the files as writable (through an OS-based command, not through ClearCase "checkout")
perform your changes
then search for all hijacked files and check out / check in those files (a rough sketch follows)
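A rough outline of that sequence from a Windows command prompt; the paths are placeholders and the exact cleartool options may differ in your setup, so treat this as a sketch rather than a recipe:
cd C:\views\my_snapshot_view
cleartool update .
attrib -R *.* /S
(run the refactoring in Visual Studio)
cleartool ls -recurse | findstr hijacked
cleartool checkout -nc path\to\changed_file.cs
cleartool checkin -nc path\to\changed_file.cs
attrib -R clears the read-only bit so the modified files become "hijacked" instead of checked out; cleartool ls marks hijacked files in its output, and each one is then checked out and back in to record the change.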