I'm working with ClearCase on Unix.
I accidentally executed a shell script, which created some files in the directory /vobs/somePath/myDir.
I executed the command below in that directory:
cleartool ls -l
and it lists some view private object files.
What I need is to restore this directory to the baseline that was set before. Besides, I have some files checked out but not yet checked in in other paths, and I don't want to check them in right now. In other words, I just want to recover the directory myDir and not touch any other files.
How can I achieve this?
For a dynamic view (/vobs/avob/myview/...), only private files are writable, so you can delete everything and it will only delete the private ones.
But if you have checked-out files (which should not be deleted), or if you don't want to risk anything, you can clean just the private files using cleartool lsprivate:
cd /vobs/somePath/myDir
ct lspriv . | grep -v checkedout | xargs rm -rf
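To see what would be deleted before running that, the same pipeline without the rm makes a harmless dry run:
cd /vobs/somePath/myDir
ct lspriv . | grep -v checkedout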
UNIX mkdir has the -p flag that creates the parent directories if they don't exist. Is there an equivalent in cleartool for that? Obviously cleartool mkdir foo/bar/ doesn't work when foo doesn't already exist.
The simplest way is to create a file in the directory structure and use mkelem -mkpath...
I created a "temp1" directory, and a "temp2" directory, and a "temp.txt" file in one of my sandbox vobs, then added the file to source control with -mkpath from the parent of the "temp1" view private directory.
PS M:\tempview\foobarf\Documents> cleartool mkelem -mkpath .\temp1\temp2\temp.txt
Creating parent directory element ".\temp1\temp2".
Creating parent directory element ".\temp1".
Created directory element ".\temp1".
Checking out parent directory ".\temp1".
Created directory element ".\temp1\temp2".
Checking out parent directory ".\temp1\temp2".
Creation comments for ".\temp1\temp2\temp.txt":
Test1.
.
Created element ".\temp1\temp2\temp.txt" (type "utf16le_file").
Checked out ".\temp1\temp2\temp.txt" from version "\main\0".
It's UTF-16LE because PowerShell's output redirection writes wide characters, and "temp.txt" was created using "dir > temp1\temp2\temp.txt".
cleartool mkelem -mkpath is a good option when adding one file.
But if you have multiple files to add, in a tree structure which does not exist yet, don't forget about clearfsimport: it can import flat files to a branch in one command, and will create any missing folder for you.
As seen here: clearfsimport -rec -nset <Source> <target>
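For instance, a minimal sketch on Unix (the source and destination paths here are placeholders, not taken from the question):
clearfsimport -preview -recurse -nsetevent /tmp/newFiles /vobs/aVob/aDestinationDir
The -preview option only reports what would be imported; drop it once the output looks right.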
I was trying to migrate a file from directory A to directory B in a branch, call it file.txt. What I did was:
cd A
cp file.txt ../B/
ct rm A
cd ../B
ct mkelem -ci -nc file.txt
Thereby losing all the history. I am trying to recover from this and do what I should have done in the first place, which is simply ct mv file.txt ../B.
I read that for this I should do something like this:
cd A
ct ln .@@/main/?/file.txt ./file.txt
where luckily, from another view, I've figured out ? should be 27. Unfortunately when I try to do the above I get:
cleartool: Error: Entry named "file.txt" already exists.
cleartool: Error: Unable to create link: "./file.txt".
and I try to do:
ct rmelem file.txt
but got:
cleartool: Error: Element "file.txt" has branches not created by user
though presumably that's not what I should be doing anyway. How do I get back that file? It was removed with a simple ct rm. I even get the "entry already exists" error if I do ct rm on the new copy of the file I added to directory B.
You are on the right track, but I would recommend a simple rmname, instead of a rmelem (which deletes the element with all its versions, branches and such).
That would remove file.txt from the latest version of the parent directory, and allow you to proceed with the link.
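A minimal sketch of the recovery, assuming you are in the directory that still holds the conflicting entry and that /main/27 really is the right directory version (both taken from the question):
cleartool co -nc .                           # check out the parent directory
cleartool rmname file.txt                    # drops only the name; the element and its history are untouched
cleartool ln .@@/main/27/file.txt file.txt   # re-link the original element from the old directory version
cleartool ci -nc .                           # check the directory back in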
Next time, a cleartool mv might be easier, and keep the history of the file being moved.
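For the record, a history-preserving move needs both parent directories checked out first; a sketch using the A and B names from the question:
cleartool co -nc A B
cleartool mv A/file.txt B/file.txt
cleartool ci -nc A B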
Using Terminal and shell/Bash commands, is there a way to retrieve specific files from a web directory? I.e.
Directory: www.site.com/samples/
copy all files ending in ".h" into a folder
The directory also contains text files and other associated files that are of no use.
Thanks :)
There are multiple ways of achieving this recursively:
1. Using find
1.1 Recreating the directory tree (mkdir -p creates nested folders without errors)
cd path;
mkdir backup
find www.site.com/samples/ -type d -exec mkdir -p backup/{} \;
1.2 Finding the specific files and copying them to the backup folder (cp -p preserves permissions)
find www.site.com/samples/ -name \*.h -exec cp -p {} backup/{} \;
2. Using tar, which actually works the other way round, i.e. it excludes specific files; the part of the question about the useless text files matches this approach better.
You can have as many excludes added on as you like:
tar --exclude='*.txt' --exclude='*.filetype2' --exclude='*.filetype3' -cvzf site-backup.tar.gz www.site.com
mv www.site.com www.site.com.1
tar -xvzf site-backup.tar.gz
You can use wget for that, but only if there are links to those files. If they exist but are not referenced from any HTML page, then brute force is the only option.
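A minimal wget sketch, assuming the .h files are linked from an index page (the URL is the one from the question):
wget -r -np -nd -A '*.h' http://www.site.com/samples/
Here -r recurses, -np stops wget from ascending to the parent directory, -nd flattens the result into the current folder, and -A keeps only files matching *.h.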
cp -aiv /www.site.com/samples/*.h /somefolder/
http://linux.die.net/man/1/cp
What's the simplest way on Linux to "copy" a directory hierarchy so that a new hierarchy of directories are created while all "files" are just symlinks pointing back to the actual files on the source hierarchy?
cp -s does not work recursively.
I just did a quick test on a linux box and cp -sR /orig /dest does exactly what you described: creates a directory hierarchy with symlinks for non-directories back to the original.
cp -as /root/absolute/path/name dest_dir
will do what you want. Note that the source name must be an absolute path, it cannot be relative. Else, you'll get this error: "xyz-file: can make relative symbolic links only in current directory."
Also, be careful as to what you're copying: if dest_dir already exists, you'll have to do something like:
cp -as /root/absolute/path/name/* dest_dir/
cp -as /root/absolute/path/name/.* dest_dir/
Starting from above the original & new directories, I think this pair of find(1) commands will do what you need:
find original -type d -exec mkdir -p new/{} \;
find original -type f -exec ln -s "$PWD/{}" new/{} \;
The first command sets up the directory structure by finding only directories in the original tree and recreating them in the new tree. The second creates the symlinks to the original files in the new tree; using an absolute path for the link target keeps the links from dangling.
There's also the "lndir" utility (from X) which does such a thing; I found it mentioned here: Debian Bug report #301030: can we move lndir to coreutils or debianutils? , and I'm now happily using it.
I googled around a little bit and found a command called lns, available from here.
If you feel like getting your hands dirty
Here is a trick that will automatically create the destination folder, subfolders and symlink all files recursively.
In the folder that contains the files and subfolders you want to symlink:
create a file shell.sh:
nano shell.sh
copy and paste this charmer:
#!/bin/bash
export DESTINATION=/your/destination/folder/
export TARGET=/your/target/folder/
find . -type d -print0 | xargs -0 bash -c 'for DIR in "$@";
do
echo "${DESTINATION}${DIR}"
mkdir -p "${DESTINATION}${DIR}"
done' -
find . -type f -print0 | xargs -0 bash -c 'for file in "$@";
do
ln -s "${TARGET}${file}" "${DESTINATION}${file}"
done' -
Save the file: Ctrl+O
Close the editor: Ctrl+X
Make your script executable: chmod +x shell.sh
Run your script: ./shell.sh
Happy hacking!
I know the question was regarding shell, but since you can call perl from shell, I wrote a tool to do something very similar to this, and posted it on perlmonks a few years ago. In my case, I generally wanted directories to remain links until I decide otherwise. It'd be a fairly trivial change to do this automatically and recursively.
I want to check in a directory and all of its sub-directories into ClearCase.
Is there a specific command to achieve it?
Currently I am going into each directory and manually checking in each file.
I would recommend this question:
Now the problem is to check in everything that has changed.
That is problematic since often not everything has changed, and ClearCase will raise an error when you try to check in an identical file. That means you will need two commands:
ct lsco -r -cvi -fmt "ci -nc \"%n\"\n" | ct
ct lsco -r -cvi -fmt "unco -rm %n\n" | ct
(with 'ct' being 'cleartool': type 'doskey ct=cleartool $*' on Windows to set that alias)
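On a Unix shell, the equivalent would simply be:
alias ct=cleartool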
But if by "checkin" you mean:
"enter into source control for the first time"
"updating a large number of files which may have changed on an existing versionned directory"
then I would recommend creating a dynamic view and using clearfsimport to import your snapshot tree (with the new files) into the dynamic view.
See this question or this question.
The clearfsimport script is better equipped to import the same set of files multiple times, and will automatically:
add new files,
make a new version of existing files previously imported (but modified in the source set of files being re-imported),
remove files already imported but no longer present in the source set of files,
make a clear log of all operations performed during the import process.
For example:
clearfsimport -preview -rec -nset c:\sourceDir\* m:\MyView\MyVob\MyDestinationDirectory
Did you use the -recurse option in the clearfsimport command?
Example: clearfsimport -recurse source_dir .
This should help.
If you're using the Windows client, right-click on the parent folder, select Search, leave the file name field empty, click Search, select all the files in the result window (ctrl-A), right-click on them and select ClearCase -> Add to Source Control
If you are on Windows, you may try:
for /f "usebackq" %i in (`cleartool lsco -cview -me -r -s`) do cleartool ci -nc %i