How do I use a SQL query to determine TFS Area name and build path including file name for installers

I am trying to automate the release process associated with installers that have been created on my build server. To do this I was hoping to be able to use an SQL query to get the drop path of the installer including the file name.
In TFS, when I go to "Build" and select "Artifacts", I can use the "Explore" link to get the root path for the build. The subsequent "Installer\Disk" folder path is part of the configuration. However, the actual setup file name is composed of the installer name + " setup.exe". Since there are multiple projects in our TFS, I was hoping to use a query to find all builds that have a build quality of "Release" and dynamically find the installer on disk.
Generally our installer names are made up of the root area path name with all text removed. I can't figure out how to connect the build to the root area path in SQL.
Any ideas?

Generally, accessing the information directly from the database is not recommended, since it carries a high risk. I would recommend using the TFS API to do this instead.
The drop location of the build is stored in the tbl_Build table of the TFS collection database, and the quality information is stored in the tbl_BuildQuality table. Join these two tables to query the information you want.
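A minimal sketch of such a join, with the caveat that the column names used here (BuildNumber, DropLocation, BuildQuality, and the join key) are assumptions that vary between TFS versions, so verify them against your own collection database first:

SELECT b.BuildNumber,
       b.DropLocation
FROM tbl_Build AS b
JOIN tbl_BuildQuality AS q
    ON q.BuildQualityId = b.BuildQualityId -- hypothetical join key; check your schema
WHERE q.BuildQuality = 'Release'; -- hypothetical quality column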

Related

What are the project settings in SSRS deployment?

I have successfully created my first SSRS project in Visual Studio. The deployment process requires setting up the TargetServerURL and the TargetServerVersion. These are the only two items that I know are correct. The tutorial I have been watching does not go into the other items and does not clarify what they are or what they are used for. What are the following items referring to?
TargetDatasetFolder
TargetDataSourceFolder
TargetReportFolder
TargetReportPartFolder
The default settings for OverwriteDatasets and OverwriteDataSources were False, and this is probably why my deployment attempt threw a nondescript error. So now, perhaps if I try again, my deployment will create these folders on the server by force, but I would rather not do this because the database manager has already given me the names of the folders where I should deploy. So, how are these folders arranged? Please advise.
TargetDataSourceFolder: The name of the folder in which to store the published shared data sources. If you do not specify a folder, the data source is published to the same folder as the report. If the folder does not exist on the report server, Report Designer creates the folder when the reports are published.
TargetDataSetFolder: The same, but for the shared datasets you want to publish.
TargetReportFolder: The name of the folder in which to store the published reports. By default, this is the name of the report project. If the folder does not exist on the report server, Report Designer creates the folder when the reports are published.
You can also write a path (finance/dept1/...); in that case, your reports (or datasets, or data sources) will be deployed following that path.
Regarding OverwriteDatasets and OverwriteDataSources (these apply to shared datasets and shared data sources), the right choice depends on the architecture you chose, or on what already exists on the server.
I think the best approach is to leave them as False. If the items don't exist, the deployment will create them. If they do exist, you'll just get a warning (if I remember correctly), and the report you deploy will be linked to the dataset and data source already created. Furthermore, you probably have other reports linked to those shared data sources/datasets, and if you overwrite them you may cause issues when those other reports run. Set these options to True only when you actually want to modify the shared dataset/data source.

changing default data directory for sql server after installation

I have mistakenly put my SQL data directory in the wrong folder, so the system DBs are located in the wrong directory, and I want to move them. I am going to move the data and log files for all my system DBs, but I would like to know whether this will move all the folders under MSSQL, or whether I need to perform some other steps as well.
To move the system databases, you should not only move the files; before doing so, you should change their registered paths in the system catalog by using
ALTER DATABASE ... MODIFY FILE
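For example, a sketch for msdb, assuming its default logical file names (MSDBData, MSDBLog) and an illustrative target path; repeat the pattern for each system database you move:

-- Point msdb's data and log files at their new location;
-- the new paths take effect the next time the service starts.
ALTER DATABASE msdb MODIFY FILE (NAME = MSDBData, FILENAME = 'D:\SQLData\MSDBData.mdf');
ALTER DATABASE msdb MODIFY FILE (NAME = MSDBLog, FILENAME = 'D:\SQLLog\MSDBLog.ldf');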
When moving the master database you should also change the startup parameters in Configuration Manager to point to the new location of the master files.
All this is described in Move System Databases article.
Note that if one day you need to rebuild master, it will be recreated in the old location, so you cannot simply get rid of the DATA folder. After rebuilding, you will need to move your system databases again.
This is to let you know the solution to my problem. I successfully changed the data root directory by moving the data and log files for all the system DBs to the desired location. I also had to change the registry value for the SQLDataRoot directory to the desired location and modify the location of the server diagnostics file as required by my cluster setup. After all this, I was able to move the folder to the desired location. Thanks for all the help, everyone.

Database Project Over Multiple Environments?

How can you have environment-specific table values in your database project and make sure that they only deploy to the environment you are deploying to with Release Management? We have been using Release Management for some time now, but only for .NET code. We are somewhat new to the DACPAC realm, but have found it easy to set up and use via Release Management. However, now we want to extend this capability to a table that holds configuration variables per environment. How do we make this part of our database project and make sure that each environment has its own unique version of the data?
Use SSDT for publishing the database schema and reference data; don't use it to manage environment settings.
Personally, I would (and have) run a secondary script post-deployment that configures environment-specific values. This is no different than putting the correct values in the web.config file of a web application post-deployment. It's something you manage within your deployment tool.
Ignoring the release-management part of the question (because it depends on which mode you use, whether you store configuration variables in RM, etc.), you can certainly pass environment-specific values into your dacpac execution (for use in post-deploy data scripts) using SQLCMD variables defined in a tokenised publish file.
Broadly the process is:
Use standard SQLCMD variable syntax in your post-deploy script, e.g. INSERT INTO MyTable VALUES ('$(my_env_var)')
Update the database project properties (SQLCMD Variables tab) to include your new variable, which ensures your dacpac expects a value when executed
Generate a publish.xml file (which should now include a SqlCmdVariable node)
Create a publish.release.xml file which contains transform instructions to update the value of your SqlCmdVariable node to introduce a token, e.g. ##my_env_var##
Update your database project file (.sqlproj) to include instructions to transform publish.xml on build using the contents of publish.release.xml
It's quite long-winded, but what you get out of the above is a dacpac + tokenised publish file in your build output, ready to be detokenised and executed by your deployment process, be that RM or any other tool.
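As an illustration, a minimal post-deploy script consuming such a variable could look like the following; the table dbo.AppConfig, its columns, and the variable my_env_var are all hypothetical names:

-- Post-deployment script: SQLCMD substitutes $(my_env_var) at execution time.
-- The upsert keeps the script re-runnable across repeated deployments.
MERGE dbo.AppConfig AS target
USING (SELECT 'ServiceUrl' AS ConfigKey, '$(my_env_var)' AS ConfigValue) AS source
    ON target.ConfigKey = source.ConfigKey
WHEN MATCHED THEN
    UPDATE SET ConfigValue = source.ConfigValue
WHEN NOT MATCHED THEN
    INSERT (ConfigKey, ConfigValue) VALUES (source.ConfigKey, source.ConfigValue);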

Scripting FileTables

Scenario:
I have 3 environments that I am using: Dev, UAT, and Live, each with its own database: MyDb_Dev, MyDb_UAT, MyDb_Live.
Then I have a VS2012 Database project in my solution that contains all my scripts. This works nicely when I make changes to my model database (MyDb_Model) that is located locally.
What I want to do:
I want to use FileTables in SQL Server 2012 (which I understand how to set up), but I don't know how to script them so the options can be configured per environment. When I generate the scripts, the filegroup name is hard-coded as MyDb_Model. Also, when I try to publish to my Dev database, it complains that the database options cannot accept the new scripts. When I include the Model database's options in the script, publishing to my Dev database then complains about duplicate names.
Question:
Can you script FileTables (with the database options) using the database project in VS2012 so they are configurable, or do I need to write my own scripts manually?
Preferred:
Compare MyDb_Model to Database project.
Publish to MyDb_Dev as a newly created database.
Sounds like you'll want SQLCMD variables to handle this, where the variable contains the environment-specific text for each environment. You'd then use that variable in your objects instead of the hard-coded paths. The following creates a FileTable called "DocumentStore" and uses the value of a variable called "FileTableDirectoryVariable" that you set up under Project Properties - SQLCMD Variables. Set that variable in each of your publish profiles to the correct directory, and you should be good. If you're using different filegroups for these tables, you should be able to tweak the filegroup setting in a similar manner with a SQLCMD variable (see the sketch after the code below).
CREATE TABLE DocumentStore AS FileTable
WITH (
    FileTable_Directory = '$(FileTableDirectoryVariable)',
    FileTable_Collate_Filename = database_default
);
GO
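And a sketch of the filegroup variant mentioned above, assuming a second SQLCMD variable named FileGroupVariable that holds the name of an existing filegroup in the target database:

-- Hypothetical: the ON clause places the table on the filegroup named by the variable.
CREATE TABLE DocumentStore AS FileTable
ON [$(FileGroupVariable)]
WITH (
    FileTable_Directory = '$(FileTableDirectoryVariable)',
    FileTable_Collate_Filename = database_default
);
GO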

Recover postgreSQL databases from raw physical files

I have the following problem and I need to know if there's a way to fix it.
I have a client who was cheap enough to decline buying a backup plan for his PostgreSQL databases on the main system that runs his company, and, as I thought would happen some day, some OS files crashed during a blackout and the OS needs to be reinstalled.
This client didn't have any backups of the databases but I managed to save the PostgreSQL main directory. I read that the databases are stored somehow inside the data directory of the postgres main folder.
My question is: Is there any way to recover the databases from the data folder only? I am working in a windows environment (XP service pack 2) with PostgreSQL 8.2 and I need to reinstall PostgreSQL in a new server. I would need to recreate the databases in the new environment and somehow attach the old files to the new database instances. I know that's possible in SQL Server because of the way that engine stores the databases but I have no clue in postgres.
Any ideas? They would be much appreciated.
If you have the whole data folder, you have everything you need (as long as architecture is the same). Just try restoring it on another machine before wiping this one out, in case you didn't copy something.
Just save the data directory to disk. When launching Postgres, point it at that data directory (for example via the -D startup option or the data_directory setting; see wiki.postgresql.org). Alternatively, remove the original data directory of the fresh installation and place the copy in its place.
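Once the restored server is up, you can confirm from SQL which data directory it is actually using:

-- Shows the data directory of the running PostgreSQL server.
SHOW data_directory;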
This is possible: you just need to copy the "data" folder (inside the Postgres installation folder) from the old computer to the new one, but there are a few things to keep in mind.
First, before you copy the files, you must stop the Postgres server service: Control Panel -> Administrative Tools -> Services, find the Postgres service and stop it. When you're done copying the files and setting permissions, start it again.
Second, you need to set the permissions on the data files. Because the Postgres server actually runs under another user account, it will not be able to access the files if you just copy them into the data folder, since it will not have permission to do so. You need to change the ownership of the files to the "postgres" user. I had to use subinacl for this; install it first, and then use it from a command prompt like this (first navigate to the folder where you installed it):
subinacl /subdirectories "C:\Program Files\PostgreSQL\8.2\data\*" /setowner=postgres
(Changing ownership should also be possible from Explorer: first disable "Use simple file sharing" in Folder Options, and a "Security" tab will appear in the folder's Properties dialog with options to set permissions and change ownership, but I wasn't able to do it that way.)
Now, if the server service won't start after you start it manually again, you can usually see the reason in the Event Viewer (Administrative Tools -> Event Viewer). Postgres will log an error event, and inspecting it will give you a clue about the problem (sometimes it complains about a postmaster.pid file; just remove it, etc.).
The question is very old, but I want to share an effective method that I found.
If you do not have a backup made with pg_dump and all you have is the old data folder, try the following steps.
In the Postgres database, add a record to the pg_database table, either with an admin tool or with INSERT INTO.
Check the values against your own system, adjust the INSERT query below, and run it.
The INSERT will return an OID once it has run. Create a folder with that number as its name, copy your old data into it, and the database is ready for use.
/*
------------------------------------------
*** Recover From Folder ***
------------------------------------------
Check the pg_database table on your own system
and change the values below to match it.
*/
INSERT INTO
pg_catalog.pg_database(
    datname, datdba, encoding, datcollate, datctype, datistemplate, datallowconn,
    datconnlimit, datlastsysoid, datfrozenxid, datminmxid, dattablespace, datacl)
VALUES(
    -- Use your own collation here
    'NewDBname', 10, 6, 'Turkish_Turkey.1254', 'Turkish_Turkey.1254',
    False, True, -1, 12400, '536', '1', 1663, Null);
/*
Create a folder in the data\base directory named after the new OID returned below.
Copy all the old files from data\base\<old OID> into that new folder.
The database is then ready for use.
*/
select oid from pg_database a where a.datname = 'NewDBname';
As shown by "move database to another hard drive", all we need to do is modify the registry and the file permissions. By modifying the registry, the PostgreSQL server knows the new location of the data.
If you have issues with permissions, or with tools like icacls, when installing over the old data folder, then try my solution from the sister website:
https://superuser.com/a/1611934/1254226
I did so, but the trickiest part was changing the owner permissions:
go to Services from Administrative Tools
find the Postgres service and double-click it
on the Log On tab, change to Local System
then restart
