Sending a merge request through TFS 2008

I am wondering if TFS 2008 is capable of sending an automatic notification to a developer when a merge request for a specific bug, change request, etc. is made?
Any suggestions?
Thanks,
nino11

TFS Power Tools gives you the most detailed control over project alerts.
You cannot, however, send a notification when the request is merely made. You can send a notification as soon as a merge takes place (i.e. after it completes), using the "Check in to specific folder" type of project alert.
I would install TFS Power Tools and take a look at all the features it has. If it's not there, you can't do it, at least not to my knowledge in TFS 2008.

Related

Dynamics CRM Publishing Customizations - Multi Developers

In my company, five developers work on the same organisation.
We use a single server with SQL Server and Dynamics CRM installed.
If two developers publish customizations at the same time, we need to restart IIS.
Do you have any suggestions for optimizing SQL Server or IIS to avoid this issue?
Currently, the way CRM works, publishing will slow everything down and can potentially lock up the SQL server.
Your options would be to publish only when necessary, to minimize disruption to your developers. For things like form design, I would recommend not publishing but previewing the form to see your new changes.
Upgrading your servers with more memory and so on will help reduce the time taken to publish. Sorry this is not much help.
If you are developing web resources, you can use this little hack, webresource proxy, as a workaround. Basically, it replaces the content of the web resource with the response of a generic web server, so you no longer need to update and publish the web resource.
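The proxy idea above boils down to serving your local copy of the script instead of the published web resource. A minimal sketch of that dev-server side in Python (the path, port, and file content are made up for illustration; a real setup would serve your actual .js files from disk):

```python
# Sketch: a tiny local server that answers web-resource requests with
# the developer's working copy, so no publish is needed to see changes.
# The mapped path and script body below are illustrative only.
import http.server
import threading
import urllib.request

class WebResourceHandler(http.server.BaseHTTPRequestHandler):
    # Map requested web-resource paths to local content. In practice this
    # would read the developer's file from disk on every request.
    files = {"/new_myscript.js": b"console.log('dev version');"}

    def do_GET(self):
        body = self.files.get(self.path)
        if body is None:
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "application/javascript")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the console quiet
        pass

def start_server():
    """Start the dev server on an ephemeral port; returns the server."""
    server = http.server.HTTPServer(("127.0.0.1", 0), WebResourceHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    srv = start_server()
    url = f"http://127.0.0.1:{srv.server_address[1]}/new_myscript.js"
    print(urllib.request.urlopen(url).read().decode())
    srv.shutdown()
```

The proxy then rewrites requests for the published web resource to this local server, so the browser always sees the latest working copy.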

BizTalk with Visual Studio - Steps to query SQL Server?

Does anyone know of a source (article, blog, tutorial, video, ...anything) clearly describing the simplest / best practice steps required to build a BizTalk application (part) with VS being able to query a SQL Server Database table or view?
I already have a very simple but functioning BizTalk 2013 solution in Visual Studio 2012, and now I'm trying to integrate it with SQL Server 2012 the most simple way imaginable - i.e. trying to build up an orchestration with a receive port querying (either via polling or via query notification) a very simple SQL Server view.
Of course, I have read MSDN articles, all of which describe just a tiny little part of the whole process, never giving the whole picture between them.
Of course, I've also googled and searched Stack Overflow, never finding anything like that.
I'm still not sure whether you need 5 or 50 steps to achieve that (simple) goal.
This is what I could figure out. Unfortunately, this is quite a complicated process involving overall 12 steps.
You are welcome to show me a better / simpler / nicer solution, which I will then immediately accept instead of mine. After all, this is why I'm here.
EDIT: + deploy + further details
Steps to query a SQL Server (2005 or above) database table / view in a BizTalk 2013 application developed in Visual Studio 2012 - the simplest / best practice approach:
Install the BizTalk Adapter Pack. Normally, it's not installed together with BizTalk, so you have to install it separately.
Ensure that the Service Broker is active in your SQL Server.
Ensure that your BizTalk application has the necessary permissions to request notification in your SQL Server database. See: Enabling Query Notifications.
In your BizTalk project in Visual Studio, generate a WCF-SQL BizTalk adapter notification schema via the Add Adapter Metadata Wizard and configure it as described here: Processing Notification Messages to Perform Specific Tasks. For the notification, you are only allowed to query a table - even if you need the result set of a view. In that case, query (one of) the underlying table(s) here.
Add a notification message of the just generated notification schema to your just generated orchestration.
Add a receive port of the just generated port type and a receive shape to the orchestration to receive the notification messages. And there you have your SQL notification message in your orchestration.
Generate a WCF-SQL BizTalk adapter select schema via the Add Adapter Metadata Wizard similarly as in step 4. Don't choose any existing adapter in this step, and don't configure any binding - just take the URI you created in step 4 with another inbound ID. Select the view / table you want to query and the select operation from the available ones.
Add a select query message and a select response message of the just generated respective schemas to the orchestration.
Add a send-receive port of the respective send and receive shapes to the orchestration to send out the select query messages and receive the result messages back. And there you have your SQL query result in your orchestration. Process it as you like.
Build and deploy your BizTalk project in Visual Studio.
Configure your just deployed BizTalk application via BizTalk Server Administration. As a first step, add the WCF-SQL adapter to the list of adapters in your BizTalk Server. It's a good idea to export your binding configuration here, once you have fought your way through it, so that you can later import it back if you lose it, e.g. because of a new deployment.
Start the application.
And there you have your running BizTalk application querying a SQL Server database table / view.
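For what it's worth, steps 2 and 3 above boil down to a couple of T-SQL statements. A hedged sketch (the database name and the BizTalk host account are placeholders; run them against your own names):

```sql
-- Step 2: make sure Service Broker is active in the target database.
SELECT name, is_broker_enabled FROM sys.databases WHERE name = 'MyDb';
ALTER DATABASE MyDb SET ENABLE_BROKER;  -- only needed if is_broker_enabled = 0

-- Step 3: grant the account BizTalk uses the right to request notifications.
GRANT SUBSCRIBE QUERY NOTIFICATIONS TO [DOMAIN\BizTalkHostAccount];
```

Note that ALTER DATABASE ... SET ENABLE_BROKER needs exclusive access to the database, so run it when nothing else is connected.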

Sql Server 2008 Reporting Services email on the fly solution

To all,
I have noticed that other reporting tools give you the option, at the time of running a report from the web interface, to either have it rendered to the browser or to enter an email address and have the report sent to that address. This would be helpful for long-running reports or reports that are fairly large.
My question is whether this can be done with the existing SQL Server 2008 Reporting Services toolset, or if there are third-party solutions available?
Thanks.
--sean
I don't think that what you want to do is possible out of the box.
This may seem like overkill for your situation, however, I have worked for a client who wanted some custom features like this. Given that Report Manager is so inflexible out of the box, we wrote a new front end leveraging the Reporting Services Service. We could then write our own extended capabilities right into the new viewer.
This link describes it a bit more.
http://msdn.microsoft.com/en-us/library/ms159218.aspx
You can set up a subscription on a report, which will email it to you once or at regular intervals.
This link gives you more info. Be aware that if you want data-driven subscriptions, you need SQL Server Enterprise edition.

Sending infopath forms via email (as attachment) to be parsed by SQL Server 2005?

Just looking at the requirements of a new project and I wanted to make sure this use case was sound:
user fills in InfoPath (2003) form locally on their PC
a button within the InfoPath form titled 'submit' brings up a new Outlook (2003) email message with the InfoPath form attached. The user presses send, and the email is sent to an Exchange mailbox.
SQL Server periodically checks this mailbox, downloading any new submissions with the InfoPath form attached
SQL Server parses the attachment and the fields within the InfoPath form.
Is SQL Server capable of parsing mail attachments this way? Any caveats with this approach?
The attraction to using Outlook as the submission technology is that the process for the user is the same if they are offline. Outlook will then automatically sync when they come back online. It is essential that users have some way to fill in the forms offline, 'submit' them, and then have them synced automatically with the server when they next come online.
edit: to clarify, I am not looking for a way to cache form data from the server->client. I am looking to cache the completed form. Building a separate application to cache the completed reports on the client is not an option.
Later versions of SQL Server are capable of running .NET code within them, and as such you might be able to poll a mailbox from SQL Server and process an InfoPath form. However, I'm not sure I'd do it this way.
It might be better to consider writing a Windows Service that does this work. The Windows Service would start up, inspect the mail box of a "service account", read the mails, extract the attachments, process the xml and, eventually, write the data to SQL. It could also, presumably, respond to that mail with a confirmation or errors if business rules or validation errors occurred.
I'm not sure I'd put all of the above logic into SQL - for one thing, I suspect you'd have issues with accounts (having to have the account SQL was running under be able to access the Exchange mailbox account).
Your mileage may vary, and you should prototype this to determine what works best for you, but I'd try to keep the code that uses Exchange as a "work queue" separate from SQL, and only put the code that deals with writing data into tables in SQL.
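The XML-processing half of that service is the easy part: an InfoPath form is plain XML, so once the attachment is out of the mail, extracting the fields is straightforward. A minimal sketch (the namespace URI and field names here are hypothetical; a real form defines its own):

```python
# Sketch: extract field values from an InfoPath form attachment.
# InfoPath forms are plain XML; the namespace and field names below are
# made up for illustration only.
import xml.etree.ElementTree as ET

SAMPLE_FORM = """\
<?xml version="1.0"?>
<my:myFields xmlns:my="http://example.com/infopath/sample-form">
  <my:customerName>Contoso Ltd</my:customerName>
  <my:orderTotal>199.95</my:orderTotal>
</my:myFields>
"""

def parse_infopath_fields(xml_text):
    """Return a {local_field_name: text} dict from an InfoPath XML body."""
    root = ET.fromstring(xml_text)
    fields = {}
    for child in root:
        # Strip the '{namespace}' prefix ElementTree puts on tag names.
        local_name = child.tag.rsplit("}", 1)[-1]
        fields[local_name] = (child.text or "").strip()
    return fields

if __name__ == "__main__":
    print(parse_infopath_fields(SAMPLE_FORM))
```

The service would run this per attachment and hand the resulting dict to a parameterized SQL insert; the hard parts are the mailbox polling and the failure handling around it, not the parsing.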
I would not use the approach you outlined above. There are several approaches that appear to me to be more desirable than having SQL Server looking at an Exchange Mailbox. The major point that you make and an important requirement is that the InfoPath form be allowed to work in offline mode. I would think of the "offline mode" and the "data transfer" parts of your project as two distinct and separate pieces: 1) The form and the data should be stored on the client until a connection to the Internet is available and 2) once the connection is available the form and data should be transferred to the server.
You can set up your InfoPath form to submit directly to the SQL Server and bypass the Exchange "middleman" entirely. The setup in InfoPath when you are designing your form is pretty straightforward: 1) you enable "Submit data" for the connection and 2) you configure the submit options. This article has the details of how to do that. Furthermore, your connection to the SQL Server may be set up for offline use, as discussed in this article. The only caveat with this approach is that you may need to change your database schema to support it.
Another approach is to have your InfoPath form submit to a SQL Server 2005 HTTP Endpoint. The InfoPath client is just a glorified XML Editor and the HTTP Endpoint is basically a different name for a web service. You receive the form data at the HTTP endpoint into a staging table where the data is stored as XML and then you can do your parsing of that data from that staging area. Still you will have to setup the InfoPath connection for offline use. The major caveat with this approach is that Microsoft will deprecate HTTP Endpoint in SQL Server 2008 in favor of WCF.
And the other approach I would like to suggest is to use WCF itself to receive the XML form data from the InfoPath client. This approach would require you to connect the form's data source to your WCF web service at design time and then also set up the form for offline use.
I hope that this will be useful to you and at the very least point you in the right direction.
I've seen similar projects that resorted to a SQL Server Express edition on the client: save the InfoPath (or app) data in Express and use Service Broker to deliver it to the center, because of the guaranteed-delivery semantics of SSB versus mail. This gives you an all-SQL solution that is easier to sell to IT, and you don't need polling on the server. Also, you will not have to deal with MIME parsing; it's all straightforward XML processing. It is not for the faint of heart, though; getting SSB up and running is a challenge.
If you decide to go with mail delivery, an external service will be arguably easier to build, debug and troubleshoot. There are some finer point issues you should have an answer for:
- How will you keep the mail dequeue operations and the table write operations consistent? Your component must enlist the Exchange read/delete and the SQL insert into one distributed transaction.
- Is your logic prepared to deal with InfoPath docs arriving out of order? Mail transport makes absolutely no guarantee about the order of delivery, so you may see an 'order delete' doc before the 'order create' doc.
- How are you going to detect missing documents (not delivered by mail)? Are you going to implement a sender sequence number and end up reinventing TCP on top of mail?
- Does your processing allow for parallel processing of correlated documents? If thread 1 picks up doc 1 and thread 2 picks up doc 2 from the same sender, and doc 2 is correlated with doc 1 (i.e. they refer to the same business transaction), what will happen at the database write? Will it deadlock, will it lose an update, will one be rolled back?
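The sender-sequence-number point above can be sketched simply. A minimal gap detector, assuming each sender stamps its documents with an incrementing counter starting at 1 (the sender names and data are made up):

```python
# Sketch: detect missing documents per sender via sequence numbers.
# Assumes each sender stamps outgoing docs 1, 2, 3, ... -- any hole in
# what has arrived marks documents that were lost in transit (or are
# still on the way) and must be accounted for before processing past them.
def missing_sequences(received):
    """received: iterable of (sender, seq) pairs.
    Returns {sender: sorted list of missing sequence numbers}."""
    by_sender = {}
    for sender, seq in received:
        by_sender.setdefault(sender, set()).add(seq)
    gaps = {}
    for sender, seqs in by_sender.items():
        expected = set(range(1, max(seqs) + 1))
        hole = sorted(expected - seqs)
        if hole:
            gaps[sender] = hole
    return gaps

if __name__ == "__main__":
    arrived = [("storeA", 1), ("storeA", 2), ("storeA", 4), ("storeB", 1)]
    print(missing_sequences(arrived))  # storeA never delivered doc 3
```

Of course, this only detects the gap; deciding how long to wait for a late document and how to re-request it is exactly the "reinventing TCP" work hinted at above.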
There are many dragons under the bridge ahead...

Microsoft Sync Services - good solution for me?

We upload sales transactions from our stores to the headoffice server. At the moment, we use DTS (SQL Server Data Transformation Services), but we're planning on replacing that with Microsoft Sync Services for ADO.NET, as this seems to be Microsoft's preferred solution for this type of setup, and we want to follow the standard (one that will hopefully be around for a long time).
Here are the details of our setup and what we’re planning. I’m looking for some advice, especially about whether Sync Services fits into our solution.
Situation
Each store has a 3rd party EPOS system which stores sales in a Microsoft Access 2000 database, which we can access. Our headoffice database is SQL Server 2005, but will be upgraded to 2008. The headoffice is not on a VPN with all the stores, but we can open up our firewall to the stores’ IP addresses, so that they can send data directly to SQL Server. The stores are always connected to the internet via ADSL, although they do lose connection and we don’t want to lose sales data.
We are only uploading transactions from the store – definitions do not need to be downloaded.
Current solution
We have written a Windows service that runs on the store PC. This service downloads a DTS package from the server (which contains all the details of the upload) and runs it in the store – and this will upload sales to our server.
We chose DTS, because it is free when you install MSDE. We can’t use SSIS, because that would require a SQL Server licence at every store.
Another reason we chose DTS is that the details of the upload (i.e. which tables and fields to include) are stored on our headoffice server, so if we need to change things we can do that centrally and don’t need to install anything new at the stores. This isn’t a showstopper, but would be nice to have this ability in our new solution.
Potential solution - Microsoft Sync services for ADO.NET
We are currently building a proof of concept with Microsoft Sync services for ADO.NET. The idea is to put SQL CE (SQL Server Compact 3.5) in each store (client) and sync that to the headoffice SQL Server 2005 database (server). We’ll get the data into the SQL CE database either by (1) syncing it with the Access 2000 database or (2) getting the EPOS system developers to write sales straight into the SQL CE database – probably (2). But our main concern is getting the data from the store to the headoffice server. This method seems to be Microsoft’s preferred solution for occasionally connected systems and that is what made us look seriously at Sync Services.
I’m hoping that using this will mean that most of the work needed to upload the sales will be built into Sync Services and we won’t have to re-invent the wheel.
Potential solution - Upload to a custom webservice
There is also the possibility of uploading the sales transactions to a custom web service on our headoffice server and then into our SQL Server database. This means that we will have to build our own mechanism for determining which rows are new, as well as caching for when the systems are disconnected. Also, we might be missing out on other functionality that comes built into Sync Services.
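The "which rows are new" mechanism in the web-service option is usually a high-water mark: the store remembers the last change marker it successfully uploaded and sends only rows above it. A minimal sketch with in-memory data (in SQL Server the marker would typically be a rowversion column; here it is a plain integer, and all names are illustrative):

```python
# Sketch: high-water-mark upload -- send only sales rows whose change
# counter is above the last value successfully uploaded. The 'version'
# field stands in for a SQL Server rowversion column.
def rows_to_upload(rows, last_uploaded):
    """rows: list of dicts with a 'version' key; returns only the new ones."""
    return [r for r in rows if r["version"] > last_uploaded]

if __name__ == "__main__":
    sales = [
        {"id": 1, "amount": 9.99, "version": 101},
        {"id": 2, "amount": 4.50, "version": 102},
        {"id": 3, "amount": 7.25, "version": 103},
    ]
    last_uploaded = 101              # persisted at the store between runs
    batch = rows_to_upload(sales, last_uploaded)
    print([r["id"] for r in batch])  # ids 2 and 3 are new
    # Only advance the mark after the server confirms the batch arrived.
    last_uploaded = max(r["version"] for r in batch)
```

The crucial design point is when to advance the mark: only after the server acknowledges the batch, so a dropped connection mid-upload just causes a harmless resend rather than lost sales.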
Please let me know if you have any advice that will help, especially: "Is Sync Services the right solution?". The problem that we are trying to solve seems very generic (uploading sales from stores), and I'd like to solve it with a generic solution.
Microsoft Sync services is more than you need, but it will certainly do what you want, and it was built with your type of application in mind.
As with most new technologies out of Microsoft (caution: generalization!), you may find that it's not as mature as you might like. It'll do what you need it to, but you may run into issues that aren't easily resolved because it hasn't been put through the wringer. As an early adopter, though, you may find that the Sync developers are eager to help you out when you get stuck, so this isn't as big a problem as it might seem.
Make sure you read through all the literature on it, some of which is here, or linked in the following sites:
http://msdn.microsoft.com/en-us/sync/default.aspx
http://msdn.microsoft.com/en-us/sync/bb887608.aspx
http://en.wikipedia.org/wiki/Microsoft_Sync_Framework
Given your one-way flow of information and centralized layout, though, I expect you should have few, if any, issues setting it up and using it.
Be sure to report your experience back here!
-Adam