Our project has taken so long deciding to upgrade ClearCase from 7.1.2.6, which goes out of support in less than a year, that we are now looking at CC version 9.
Does anyone have any experience with, or knowledge of, potential pitfalls or issues in doing this?
We are using UCM CC with CQ and have seen one post describing a change in how that integration works, but is there anything else anyone has encountered?
Common stumbling blocks include:
VOB schema version. Since you're on 7.1 already, we at least know your VOB schema is supported.
Feature level. If any of your VOBs are not at least feature level 6, you need to update them to FL6 before upgrading CC (a note on the relevant cleartool commands follows this list). See About Feature Levels and ClearCase for more on the available feature levels.
Platform support. Between 7.1.x and 9.0, a number of older platforms were explicitly desupported, among them XP, Windows 2003, Solaris 10, Red Hat Linux 4 & 5, etc. You may have to update the OS, which may require a hardware update. You should review the ClearCase 9.0 Detailed System requirements to verify whether your server and client hosts need to be updated in order to support the new release.
Integration changes. If you are using older development tools, they may no longer integrate with the current ClearCase release.
User Interface changes. ClearCase 9 includes an Eclipse-based client called ClearTeam Explorer, which has some, but not necessarily all, of the functionality from CC Explorer, Project Explorer, the vtree browser, and some merge tools. Those older tools still work, largely the way they did before, warts and all.
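On the feature-level point specifically: from memory, cleartool describe -long vob:\vobtag reports the VOB family feature level, and cleartool chflevel -auto vob:\vobtag raises it to the highest level the installed release supports. Treat this as a sketch (vob:\vobtag is a placeholder for your own VOB tag) and verify the exact syntax against the cleartool reference for your release before touching production VOBs.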
There is a lot of work in 9.0 to make ClearCase more "thread-friendly" to allow for multithreading some server processes in subsequent releases. Some of this (view and albd improvements) should show up fairly soon in 9.0.1.
As with any large system migration, setting up a test environment is recommended so you can locate any issues with tool integrations before rolling the upgrade out in production.
When submitting an Excel add-in to the Office Store, which version of the Excel API should be referenced in the manifest file?
We have had submissions rejected because we didn't reference the newest version of the Excel API.
But if our Excel add-in supports an older version of the API, shouldn't we be referencing that instead?
There are several aspects to the versioning of the Office.js library.
First, there is versioning of the actual JavaScript source files. Fortunately, this part is pretty simple: you always want the latest and greatest of the production Office.js, which is conveniently obtainable through our CDN: https://appsforoffice.microsoft.com/lib/1/hosted/Office.js. The files are also shipped as a NuGet package to allow corporate firewalled development, but the NuGet may lag a few weeks behind the CDN -- and in any case, any Store-bound add-in is required to reference the CDN location. So, in terms of Office.js versions, there isn't really versioning there: there is simply the one and only evergreen, frequently-updated, always-back-compatible, Store-required CDN version.
(While on the subject of CDNs: we also have a beta CDN, available at https://appsforoffice.microsoft.com/lib/beta/hosted/Office.js. This one is great for testing newly-implemented-but-not-yet-officially-stamped-as-done features, which you'll find in our Open Specs: http://dev.office.com/reference/add-ins/openspec. However, any new APIs therein should be considered strictly "beta", and they may well be renamed, re-grouped, or postponed -- so your app should not rely on them, as we explicitly reserve the right to break back-compat on the beta branch. An API is not "done" until it is listed in IntelliSense and Documentation as complete, until it's available on the production CDN, and until isSetSupported returns true for it -- more on that momentarily.)
The more interesting bit for versioning is the set of actual API capabilities offered by each host. The Office.js library will have the latest JS code to be able to run them, but older hosts might not be able to support some of the functionality. For example, if you look at https://dev.office.com/reference/add-ins/requirement-sets/excel-api-requirement-sets, you will see that the 2016 wave of Excel APIs -- grouped under the "ExcelApi" requirement set -- has had three versions to date: 1.1, 1.2, and 1.3. ExcelApi 1.1 was what shipped with Office 2016 RTM in September 2015; 1.2 shipped in early March 2016; and 1.3 shipped just recently and is in the process of being rolled out to the CDN. Each API set version has a corresponding Office host version that supports it (for most APIs, there have to be both JS and host-side changes; it's fairly rare that something can be a purely-JS addition). The version numbers are listed in the table, and there are links below the table to find a mapping from build numbers to dates.
Each API set version contains a number of fairly large features, as well as incremental improvements to existing features. The topic for each requirement set, such as the one linked above, provides a detailed listing of those features. And as you're programming, if you are using the JavaScript or TypeScript IntelliSense, you should be able to see the API versions for each of your APIs displayed as part of the IntelliSense.
You can use the requirement sets in one of two ways. You can declare in the manifest that "I need API set ExcelApi 1.2, or else my add-in doesn't work at all" -- and that's fine, but then of course you aren't able to service older hosts, and so your add-in won't even show up there. Alternatively, if your add-in could mostly work in a 1.1 environment, but you want to light up additional functionality on newer hosts that support it, you can use the manifest to declare only the minimal API sets that you need (e.g., ExcelApi 1.1), and then do runtime checks for higher version numbers via the isSetSupported API. The question "Neat ways to get environment (i.e. Office version)" has a very detailed explanation of this isSetSupported API-checking.
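To make the runtime check concrete, here is a minimal sketch (it assumes Office.js has already loaded; enableNewFeatures and enableFallbackBehavior are hypothetical stand-ins for your own code, while isSetSupported itself is the real Office.js API):

// Light up ExcelApi 1.2 functionality only where the host supports it.
if (Office.context.requirements.isSetSupported("ExcelApi", 1.2)) {
    // Safe to use ExcelApi 1.2 features here.
    enableNewFeatures();
} else {
    // Stick to the ExcelApi 1.1 baseline declared in the manifest.
    enableFallbackBehavior();
}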
Hope this helps!
I'm in the process of upgrading an existing application to .NET Core (DNX SDK 1.0.0-rc1-update2) that uses SQL Server's FILESTREAM feature for reading/writing large BLOBs to the database. It uses the SqlFileStream class to achieve this; however, that class doesn't appear to be available in .NET Core. Here are my references in project.json:
"frameworks": {
"net451": {
"frameworkAssemblies": {
"System.Runtime": "4.0.10.0",
"System.Collections": "4.0.0.0"
}
},
"dotnet5.4": {
"dependencies": {
"Microsoft.CSharp": "4.0.1-beta-23516",
"System.Data.Common": "4.0.1-beta-23516",
"System.Data.SqlClient": "4.0.0-rc2-23623",
"System.Collections": "4.0.11-beta-23516",
"System.IO.FileSystem": "4.0.1-beta-23516",
"System.Linq": "4.0.1-beta-23516",
"System.Runtime": "4.0.21-beta-23516",
"System.Threading": "4.0.11-beta-23516"
}
}
}
I've tried searching SO and Google, both of which have absolutely nothing on the subject.
Can someone please confirm whether it's actually unavailable, or whether it's in another package I'm unaware of?
I realize the question is old, but I just came across the issue -- implementing SqlFileStream -- listed on the GitHub repo for CoreFX (the .NET Core foundational libraries) and thought I'd mention it here. Here's a link to the issue, for reference: https://github.com/dotnet/corefx/issues/15652
To recap: the issue is implementing SqlFileStream. It's currently open, but not on the horizon anytime soon. One of the contributors states: "if there are any Windows specific dependencies, we may not bring it in Core."
I've actually been interested in this for a while and have taken some time to dig into it over the last few days.
Unfortunately, FILESTREAM uses several NTFS and NT specific system calls (NtCreateFile, DeviceIoControl in particular, but a few others to support those as well) to manage access to the file. Also, unfortunately, as of this writing the latest MSSQL CTPs for Linux don't support FILESTREAM, and there's little clarity as to whether that's on the roadmap or where it might be (strangely, you can restore a database that supports FILESTREAM but FileTable doesn't seem to be supported).
There are two problems here: it's not clear that replacing the NT specific APIs would respect transactional integrity (or even work at all), and it's not clear that they could ever work from a non-Windows environment anyway. Because of these facts, I don't see SqlFileStream being supported any time in the near future for core.
There is some precedent for this kind of Windows-only low-level functionality, for example Socket.IOControl in System.Net.Sockets. SqlFileStream could perhaps take a similar path. Alternatively, it might be possible to build a specific SqlFileStream NuGet package, but have it only be supported/runnable on Windows. I'm not sure how valuable this would be, though - if you're going to P/Invoke in a Windows-only way to begin with, why not just P/Invoke to a .NET 4.6.x DLL?
Cross posting this to the github issue: https://github.com/dotnet/corefx/issues/15652
Edit: As an alternative to P/Invoke, you could certainly create some other kind of service (RESTful, WCF, some other pipe or TCP or even memory mapped file) in .NET 4.x for a .NET Core library or application to access.
I cannot access ANY database by ANY means from within Delphi XE Professional. What I mean by accessing the database is:
having the live database appear via components in the Object Inspector when the Connected property is set to True
using the Data Explorer to create and explore database connections
EDIT:
SORTED !!
The core problem was the communication into the database itself - specifically, the communication generated by the IDE and by any code built using the drivers.
Confounding and masking sub-issues were:
Missing or misplaced DLLs.
The Data Explorer does not fully support dbExpress drivers.
There is a bug within the 2009 IDE code, found by Chee-Yang Chau when writing the dbxFirebird driver, which limits static linking of drivers into Delphi. It is not known if this bug extends to 2010 or XE.
When using the Object Inspector, it is easy to cause the IDE to revert changed connection parameters to their default values.
Some drivers had incorrect default values (e.g. assuming the client DLL was always gds32.dll, irrespective of whether the database was InterBase or Firebird).
Installation of two versions of InterBase led to some clashes in database communication - server names generated by the tools were odd, and the view of the databases depended on which installation of the InterBase tools was used.
The available documentation is of varying dates and refers to different versions; as a result it often appears contradictory.
END EDIT:
Approaches tried:
Multiple databases
Multiple different drivers/components
Accessing the database through other external tools, such as IBSQL and Flame Robin.
Raising questions (here and here) on SO.
Raising questions on the support forums for Firebird, Embarcadero, and Flame Robin.
Environment:
OS: Windows 7 Ultimate 64-bit
Delphi Embarcadero® RAD Studio XE Professional Version 15.0.3953.35171
Database: W1-V2.5.0.26074 Firebird 2.5 (64 bit)
Connection technology: dbExpress
Delphi Professional does not support Firebird with the native dbExpress drivers that come with Delphi. You need the Enterprise or Architect version of Delphi in order for the native Firebird dbExpress driver to work.
I have Delphi 2010 Professional. I didn't want to spend the extra money on the E or A version and I failed to read the feature matrix to see that the Firebird dbExpress driver is not available with the Pro version.
I have found a few really nice videos that show how to connect to Firebird using Delphi. However, when I tried to follow along with my Professional version, nothing worked.
Shame on me and more shame on Embarcadero for touting that Delphi supports Firebird in big bold print but not mentioning that you need the Enterprise or Architect version except in the tiny fine print.
I can now write code to access Firebird within the IDE. I have (limited, but sufficient) access to the drivers within the IDE. Specifically, the drivers appear in the Data Explorer, which can be used to generate default values for the SQLConnection (dbExpress component). These can be accessed and used within the Object Inspector. The workaround to the IDE bug quoted below is necessary to ensure the communication parameters are correct. When writing database code, it is necessary to compile in the source for the dbExpress driver.
The following code is the minimum, with minimum parameter set, necessary to establish and test a database connection:
unit Unit2;

interface

uses
  Classes, SqlExpr, Dialogs, dbxDevartInterbase;

var
  SQLConnection1: TSQLConnection;

implementation

// Unit initialization section; note there is no {$R *.dfm} directive,
// since this unit has no form attached.
begin
  SQLConnection1 := TSQLConnection.Create(nil);
  with SQLConnection1 do
  begin
    ConnectionName := 'TestConnection';
    DriverName := 'DevartInterBase';
    LibraryName := 'dbexpida40.dll';           // dbExpress driver DLL
    VendorLib := 'fbclient.dll';               // Firebird client library
    GetDriverFunc := 'getSQLDriverInterBase';  // driver entry point
    Params.Clear;
    Params.Add('User_Name=SYSDBA');
    Params.Add('Password=masterkey');
    Params.Add('Database=localhost:C:\Program Files\Firebird\Firebird_2_5\examples\empbuild\employee.fdb');
    Open;  // attempt the connection
    if Connected then
      ShowMessage('Connection is active');
    Free;
  end;
end.
The workaround, courtesy of Bob Swart on one of the CodeGear forums, is:
The trick is to select a ConnectionName value, which will then assign a value to the Driver property and all other properties like LibraryName, VendorLib and GetDriverFunc.
Then, make changes - if needed - to the subproperties of the Driver property, and finally clear the name of the Driver property.
This will leave all your changes in the Params list (which you can also manually edit if you wish).
Note: leave the ConnectionName set - if you clear that one, the parameters will be cleared again.
Now you can compile your application and deploy it without the need for dbxdrivers.exe or dbxconnections.ini (but you need to deploy the DLLs specified in the LibraryName and VendorLib, of course).
Also make sure to set LoginPrompt to False and leave LoadParamsOnConnect set to False, too.
BOUNTY AWARD
I have awarded the bounty to this answer as it was the one that pointed me away from investigation of the IDE, its installation and configuration, to investigation of the connection into the database.
END
Don't know about Firebird 64-bit - no experience. But I've always had a lot of trouble with dbExpress. Never any problem with the included IB components suite. But there is a lot of confusion with IB versions...
But IMO you're best served using the ADO ('dbGo') components as opposed to any proprietary IB or Delphi-specific drivers. What you need is an ADO provider for IB, available at:
http://www.ibprovider.com/eng/ - and as others have said, avoid using localhost; use 127.0.0.1, or better still, determine the true IP address of your workstation (ping machine-name...). And are you sure you don't have some kind of firewall or intrusion protection that may be involved?
You also need to make sure that your IB connection is configured properly - local or TCP - and no, don't use quotation marks for your names, passwords, etc. The error message you got seems to indicate that you're trying to connect via TCP and it's not properly configured. What happened between the time it worked and the time it didn't work? Shut down Delphi? Reboot the machine? Explain please...
No 'special permissions' are needed - you simply need to ensure that your database server and client are properly installed and configured. In terms of functionality you can do everything with the pro version - just that the drivers etc aren't included in the package.
Again, IMO go for ADO and you'll never look back.
HTH,
MNG
Have you tried Paradox via the Borland Database Engine (BDE) and related components: TTable, TQuery, TStoredProc, TDatabase and TSession?
If memory serves me correctly, at least as far back as Delphi 3, the distinguishing factor between the "professional" and the "higher-level" editions has been the type of database development available "out of the box".
In Delphi 1, the BDE was the only way to do out-the-box database development.
Delphi 2 permitted a custom database layer by abstracting parts of the database component hierarchy.
Delphi 3 Professional provided BDE and drivers for file-based databases and Interbase.
One level up (Enterprise?*) they provided BDE drivers for typical client-server database access: SQL Server, Sybase, Interbase, Oracle,... (and native drivers for Interbase)
Another level up (Architect?*) introduced multi-tier development with Midas. Unfortunately, Borland took a step back with Midas, because the multi-tier components were again hard-wired to the BDE. (This was resolved in Delphi 4.)
?* Please note, I may be mistaken about the exact naming of these editions. Around about that time I formed the opinion that Borland was merely coming up with "grander" names in order to charge more for features that didn't really offer as much benefit as the 'big-cheque-writing-CIOs' came to believe - leaving developers to deal with the fallout. (Yes, I have battle-scars from Midas I.)
Rant aside, the theory was....
If one embarked on entry-level database development, you would purchase Delphi ?? Professional. Develop your system against a file-based database or Interbase via the BDE.
If you later needed to scale-up: you would upgrade Delphi, purchase your chosen SQL RDBMS, switch your connectivity via the TDatabase component, and apply the few necessary tweaks.
NOTE: In Delphi 3, you could switch to native InterBase (personally not recommended) or use third-party components for non-Midas development. From Delphi 4 up, ADO and dbExpress started receiving more attention, and nowadays the BDE seems to be pretty much forgotten.
Of course, theory & practice seldom frequent the same pubs. However, with a few cautionary pointers, you should be able to develop a significant file-based solution that can be upgraded relatively painlessly.
Keep your business logic out of the database. This is quite possibly the biggest and most frequently encountered error. Huge chunks of systems are often written in triggers and stored procedures, making it more difficult to maintain or migrate a system.
Avoid platform-specific database techniques. This should go without saying, but if you don't explicitly look out for them, you will encounter problems.
Particularly relevant to file-based database systems: many support special locking mechanisms - avoid them! They don't scale well to large multi-user systems in any case.
Generation of artificial keys often varies by platform: generators, IDENTITY columns, and how you retrieve the new value.
Plan your system for large volumes of data. Identify the high-transaction tables, and avoid using uncontrolled retrieval of all records. I'd also avoid the TTable in this situation - BDE does a lot of interesting background things with TTable, and behaviour can vary according to driver and platform.
Disclaimer: All this was a long time ago, so some of the details may be a bit sketchy.
Disclaimer2: I don't have any experience with Delphi XE specifically. I currently use D5 professionally, and D2009 in my personal capacity.
Since I've not done this before, I am not sure if the way I am planning to do this is okay, or whether there is a better way - like using Windows Installer, InstallShield, or the Windows Installer XML (WiX) toolset. Any help would be great, as I have no clue.
We have a product and we ship a new version every few months. So far we've only been rolling out complete versions, i.e. either version 1.0 or version 1.5, but no upgrades from 1.0 to 1.2 to 1.3 to... you get the picture, right! So any customer that got version 1.0 cannot upgrade to version 1.2 or 1.3 or even the latest; they have to uninstall the old version and install the latest version. This is not right, but that's what we could do until now. We'd like to change it.
My plan is to have an install file with SQL scripts for each upgrade path: check the table in the database that stores the version info and, depending on it, run a different script to upgrade the database.
My concern is that this method may not be scalable once we have more than 5 or 6 different versions.
If you could point to any articles or books on this topic, that would help a lot too.
Also, could we use Windows Installer or InstallShield for this?
thanks,
_UB
We've been using DBGhost for a year or so now to keep our database under source control along with our codebase, and it makes this kind of thing dead easy. It's not just well thought through, but they've been using it to roll out their own code for years, so it's dead solid.
Your problem is a pretty common one, and I've had to deal with this kind of problem at my last job. There is another tool, aside from the Red Gate tool, that may help you do what you need: DB Ghost. They explicitly address the versioning problem, and have a packager as well. I would suggest doing a trial of the DB Ghost product because they make some interesting claims concerning multiple version upgrades. This was taken from their FAQ (http://www.innovartis.co.uk/faqs/faqs.aspx):
Q: Our problem is going to be managing data structure changes during upgrades. Our product line is shrink-wrapped, or downloadable from the website. So when a user downloads an upgrade, they can be upgrading from a very recent version, with few database structure changes, or the upgrade may be from a very old version with a multitude of structural changes. One upgrade needs to manage it all. The user would be offsite, so we can't hold their hand. We have users in Greece, Australia, Malaysia, Norway, etc. How would DB Ghost, if at all, handle updates in remote locations?
A: The DB Ghost Packager Plus product was designed to specifically address this issue, as it can dynamically handle the required updates to a target database seamlessly.
I'm just mentioning this because our company is trying to do something similar and I was doing research on this tool.
Thanks,
Eric
Do you insist on doing it yourself, or could you see yourself committing and investing in a tool?
I really like the idea of Red Gate's SQL Packager, which will "diff" your two database versions and then create a SQL script, a C# project, or a stand-alone executable to upgrade from version 1 to version 2.
I'm not 100% sure how you'd be able to upgrade from 1.0, 1.1, 1.2, or 1.3 all to 2.0 - check out their website and see if they offer something for that scenario!
Otherwise, I guess it'll get quite thorny and messy...
Marc
In the Rails world they are using a tool/method called Migrations.
Basically it boils down to creating a small SQL script to upgrade and downgrade each little change to the database.
When you are testing the application, you migrate your database to the version you want, and on deployment the application can check what version it needs and migrate to that version.
There are free migration toolkits for most popular languages, though they might be part of some MVC framework.
A nice side effect of migrations is that you have database source code that is easily stored in your source control repository.
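To make the pattern concrete, here is a minimal migration-runner sketch in TypeScript (the language is incidental - the same shape works anywhere). The schema_version table, the migrations array, and the runSql stub are illustrative assumptions, not part of any particular toolkit:

// Each entry upgrades the schema by exactly one version.
type Migration = { version: number; up: string };

const migrations: Migration[] = [
  { version: 1, up: "CREATE TABLE customer (id INT PRIMARY KEY, name VARCHAR(100))" },
  { version: 2, up: "ALTER TABLE customer ADD email VARCHAR(200)" },
];

// Stub: wire this to your real database driver; here it only logs.
async function runSql(sql: string): Promise<any[]> {
  console.log("executing:", sql);
  return [];
}

// Read the deployed version from the version table (0 if empty).
async function currentVersion(): Promise<number> {
  const rows = await runSql("SELECT MAX(version) AS v FROM schema_version");
  return rows[0]?.v ?? 0;
}

// Apply, in order, every migration newer than the deployed version,
// recording each one so the next run starts where this one stopped.
async function migrate(): Promise<void> {
  const deployed = await currentVersion();
  const pending = migrations
    .filter(m => m.version > deployed)
    .sort((a, b) => a.version - b.version);
  for (const m of pending) {
    await runSql(m.up);
    await runSql(`INSERT INTO schema_version (version) VALUES (${m.version})`);
  }
}

migrate().catch(console.error);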
Can someone recommend a source control product that does all of the following:
Seamless integration into VS 2008 Pro
Will allow me to create different "editions" of a program (like "express" and "pro") - maybe with branching?
Will allow me to track changes for specific client requests. Say I have four clients, 2 on express, 2 on pro. I would be able to create specific, customized changes for all clients while still maintaining a singular codebase.
I'm not sure if something like VisualSVN can handle this, but there must be a product out there.
Virtually every source control system will satisfy the #2 and #3 requirements with branches.
For #1 it's trickier. If you really want Seamless integration (capital S), then Team Foundation Server is your only choice. (It's very expensive.)
Otherwise virtually all the major source control systems will have some sort of VS plugin, but the plugin usually doesn't work very well.
The two most popular free source control systems are:
Subversion
git
The best way to create different editions of your software using the same code in all of the different versions is to use pre-processor directives to conditionally compile your software based on flags that you set.
For information on conditional compilation please see the following links:
.NET: http://msdn.microsoft.com/en-us/library/9ae6e432%28VS.71%29.aspx
Java: http://weblogs.java.net/blog/schaefa/archive/2005/01/how_to_do_condi_1.html
C++: http://www.devarticles.com/c/a/Cplusplus/C-plus-plus-Preprocessor-The-Code-in-the-Middle/3/
I hope this answers your question. I use this a lot when developing different versions of applications for different platforms.
An example of this is an application that I developed in C# for both a server and a mobile-device implementation. Each had different ways of calling functions in the .NET libraries, but the logic was the same, so I used preprocessor conditional compilation to compile the correct code for each platform but leave the logic intact.
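For what it's worth, build-time flags give a rough analog in the JavaScript/TypeScript world too. This is only a sketch: it assumes a bundler define/replace step (e.g. webpack's DefinePlugin or esbuild's --define), and EDITION is an invented flag name:

// EDITION is substituted at build time by the bundler; dead branches are
// then stripped from the output, much like code excluded by #if/#endif.
declare const EDITION: "express" | "pro";

export function availableFeatures(): string[] {
  const features = ["core-editing"];
  if (EDITION === "pro") {
    // Shipped only in the "pro" build output.
    features.push("reporting", "multi-user");
  }
  return features;
}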
From experience, you only need integration with Visual Studio if you need to check out a file before editing it (a la SourceSafe) and the file is read-only until then.
Having used SourceSafe, I went on to using SVN and absolutely never looked back. Then I switched to git and again never looked back at SVN or SourceSafe.
I can't comment on Team Foundation source control or Mercurial; I've never used those. At this stage I would recommend git over SVN, as it's more suited to working with a single source tree that has minor changes between lots of branches. You can do the same thing with SVN, but I found the process of switching the working copy to another branch painful.
Team Foundation Server provides the best seamless integration with VS 2008, but of course it's not free (I agree that it's very expensive).
Have you tried using AnkhSVN? It's got pretty good integration between VS 2008 and SVN. So far it gives me the VS-SVN integration that I need, so you might want to check it out and see if it fits your needs.
You can use TortoiseSVN, but I suggest installing CollabNet's SVN server, because AnkhSVN integrates seamlessly with it, plus you don't have to worry about major installations.
It's only three months until VS 2010 is in final release (March 22, 2010). For MSDN subscribers, TFS will be integrated into Visual Studio (all levels except Express). MSDN subscriptions that include Visual Studio (any level) will include TFS with a one-seat license. TFS 2010 will run on Vista or Windows 7. SharePoint is no longer required, but you still need it if you want 100% of TFS features, like reporting.
It's all available now in beta; I'm running TFS on my laptop.