I'm in the process of upgrading an existing application to .NET Core (DNX SDK 1.0.0-rc1-update2) that uses SQL Server's FILESTREAM feature for reading/writing large BLOBs to the database. It uses the SqlFileStream class to achieve this; however, that class doesn't appear to be available in .NET Core. Here are my references in project.json:
"frameworks": {
"net451": {
"frameworkAssemblies": {
"System.Runtime": "4.0.10.0",
"System.Collections": "4.0.0.0"
}
},
"dotnet5.4": {
"dependencies": {
"Microsoft.CSharp": "4.0.1-beta-23516",
"System.Data.Common": "4.0.1-beta-23516",
"System.Data.SqlClient": "4.0.0-rc2-23623",
"System.Collections": "4.0.11-beta-23516",
"System.IO.FileSystem": "4.0.1-beta-23516",
"System.Linq": "4.0.1-beta-23516",
"System.Runtime": "4.0.21-beta-23516",
"System.Threading": "4.0.11-beta-23516"
}
}
}
I've tried searching SO and Google, both of which have absolutely nothing on the subject.
Can someone please confirm if it's actually unavailable or if it's in another package I'm unaware of?
I realize the question is old, but I just came across the issue - implementing SqlFileStream - listed on the GitHub repo for CoreFX (the .NET Core foundational libraries) and thought I'd mention it here. Here's a link to the issue, for reference: https://github.com/dotnet/corefx/issues/15652
To recap: The issue is implementing SqlFileStream. It's currently an open issue, but not on the horizon anytime soon. One of the contributors states "if there are any Windows specific dependencies, we may not bring it in Core."
I've actually been interested in this for a while and have spent some time digging into it over the last few days.
Unfortunately, FILESTREAM uses several NTFS- and NT-specific system calls (NtCreateFile and DeviceIoControl in particular, plus a few others to support those) to manage access to the file. Also unfortunately, as of this writing the latest MSSQL CTPs for Linux don't support FILESTREAM, and there's little clarity as to whether it's on the roadmap or where it might land (strangely, you can restore a database that uses FILESTREAM, but FileTable doesn't seem to be supported).
There are two problems here: it's not clear that replacing the NT-specific APIs would respect transactional integrity (or even work at all), and it's not clear that they could ever work from a non-Windows environment anyway. Because of this, I don't see SqlFileStream being supported in .NET Core any time in the near future.
There is some precedent for Windows-only low-level functionality, for example System.Net.Sockets.Socket.IOControl. SqlFileStream could perhaps take a similar path. Alternatively, it might be possible to build a specific SqlFileStream NuGet package, but have it only be supported/runnable on Windows. I'm not sure how valuable this would be, though - if you're going to P/Invoke in a Windows-only way to begin with, why not just P/Invoke to a .NET 4.6.x DLL?
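To give a sense of the Windows-only interop involved, this is roughly what a P/Invoke declaration for one of the calls mentioned above (DeviceIoControl) looks like in C#. It's only an illustrative sketch of the kind of declaration such a package would need, not SqlFileStream's actual internal code:

using System;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;

internal static class NativeMethods
{
    // One of the Win32/NT calls that FILESTREAM access relies on; there is no
    // equivalent of this on non-Windows platforms.
    [DllImport("kernel32.dll", SetLastError = true)]
    internal static extern bool DeviceIoControl(
        SafeFileHandle hDevice,
        uint dwIoControlCode,
        IntPtr lpInBuffer,
        uint nInBufferSize,
        IntPtr lpOutBuffer,
        uint nOutBufferSize,
        out uint lpBytesReturned,
        IntPtr lpOverlapped);
}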
Cross posting this to the github issue: https://github.com/dotnet/corefx/issues/15652
Edit: As an alternative to P/Invoke, you could certainly create some other kind of service (RESTful, WCF, some other pipe or TCP or even memory mapped file) in .NET 4.x for a .NET Core library or application to access.
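For example, a minimal sketch of the .NET 4.x side of such a service might look like the following. It uses the usual SqlFileStream pattern (PathName() plus GET_FILESTREAM_TRANSACTION_CONTEXT() inside a transaction); the dbo.Documents table and FileData column are made-up names for illustration:

using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

public static class FileStreamBlobReader
{
    // Reads a FILESTREAM BLOB for the given id and returns it as a byte array,
    // which the service can then hand back to a .NET Core caller over HTTP/WCF/etc.
    public static byte[] ReadDocument(string connectionString, int id)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                string path;
                byte[] txContext;

                using (var cmd = new SqlCommand(
                    "SELECT FileData.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() " +
                    "FROM dbo.Documents WHERE Id = @id", conn, tx))
                {
                    cmd.Parameters.AddWithValue("@id", id);
                    using (var reader = cmd.ExecuteReader())
                    {
                        reader.Read();
                        path = reader.GetString(0);
                        txContext = (byte[])reader[1];
                    }
                }

                byte[] data;
                using (var fs = new SqlFileStream(path, txContext, FileAccess.Read))
                using (var ms = new MemoryStream())
                {
                    fs.CopyTo(ms);       // stream the BLOB out of the NTFS store
                    data = ms.ToArray();
                }

                tx.Commit();
                return data;
            }
        }
    }
}

A .NET Core client would then call the service over whatever transport you expose, instead of touching SqlFileStream directly.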
In short, I have Windows Server 2012 R2, AEM Forms (6.2), SQL Server (2014), and Workbench (6.2) on the same server. When I first installed and configured all of them, I could check my applications out and in from Workbench successfully. However, after my software team executed some scripts against the database, we can no longer check in/out from Workbench. The worst part is that when I click check out, Workbench gives no error and no log entry in the event log or the server application; it simply does nothing and doesn't perform my transaction. I've seen on forums that some people have the same issue, but nobody has posted a solution.
If anyone knows the solution, please share it with us. What's wrong with my Workbench, and what should I do to fix this issue?
The query that your software team ran turns off security on every single LiveCycle service and makes them run as the system user. This includes the services used by Workbench and is very bad. Some of the services rely on knowing who is logged in to operate correctly. In particular, how can LiveCycle know who has checked in/out a resource if the service always runs as system?
Your best bet is to restore the LiveCycle database - or at least the tb_sc_service_configuration table - to the state it was in before the script was run.
If you need to remove security on individual services, you should do it through the admin console, but only do it for your own processes. Never do it for system services unless the Adobe documentation says it is OK.
As JeremyP pointed out, modifying the Adobe database directly is a bad idea. The database should be treated as a black box that is only manipulated by Adobe code (either by doing things in the Adobe tools or making calls to Adobe APIs).
You can either make security changes manually through the adminui (as he indicates, which is the most common way of doing it) or programmatically using the Adobe client APIs. See the following links for sample code that uses the APIs:
Removing Security - http://help.adobe.com/en_US/livecycle/10.0/ProgramLC/WS624e3cba99b79e12e69a9941333732bac8-7f35.html
Setting the runAs user - http://help.adobe.com/en_US/livecycle/10.0/ProgramLC/WS624e3cba99b79e12e69a9941333732bac8-7f38.html
My company, 4Point, offers AEM Forms consulting services. We have an in-house Apache Ant library that wraps the code above to automate this (and other) common tasks that are typically required when deploying (and redeploying) AEM Forms solutions. It can be included as part of a consulting engagement.
I'm trying to build a RESTful internal web server at work using node.js, where I'm currently restricted to using a Windows 2003 Server.
I've hit a stumbling block with regard to database support, however. Are there any bindings currently available for reading from and writing to SQLite, PostgreSQL, or MySQL on Windows-based machines?
Mariano has mentioned Windows support in the future in these comments, but ideally I'd like to use something available just now as a proof of concept.
I'm the author of mysql-native.
Both the official driver (felixge's node-mysql) and mine have been used successfully under Windows.
I use and develop it under Linux and Windows about 50/50. Feel free to contact me if you have any questions.
Have you tried mysql-native? It's a native (pure JavaScript) MySQL driver, which means there are no other dependencies, so it should run on any platform Node supports. It seems to be actively maintained as well, and has some examples to show you how to use the library. Link: https://github.com/sidorares/nodejs-mysql-native
I cannot access ANY database by ANY means from within Delphi XE Professional. What I mean by accessing the database is:
having the live database appear via components in the Object Inspector, when the Connected property is set to true
using the Data Explorer to create and explore database connections
EDIT:
SORTED !!
The core problem was that communication with the database - specifically, the communication generated by the IDE and by any code built using the drivers - was problematic.
Confounding and masking sub-issues were:
Missing or misplaced DLLs.
The Data Explorer does not fully support dbExpress drivers.
There is a bug within the 2009 IDE code, found by Chee-Yang Chau when writing the dbxFirebird driver, which limits static linking of drivers into Delphi. It is not known if this bug extends to 2010 or XE.
When using the Object Inspector, it is easy to cause the IDE to revert changed connection parameters to their default values.
Some drivers had incorrect default values (e.g. assuming the client DLL was always gdb32.dll, irrespective of whether the database was Interbase or Firebird).
Installation of two versions of Interbase led to some clashes in database communication - server names generated by the tools were odd, and the view of the databases depended on which installation of the Interbase tools was used.
The documentation available is of varying dates; refers to different versions; and as a result often appears contradictory.
END EDIT:
Approaches tried:
Multiple databases
Multiple different drivers/components
Accessing the database through other external tools, such as IBSQL and Flame Robin.
Raising questions (here and here) on SO.
Raising questions on the support forums for Firebird, Embarcadero, and Flame Robin.
Environment:
OS: Windows 7 Ultimate 64-bit
Delphi Embarcadero® RAD Studio XE Professional Version 15.0.3953.35171
Database: W1-V2.5.0.26074 Firebird 2.5 (64 bit)
Connection technology: dbExpress
Delphi Professional does not support Firebird with the native dbExpress drivers that come with Delphi. You need the Enterprise or Architect version of Delphi in order for the native firebird dbExpress driver to work.
I have Delphi 2010 Professional. I didn't want to spend the extra money on the E or A version and I failed to read the feature matrix to see that the Firebird dbExpress driver is not available with the Pro version.
I have found a few really nice videos that show how to connect to Firebird using Delphi. However, when I tried to follow along with my Professional version, nothing worked.
Shame on me and more shame on Embarcadero for touting that Delphi supports Firebird in big bold print but not mentioning that you need the Enterprise or Architect version except in the tiny fine print.
I can now write code to access Firebird within the IDE. I have (limited, but sufficient) access to the drivers within the IDE. Specifically, the drivers appear in the Data Explorer, which can be used to generate default values for the SQLConnection (dbExpress component). These can be accessed and used within the Object Inspector. The workaround to the IDE bug quoted below is necessary to ensure the communication parameters are correct. When writing database code, it is necessary to compile in the source for the dbExpress driver.
The following code is the minimum, with minimum parameter set, necessary to establish and test a database connection:
unit Unit2;

interface

uses
  Classes, SqlExpr, Dialogs, dbxDevartInterbase;

var
  SQLConnection1: TSQLConnection;

implementation

// Initialization section: create the connection, set the minimum parameters,
// open it and report success.
begin
  SQLConnection1 := TSQLConnection.Create(nil);
  with SQLConnection1 do
  begin
    ConnectionName := 'TestConnection';
    DriverName := 'DevartInterBase';
    LibraryName := 'dbexpida40.dll';
    VendorLib := 'fbclient.dll';
    GetDriverFunc := 'getSQLDriverInterBase';
    Params.Clear;
    Params.Add('User_Name=SYSDBA');
    Params.Add('Password=masterkey');
    Params.Add('Database=localhost:C:\Program Files\Firebird\Firebird_2_5\examples\empbuild\employee.fdb');
    Open;
    if Connected then
      ShowMessage('Connection is active');
    Free;
  end;
end.
The workaround, courtesy of Bob Swart on one of the CodeGear forums, is:
The trick is to select a ConnectionName value, which will then assign a value to the Driver property and all other properties like LibraryName, VendorLib and GetDriverFunc.
Then, make changes - if needed - to the subproperties of the Driver property, and finally clear the name of the Driver property.
This will leave all your changes in the Params list (which you can also manually edit if you wish).
Note: leave the ConnectionName set - if you clear that one, the parameters will be cleared again.
Now you can compile your application and deploy it without the need for dbxdrivers.exe or dbxconnections.ini (but you need to deploy the DLLs specified in the LibraryName and VendorLib, of course).
Also make sure to set LoginPrompt to False and leave LoadParamsOnConnect set to False, too.
BOUNTY AWARD
I have awarded the bounty to this answer as it was the one that pointed me away from investigation of the IDE, its installation and configuration, to investigation of the connection into the database.
END
I don't know about Firebird 64-bit - no experience with it. But I've always had a lot of trouble with dbExpress, and never any problem with the included IB components suite. But there is a lot of confusion with IB versions...
But IMO you're best served using the ADO ('DBGo') components as opposed to any proprietary IB or Delphi-specific drivers. What you need is an ADO provider for IB, available at:
http://www.ibprovider.com/eng/ - and as others have said, avoid using localhost; use 127.0.0.1, or better still, determine the true IP address of your workstation (ping machine-name...). And are you sure you don't have some kind of firewall or intrusion protection that may be involved?
You also need to make sure that your IB connection is configured properly - local or TCP - and no, don't use quotation marks for your names, passwords, etc. The error message you got seems to indicate that you're trying to connect via TCP and it's not properly configured. What happened between the time it worked and the time it didn't? Did you shut down Delphi? Reboot the machine? Please explain...
No 'special permissions' are needed - you simply need to ensure that your database server and client are properly installed and configured. In terms of functionality you can do everything with the Pro version - it's just that the drivers etc. aren't included in the package.
Again, IMO go for ADO and you'll never look back.
HTH,
MNG
Have you tried Paradox via the Borland Database Engine (BDE) and the related components: TTable, TQuery, TStoredProc, TDatabase and TSession?
If memory serves me correctly, at least as far back as Delphi 3, the distinguishing factor between "Professional" and the "higher-level" editions has been the type of database development available "out of the box".
In Delphi 1, the BDE was the only way to do out-the-box database development.
Delphi 2 permitted a custom database layer by abstracting parts of the database component hierarchy.
Delphi 3 Professional provided BDE and drivers for file-based databases and Interbase.
One level up (Enterprise?*) they provided BDE drivers for typical client-server database access: SQL Server, Sybase, Interbase, Oracle,... (and native drivers for Interbase)
Another level up (Architect?*) introduced multi-tier development with Midas. Unfortunately, Borland took a step back with Midas, because the multi-tier components were again hard-wired to the BDE. (This was resolved in Delphi 4.)
?* Please note, I may be mistaken about the exact naming of these editions. Around about that time I formed the opinion that Borland was merely coming up with "grander" names in order to charge more for features that didn't really offer as much benefit as the 'big-cheque-writing-CIOs' came to believe - leaving developers to deal with the fallout. (Yes, I have battle-scars from Midas I.)
Rant aside, the theory was....
If one embarked on entry-level database development, you would purchase Delphi ?? Professional. Develop your system against a file-based database or Interbase via the BDE.
If you later needed to scale-up: you would upgrade Delphi, purchase your chosen SQL RDBMS, switch your connectivity via the TDatabase component, and apply the few necessary tweaks.
NOTE: In Delphi 3, you could switch to native Interbase (personally not recommended) or use third-party components for non-Midas development. From Delphi 4 up, ADO and DevExpress started receiving more attention, and nowadays the BDE seems to be pretty much forgotten.
Of course theory & practice seldom frequent the same pubs. However, with a few cautionary pointers, you should be able to develop a significant file based solution that can be upgraded relatively painlessly.
Keep your business logic out of the database. This is quite possibly the biggest and most frequently encountered error. Huge chunks of systems are often written in triggers and stored procedures, making it more difficult to maintain or migrate a system.
Avoid platform-specific database techniques. This should go without saying, but if you don't explicitly look out for them, you will encounter problems.
Particularly relevant to file based database systems, many support special locking mechanisms - avoid them! They don't scale well to large multi-user systems in any case.
Generation of artificial keys often varies by platform: generators, IDENTITY columns, and how you retrieve the new value.
Plan your system for large volumes of data. Identify the high-transaction tables, and avoid using uncontrolled retrieval of all records. I'd also avoid the TTable in this situation - BDE does a lot of interesting background things with TTable, and behaviour can vary according to driver and platform.
Disclaimer: All this was a long time ago, so some of the details may be a bit sketchy.
Disclaimer2: I don't have any experience with Delphi XE specifically. I currently use D5 professionally, and D2009 in my personal capacity.
Can someone recommend a source control product that does all of the following:
Seamless integration into VS 2008 Pro
Will allow me to create different "editions" of a program (like "express" and "pro") - maybe with branching?
Will allow me to track changes for specific client requests. Say I have four clients, 2 on express, 2 on pro. I would be able to create specific, customized changes for all clients while still maintaining a singular codebase.
I'm not sure if something like VisualSVN can handle this, but there must be a product out there.
Virtually every source control system will satisfy requirements #2 and #3 with branches.
For #1 it's more tricky. If you really want a Seamless integration (capital S) then Team Foundation Server is your only choice. (It's very expensive)
Otherwise virtually all the major source control systems will have some sort of VS plugin, but the plugin usually doesn't work very well.
The two most popular free source control systems are:
Subversion
git
The best way to create different editions of your software using the same code in all of the different versions is to use pre-processor directives to conditionally compile your software based on flags that you set.
For information on conditional compilation please see the following links:
.NET: http://msdn.microsoft.com/en-us/library/9ae6e432%28VS.71%29.aspx
Java: http://weblogs.java.net/blog/schaefa/archive/2005/01/how_to_do_condi_1.html
C++: http://www.devarticles.com/c/a/Cplusplus/C-plus-plus-Preprocessor-The-Code-in-the-Middle/3/
I hope this answers your question. I use this a lot when developing different versions of applications for different platforms.
An example of this is an application that I developed in C# for both a server and a mobile device implementation. Each had different ways of calling functions in .NET libraries, but the logic was the same, so I used preprocessor conditional compilation to compile the correct code for each platform while leaving the logic intact.
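As a minimal sketch of that idea in C# - the EXPRESS and PRO symbols here are just examples, which you would define per build configuration (Project Properties > Build > "Conditional compilation symbols"):

using System;

public static class EditionInfo
{
    // Returns the edition name that this particular build was compiled for.
    public static string Name
    {
#if PRO
        get { return "Pro"; }       // built with the PRO symbol defined
#elif EXPRESS
        get { return "Express"; }   // built with the EXPRESS symbol defined
#else
        get { return "Unknown"; }   // no edition symbol defined
#endif
    }

#if PRO
    // Pro-only feature: this method is compiled out of the Express build entirely.
    public static void ExportToExcel(string reportPath)
    {
        Console.WriteLine("Exporting to " + reportPath);
    }
#endif
}

Each build configuration ("Express Release", "Pro Release", etc.) then produces a different binary from the same codebase.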
From experience, you only need integration with Visual Studio if you need to check out the file before editing it (a la SourceSafe) and the file is read-only until then.
Having used SourceSafe I went on to using SVN and absolutely never looked back. Then I switched to git and again never looked back on SVN or Sourcesafe.
I can't comment on Team Foundation source control or Mercurial; I've never used those. At this stage I would recommend git over SVN, as it's more suited to working with a single source tree that has minor changes between lots of branches. You can do the same thing with SVN, but I found the process of switching the working copy to another branch painful.
Team Foundation Server provides the best seamless integration with VS 2008, but of course it's not free (I agree that it's very expensive).
Have you tried using AnkhSVN? It's got pretty good integration between VS 2008 and SVN. So far it gives me the VS-SVN integration that I need, so you might want to check it out and see if it fits your needs.
You can use TortoiseSVN, but I suggest installing CollabNet's SVN server, because AnkhSVN integrates seamlessly with it; plus you don't have to worry about major installations.
It's only three months until VS 2010 is in final release (March 22, 2010). For MSDN subscribers, TFS will be integrated into Visual Studio (all levels except Express). MSDN subscriptions that include Visual Studio (any level) will include TFS with a one-seat license. TFS 2010 will run on Vista or Windows 7. SharePoint is no longer required, but you still need it if you want 100% of TFS features, like reporting.
It's all available now in beta; I'm running TFS on my laptop.
When creating an auto updating feature for a .NET WinForms application, how does it update the DLLs and not affect the currently running application?
Since the application is running during the update process, won't there be a lock on the DLLs (because those DLLs will have to be overwritten during the update).
Usually you would download the new files into a separate area. Then shut down and restart, and at startup look for and use the new files if found. Always keep a last known working version on the side so that the user can revert to something that definitely works if the download causes problems.
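A rough sketch of that pattern in C# follows. The folder names are just assumptions for illustration, and the copy step only works if it runs before the assemblies being replaced are loaded - typically from a small launcher executable or very early in startup:

using System;
using System.IO;

static class UpdateApplier
{
    static readonly string AppDir = AppDomain.CurrentDomain.BaseDirectory;
    static readonly string StagingDir = Path.Combine(AppDir, "update-staging");   // where the updater downloaded to
    static readonly string BackupDir = Path.Combine(AppDir, "last-known-good");   // revert point for the user

    // Call this at startup, before the updatable assemblies are loaded.
    public static void ApplyPendingUpdate()
    {
        if (!Directory.Exists(StagingDir))
            return; // nothing was downloaded

        Directory.CreateDirectory(BackupDir);
        foreach (var staged in Directory.GetFiles(StagingDir, "*.dll"))
        {
            var name = Path.GetFileName(staged);
            var current = Path.Combine(AppDir, name);

            // Keep a copy of the current version so the user can revert if the update misbehaves.
            if (File.Exists(current))
                File.Copy(current, Path.Combine(BackupDir, name), true);

            File.Copy(staged, current, true);
        }

        Directory.Delete(StagingDir, true);
    }
}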
ClickOnce is a good technology from Microsoft that does this for you and you can use it directly from Visual Studio 2008.
You'll have to shutdown your application and restart it, as other people have already commented.
I wrote some open-source code to do just that in a transparent manner - including an external update application to do the actual cold update. See http://www.code972.com/blog/2010/08/nappupdate-application-auto-update-framework-for-dotnet/
The code is at http://github.com/synhershko/NAppUpdate (Licensed under the Apache 2.0 license)
I have a separate 'launcher' application that checks for updates via a web service. If there are updates, it downloads them and then executes my application, which is in a separate assembly.
The other alternatives are using things like ClickOnce, or downloading the files to a separate area and restarting the app, as someone else mentioned.
Be warned about ClickOnce, though - it's not as flexible as it sounds. And if you deploy to a system that requires elevating your program to a higher security level to run, you might run into problems if you don't have a certificate for your app installed. I found it very difficult to get straight answers on the Internet to things like certificate management when it comes to ClickOnce. If you have a complex app, you may want to just roll your own updater, which is what I ended up having to do.
If you publish via ClickOnce, all of that tends to be handled for you. It has its own pros and cons, but it's usually easier than trying to code it all yourself.
Both Wikipedia and 15seconds have decent info on using ClickOnce, how it works, etc.
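If you do go the ClickOnce route, you can also trigger the update check yourself from code via System.Deployment.Application (requires a reference to System.Deployment); a minimal sketch:

using System.Deployment.Application;
using System.Windows.Forms;

static class ClickOnceUpdater
{
    // Checks for a newer ClickOnce deployment, applies it, and restarts the app.
    public static void UpdateIfAvailable()
    {
        if (!ApplicationDeployment.IsNetworkDeployed)
            return; // not running as a ClickOnce deployment (e.g. started from the debugger)

        var deployment = ApplicationDeployment.CurrentDeployment;
        if (deployment.CheckForUpdate())
        {
            deployment.Update();     // downloads and installs the new version
            Application.Restart();   // the new version is picked up on restart
        }
    }
}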
As others have stated, ClickOnce isn't as flexible as rolling your own solution but it is a LOT less complicated. It has a small learning curve at first, but with pretty much everything bundled into Visual Studio and the use of Wizards, it usually doesn't take long to stumble onto a working solution.
As deployments get more complex (i.e. beyond just having prerequisites or application code that needs updating) and you need to do a lot of post-install or pre-install tasks, there are things like WiX which give you somewhat of a hybrid solution between Windows Installer and ClickOnce, with the cost of flexibility being a much steeper learning curve.
The only reason I try to avoid custom installers is that you end up spending way too much time trying to get it just right to handle a bunch of different "What If" scenarios...
These days Windows can do such updates automatically for you with App Installer, if your app is packaged as an MSIX package.
It downloads the new version of the app in another folder inside ProgramFiles\WindowsApps, then when a user runs the app via the start menu, the system knows what folder it should use. The previous version gets deleted when not in use.
If you want to know how to package your app this way I collected my findings in this answer.