Using an HL7 message to insert data into my SQL Server - sql-server

I'm using Mirth for sending and receiving HL7 messages.
Is it possible to insert custom data (char datatype) into my SQL Server database by picking up an HL7 message (from a file), mapping it to the columns of my database with a transformer, and inserting it into my database?
And is there any option for generating a status in an outbound HL7 message in Mirth?

You can use a destination connector type of Database Writer to write data from the input HL7 message to your database.
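If I remember correctly, the Database Writer lets you either write the INSERT statement directly (with ${} placeholders for variables you have mapped in a transformer step, e.g. with Mapper steps) or use JavaScript. The table and variable names below are just placeholders for illustration:
INSERT INTO Patients (Mrn, LastName, FirstName, MessageDateTime)
VALUES ('${mrn}', '${lastName}', '${firstName}', '${messageDateTime}')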
You can use a second destination connector to generate an output HL7 message based on the input.
You mentioned status -- can you be more explicit? What status, from the database call or something else?
You could add a ZZZ segment to the outbound message to hold whatever status information you need to send.
EDIT:
Here's how to use JavaScript in a transformer step to add a ZZZ segment.
createSegment('ZZZ', msg); // creates an empty ZZZ segment in the message
msg['ZZZ']['ZZZ.1']['ZZZ.1.1'] = "This is ZZZ.1"; // These are a pain to type!
msg['ZZZ']['ZZZ.2']['ZZZ.2.1'] = "Field ZZZ.2 can contain whatever you want";
msg['ZZZ']['ZZZ.3']['ZZZ.3.1'] = "such as date, time, results of database update";

The UltraPort MS SQL Schema Engine does exactly what you're looking for. That's all it does, it's very fast and very good at it, and it has a free, fully functional trial. It sets up in literally minutes, and they have really good customer service. If you call in they'll walk you through a 10-15 minute example of importing HL7 messages (and they actually encourage you to use your own HL7 data if you have any). Those 10-15 minutes will answer 90% of any questions you might ever have, and that includes downloading and installing the software.
Home Page: http://www.hermetechnz.com/EasyHL7/prod_sql.asp
Online Help: http://www.hermetechnz.com/Documentation/UltraPort/MSSQL/index.html
It stores the unparsed HL7 message, breaks it into parsed data tables, and can also (optionally) store the unparsed segments as individual rows.
Also, you can easily customize the HL7 Version definitions to include "Z" segments or custom datatypes used by specific vendors.
Hope this helps.

Related

How to write a message recovered from an MS Exchange server via JavaMail as an EML file without parsing it

I am not a programmer; I am a software solution designer. For compliance reasons I have to recover the messages received in our MS Exchange mailboxes and save them in their original form as EML files, before I can save them to our CRM database for treatment by back-office personnel.
My question is: can I read the message and write the input stream directly to the EML file without parsing it?
Our Software Architect and Judicial teams want to be sure that the contents are exactly equal to the originally received e-mails in case of a regulatory audit or investigation.
Also, can I save it to a blob-type database field and create a link that can be used to download the file?
Sorry for my lack of knowledge, but I am originally a COBOL analyst.
Thanks!
Our CRM software package uses JavaMail to send and receive e-mail messages from our MS Exchange Server, and stores them in the package database (Oracle Exadata) for issue/ticket management.
EML files are just MIME messages. You can use JavaMail to read the MIME content of a message and write it to an EML file without parsing it first; MimeMessage.writeTo(OutputStream) writes the message out as an RFC 822/MIME stream, which is exactly what an EML file contains.
However, note that Exchange does not store the message in MIME format. So even though the message may have been received in MIME format, Exchange may transform it into its own internal format and then transform it back to MIME when the message is read. Depending on your Exchange configuration, this transformation may or may not preserve the original MIME content exactly. In some cases Exchange will transform the message into a different MIME format that it thinks will be easier for the client to process. I'm not an Exchange expert, so if this concerns you, you'll need to look into the Exchange documentation in more detail.

Text message via SQL Server

We have some critical jobs in our production system. We receive emails about job status (succeeded or failed), and we also want to send text messages from SQL Server. I searched, but it doesn't look possible; has anybody tried this before?
I did this a long time ago using a third-party tool that offers exactly what you need. You can find out more about it here.
And here you can find detailed information on how to send and receive messages from SQL Server, using this tool.
The website states that you can only use this for free for the first 30 days.
Basically you will need to host their small database on your SQL Server and then insert a row into a Messages table with the message you want to send along with the phone number to which it has to be delivered.
"The SMS Messaging Server service application will pickup any newly added messages and send them out through the appropriate channel automatically." as stated in the documentation.
However, because it is a third party tool and not open source, I don't know if it will be suitable for you. I am not aware of any free tools for the same purpose.
I have found this link where it is explained how to turn emails into text messages. It has one inconvenience, however: you have to know the cellular provider for each of the phone numbers you want to message.
https://www.mssqltips.com/sqlservertip/2922/sql-server-alerts-with-text-messaging-from-sql-server-database-mail/
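That tip boils down to sending Database Mail to the carrier's email-to-SMS gateway. A minimal sketch, where the profile name and the gateway address are placeholders you would replace with your own Database Mail profile and the recipient's carrier gateway:
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'AlertsProfile',        -- an existing Database Mail profile (placeholder name)
    @recipients = '5551234567@vtext.com',   -- email-to-SMS gateway address; the domain depends on the carrier
    @subject = 'SQL Server alert',
    @body = 'Job NightlyETL failed at 02:13.';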
Best regards,

Obtaining Raw Data from NagiosXI and/or OPSview

I am currently working on completing my Master's Thesis project. In order to do so I need to be able to obtain the raw data accumulated in NagiosXI and/or Opsview. Because both of these are based on the Nagios core, I assume the method for obtaining the raw data may be similar. This raw data is needed so that I can later perform specific statistical calculations which relate to my Master's Thesis. I have looked online and so far found some Nagios plugins which obtain raw data and then manipulate it for graphs and visuals, but I need the raw numbers in order to complete my calculations.
I am also researching whether I can create a PHP script, or something in another language, that will extract the data from Nagios and save it in a Word or Excel document. However, this would be a bit of extra work as I am unfamiliar with both PHP and MySQL queries. Because of this I hope to be able to find a plugin, or something similar, that can get the data for me.
Cyanide,
I can't speak for NagiosXI, but I can for Opsview :)
You could access the data that is stored in the RRD files. You can use rrdtool dump to pull the values out or use a URL like: /rrdfetch?start=1307608993&end=1307695393&hsm=opsview%3A%3ACheck%20Loadavg%3A%3Aload1&hsm=opsview%3A%3ACheck%20Loadavg%3A%3Aload5
This returns the JSON data points. It is undocumented, but it is what powers the interactive JavaScript graphing.
Alternatively, if you have ODW enabled with full statistics, then the raw data is stored in the ODW database and you can then extract the raw data with SQL commands. See http://docs.opsview.com/doku.php?id=opsview-community:odw for more information.
Ton
You can try MK Livestatus: http://mathias-kettner.de/checkmk_livestatus.html
or http://exchange.nagios.org/directory/Addons/APIs/JSON/Nagios2JSON/details
All of these tools give you status data without needing to go to the DB or the status file. Since XI is based on Nagios, they should still work with it.
Please take a look at http://dmytro.github.com/nagira
It's a web services API to access Nagios data. You can get all hosts, service status data, and object configuration in multiple formats: JSON, XML or YAML.

BizTalk 2006 - Copy a received file to a new directory

I want to be able to copy the file I have which comes in as XML into a new folder location on the server. Essentially I want to hold a back up of the input files in a new folder.
What I have done so far is to try to follow what has been said on this forum post - link text
At first I tried the last method, which didn't do anything (file renaming while reading). So I tried one of the other options: I altered the orchestration and put a Send shape just after the Receive shape, so the same message that comes in is sent out to the logical port. I exported the MSI and created a Send Port in the Admin console which points to my copy location. It copies the file, but it continues to create a new one every second, and the Event Viewer reports warnings saying "The file exists". I have set the Copy Mode of the port to 'Overwrite' and 'Create New'; neither works.
I have looked on Google but nothing helps. BTW, I support BizTalk but I have no idea how pipelines and ports work, so any help would be appreciated.
thanks for the quick responses.
As David has suggested I want to be able to track the message off the wire before BizTalk does any processing with it.
I have tried the CodePlex link that Ben supplied and it points to 'Atomic-Scope's BizTalk Message Archiving Pipeline Component', which it looks like my client will have to pay for. I have downloaded the trial and will see if I have any luck.
David - I agree that the orchestration should represent the business flow and making a copy of a file isn't part of the business process. I just assumed when I started tinkering around I could do it myself in the orchestration as suggested on the link I posted.
I'd also rather not rely on the BizTalk tracking within the message box database as I suppose the tracked messages will need to be pruned on a regular basis. Is that correct or am I talking nonsense?
However, is there a way I can do what Atomic-Scope has done that may be cheaper?
Hi again, I have figured it out from David's original post. As indicated, I also created a Send port which just has a "Filter" expression like: BTS.ReceivePortName == ReceivePortName
Thanks all
As the post you linked to suggests there are several ways of achieving this sort of result.
The first question is: What do you need to track?
It sounds like there are two possible answers to that question in your case, which I'll address separately.
You need to track the message as received off the wire before BizTalk touches it
This scenario often arises where you need to be able to prove that your BizTalk solution is not the source of any message corruption or degradation being seen in messages.
There are two common approaches to this:
Use a pipeline component such as the one Ben Runchey suggests.
There is another example of a pipeline component for archiving here on codebetter.com. It looks good; just be careful, if you use other components and wherever you place this component, that you are still following the BizTalk streaming model's proper practices. BizTalk pipelines are all forward-only streaming, meaning that your stream is read only once, and all the work on it happens in an eventing manner.
This is a good approach, but with the following caveats:
You need to be careful about the streaming employed within the pipeline component
You are not actually tracking the on-the-wire message; what your pipeline actually sees is the message after it has gone through the BizTalk adapter (e.g. the HTTP adapter, File adapter, etc.)
Rely upon BizTalk's out of the box tracking
BizTalk automatically persists all messages to the message box database and if you turn on BizTalk tracking you can make BizTalk keep these messages around.
The main downside here is that enabling this tracking will result in some performance degradation on your server. Depending on the exact scenario this may not be a huge hit, but it can be significant.
You can track the message after it has gone through the initial receive pipeline
With this approach there are two main options: use a pure messaging send port subscribing to the receive port, or use an orchestration send port.
I personally do not like the idea of using an orchestration send port. Orchestrations are generally best used to model the business flow needed. Unless this archiving is part of the business flow as understood by standard users, it could simply confuse what does what in your solution.
The approach I tend to use is to create a messaging send port in the BizTalk admin console that subscribes to your receive port. The send port will then just use a standard BizTalk file adapter, with a pass through pipeline.
I think you should look at the BizTalk Message Archiving pipeline component. You can find it on CodePlex (http://www.codeplex.com/btsmsgarchcomp).
You will have to create a new pipeline and deploy it to your BizTalk group. Then update your receive pipeline to archive the file to a location that the host account this receive location runs under has access to.

Development standards for SQL Server supporting services?

I am trying to find some development best practices for SQL Server Reporting Services, Analysis Services and Integration Services.
Does anyone have some useful links or guidance they can offer on this subject?
I can only speak specifically to SSIS, although some of this will be applicable to the others as well.
Save your packages as files and put them in Source Control.
Where possible use variables for things that will change from server to server or run to run.
Use configuration files to save the configuration for different environments.
When processing data that comes from an outside source, assume it will change format without warning (i.e., check that the data you expect in each column is the data you actually got!). There's nothing like putting the emails in the last-name field (or, as happened to us once in DTS, the social security number in the field that said how much to pay the person; we were glad we caught that before someone got paid that amount).
Things I have seen happen include: adding new columns; removing columns that are critical to your process; rearranging the order of the columns (especially bad when the file itself does not have the column names); leaving the column titles the same but changing the data they contain (yes, once I got a file where the last-name data was in the column labelled First_name and vice versa); data with new values that don't have a match to values in your system (I'm thinking of lookup-type things here, like medical specialties); flat-out strange data such as notes in an email field; and names in this format: lastname - 'Willams, Jo', first_name - 'hn' (combine the two fields to get the whole name; apparently their data entry people just typed the name until they ran out of space and continued on in the next field no matter where they were in the name!).
Don't put uncleaned data into your database.
Always retain a copy of any files that you process or send out. Amazing how often you will need to research back.
Log errors, and log records that needed cleaning, especially if the problem in a field was such that it caused the process to fail. It is a whole lot easier to see the errors in a table than to know your 20-million-record file failed because one record had an extra | in it and then try to figure out which one it was.
If you do a lot of similar imports in SSIS, create a template project that has all the standard logging and data cleaning in it. It is a whole lot faster to start from a template, adjust to new mappings based on the new file you are working with, and make minor adjustments to things specific to that file than to rewrite every SSIS package from scratch.
Store meta data. Sooner or later you will be asked how often it failed, how soon after the file was received the import happened, or even when the last import was. All our packages start and end with a task to store start and stop times in our meta data table, and all failure paths include a task to mark the import as failed in our meta data. Eventually you can build a system that knows how many records to expect and fails the run if the new file is significantly off. Meta data can also be used to store things like the number of records, which can help identify when they sent a partial file instead of the whole file you were expecting and prevent you from blowing away 300,000 sales targets they actually still want. A rough sketch of such a meta data table is shown below.
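To make the meta data idea concrete, here is a minimal T-SQL sketch; the table and column names are made up for illustration, and in a real package the insert and update would live in separate Execute SQL Tasks, with the RunId carried in a package variable:
CREATE TABLE dbo.ImportRunLog (
    RunId       INT IDENTITY(1,1) PRIMARY KEY,
    PackageName SYSNAME       NOT NULL,
    SourceFile  NVARCHAR(260) NULL,
    StartTime   DATETIME      NOT NULL DEFAULT GETDATE(),
    EndTime     DATETIME      NULL,
    RecordCount INT           NULL,
    Status      VARCHAR(20)   NOT NULL DEFAULT 'Running'  -- e.g. 'Running', 'Succeeded', 'Failed'
);

DECLARE @RunId INT;

-- At package start: log the run
INSERT INTO dbo.ImportRunLog (PackageName, SourceFile)
VALUES ('LoadSalesTargets', 'targets_20110601.txt');
SET @RunId = SCOPE_IDENTITY();

-- At package end (a similar update on the failure path sets Status = 'Failed')
UPDATE dbo.ImportRunLog
SET EndTime = GETDATE(), RecordCount = 300000, Status = 'Succeeded'
WHERE RunId = @RunId;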
