Schedule KNIME Workflow - sql-server

I need to schedule a KNIME workflow to run daily. I wasn't able to understand or put together the steps in (knime.com/faq#q12) because I come from a business background. My environment details are:
Operating System: Windows Server 2012 R2
Database: Reading from SQL Server 2017 and inserting model output into the same database.
KNIME Version: Analytics Platform 3.5.2
The KNIME Analytics Platform is installed on the D drive.
The workflow is saved on the E drive.
Could you share the process with me in detail? Specifically:
The needed batch file, with exact commands.
Any other steps needed to run it daily.

I use the following command on Windows systems to run an exported workflow as a scheduled task.
You must put quotes around the path to your workflow and the path to your KNIME executable.
"/path/to/knime.exe" -reset -nosave -nosplash -application org.knime.product.KNIME_BATCH_APPLICATION -workflowFile="/path/to/workflowFile"

Related

Job/Transformation scheduling in Pentaho

I have Pentaho-Spoon on my Windows machine and all the transformations/jobs are stored in a Database Repository.
Now, we want to set up a scheduler for the transformations and jobs.
Being a newbie, I just know that I need to use a batch file in Windows Scheduler with the address of kitchen.bat/pan.bat and address of the job/transformation to be scheduled.
Do I need to install the Pentaho Data Integration tool on the server where the repository is located as well? And even if I do, how do I get the address from the repository?
This can be done with the help of the Windows Task Scheduler. You just need to trigger a batch file similar to the one shown in the example below:
C:\data-integration\kitchen.bat /rep:"PentahoRepository" /job:"TestJob" /dir:<directoryname> /user:<username> /pass:<password> /level:Basic
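To run that daily, the batch file can be registered with the Task Scheduler from the command line as well. This is a sketch: the task name, start time, and wrapper path are placeholders.

```shell
:: Assumes C:\Scripts\RunTestJob.bat contains the kitchen.bat line above.
schtasks /Create /SC DAILY /ST 01:00 /TN "PentahoTestJob" ^
  /TR "C:\Scripts\RunTestJob.bat"
```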

Definitive/Guaranteed Approach to Execute Batch Script Post MSI Installation using Visual Studio 2013

I've been cobbling some post installation steps but having difficulties, which I will go into. Please recommend a suitable, ideally native, process to run some custom steps post installation.
Using:
Visual Studio 2013 Ultimate
Setup:
vb.NET project & Visual Studio Installation Project
Test OS:
Win 7 x64 with Admin
Currently, the installer successfully installs the main application and extracts MS SQL Express + SQL scripts to a subdirectory. I have a batch file ("InstallSQL.bat") in the same directory, which silently installs SQL Express and then executes the SQL scripts.
So how best can I execute the "InstallSQL.bat" script from an installer Custom Action, given that Visual Studio doesn't support batch execution?
Methods I've tried:
Add cmd.exe (32-bit & 64-bit) + an Installer Custom Action to launch the script, as per this post. For some reason, cmd.exe is executed with non-administrator credentials, and SQL Setup fails.
Use a VBS script to launch the batch script. The VBS script does not run, and I get the error "A script required for this install to complete could not be run".
I am happy to consider an alternative approach to install SQL Express and run scripts, not based on a batch file. However, my custom batch file works perfectly when run manually i.e. not from the installer.
Please note, it needs to work on Windows XP and up, be location-insensitive (i.e., no static file locations) and, ideally, not use 3rd-party utilities. This last requirement is flexible.
I do that with a config file
"YourSQLExeFile" /CONFIGURATIONFILE="FullPath\YourConfigFile"
You did not specify what version of SQL Server you are using (2008/2012/2014, etc.). You can find examples with a Google search... otherwise I may be able to share what I use if you reply with the SQL version.
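As a rough sketch of what such a batch file can look like (file names and the instance name below are placeholders, and the valid ConfigurationFile.ini keys differ between SQL Server versions):

```shell
:: InstallSQL.bat -- %~dp0 expands to this script's own folder, which
:: keeps the install location-insensitive (no static paths).
"%~dp0SQLEXPR_x64_ENU.exe" /CONFIGURATIONFILE="%~dp0ConfigurationFile.ini" /IACCEPTSQLSERVERLICENSETERMS
:: Then run the schema scripts against the freshly installed instance:
sqlcmd -S .\SQLEXPRESS -E -i "%~dp0CreateSchema.sql"
```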
This is all so error-prone that your best bet is to do it as part of the configuration of the app when it first runs.
Any custom action code that runs with user credentials (typically because it is a "just me" install) will not run elevated.
Any custom action code that runs in a per machine Everyone install will run elevated with the system account, which typically is not helpful because the system account doesn't have database credentials or network credentials (if the database is on a network share it doesn't work).
If your vbscript fails you'll need to do the install with a verbose log to get diagnostics. A common error is to use the WScript object, but that's available only if you initiate the script with WSH, and Windows Installer is not WSH.
This is why it's better to do it after the install, initiated by the user rather than by an msiexec process running with the system account or impersonating the installing user.
Your best hope of doing this from the install is to write code that reads the script and configures the database, using the SqlCommand class, for example, as here:
SQL Script with C#
Run SQL script
Run a SQL Script
Alternatively, wrap the work in a small executable. This executable needs a manifest requesting administrator privilege, so when it runs it prompts for elevation. You fire off this program from your VBScript custom action. The point is that this elevation, and the fact that it's shell-executed as a separate process, gives some independence from the msiexec.exe service process that is calling your custom actions. Again, Everyone vs. Just me installs will make a difference, and you may find that your executable ends up running with the system account. The good news is that if this doesn't work from the install, you just need to arrange for it to run the first time someone runs your app, when it will be in a normal interactive user environment, and of course you can test and debug this in the normal way.

Deploying Dacpacs to an Availability Group in a locked-down production

My DBA and I are trying to work out how to effectively use Microsoft's Database projects and the Dacpacs they generate to simplify our production deployment system.
Ideally, I would be able to build and/or publish the .sqlproj, generating a .dacpac file, which can then be uploaded to the production server and used to upgrade the database from whatever version it was to the latest version. This is similar to how we're doing website deployments, where I publish to a package, and then that package is uploaded to the server and imported into IIS.
However, we can't work out how to make this work. The DBA has already created the database and added it to our Availability Groups. And every time we try to apply the Dacpac, it tries to adjust settings which it can't because of the AGs.
Nothing I've been able to do has managed to create a .dacpac file which doesn't try to impose settings on the database. The closest option I've found will exclude them when publishing, but as best as I can tell you can't publish to an inaccessible database, and only the DBA has access to the production server.
Can I actually use dacpacs this way?
There are two parts to this. Firstly, how do you stop deploying settings you don't want to deploy - can you give an example of one of the settings that doesn't apply?
For the second part where you do not have access to the SQL Server there are a few different ways to handle this:
Use an offline copy to generate the deploy script
Get the DBA to generate the deploy script
Get the DBA to deploy using the dacpac
Get read only access to the database
Option 1: "Use an offline copy to generate the deploy script"
You need to compare the dacpac to something, and if you do not have a TDS connection (default instance, default port tcp:1433) then you can use a version of the database that matches production, either through:
Use log shipping to restore a copy of production somewhere you can access it
Get a development db and production in sync, then every release goes to the dev and prod databases, ensuring that they stay in sync
The log-shipped copy is the easiest. If it is on a development server, you can normally have server-level permissions to give you access, or you can create the correct permissions at the database level but not at the production server level.
If the data is sensitive then the log-shipped copy might not be appropriate, so you could try to keep a development and production database in sync, but this is difficult and requires that the DBA be "well trained" into not running anything that isn't first run against the dev database as well.
Once you have access to a database that has exactly the same schema as the production database you can use sqlpackage.exe /action:script to generate a deploy script, in fact because it isn't the production database you can generate the script as part of your CI process :).
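A sketch of that sqlpackage.exe call (the server, database, and file names are placeholders):

```shell
:: Compare the dacpac to the log-shipped / dev copy and write the
:: deploy script out for review -- nothing is executed against the DB.
sqlpackage.exe /Action:Script ^
  /SourceFile:"MyDatabase.dacpac" ^
  /TargetServerName:"DevServer" ^
  /TargetDatabaseName:"MyDatabase" ^
  /OutputPath:"deploy.sql"
```

Because this only generates a script, it is safe to run on every CI build.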
Option 2: "Get the DBA to generate the deploy script"
This is to get the DBA to copy the dacpac to the production server and to use sqlpackage.exe, which will be in the "Program Files (x86)\Microsoft Sql Server\Version\DAC\bin" folder, to compare the dacpac to the database and generate a script that he can review before deploying.
Option 3: "Get the DBA to deploy using the dacpac"
This is similar to option 2, but instead of generating a script to review and run in SSMS, he just uses sqlpackage.exe /Action:Publish to deploy the changes directly.
Option 4: "Get read only access to the database"
This is actually my preferred option, as it means that you always build scripts against what is guaranteed to be the state of production (as it is production). In your case you would need to get the TCP port opened between your machine (or ideally your build machine) and the SQL Server, and then you will need these permissions:
https://the.agilesql.club/Blogs/Ed-Elliott/What-Permissions-Do-I-Need-To-Generate-A-Deploy-Script-With-SSDT
As I said option 4 is always my preferred but I understand that it isn't always possible.
Options 2 and 3 are fraught with worry, as you will be running scripts that haven't been tested anywhere. With options 1 and 4 you can generate the scripts and then deploy to a test/QA database, as long as they themselves have the same schema as production. The scripts can also go through a code review process.
If you do option 2/3 then I would create a batch file or PowerShell script that drives sqlpackage.exe. If they deploy from a different server that doesn't have sqlpackage.exe, you can copy the DAC folder to that machine and run sqlpackage from there; you do not have to actually install it (you may also need to copy in the Microsoft.SqlServer.TransactSql.ScriptDom.dll from the "Program Files (x86)\Microsoft Sql Server\Version\SDK\Assemblies" folder).
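For example, a one-line batch file the DBA can run from the copied DAC folder might look like this (the server and database names are placeholders):

```shell
:: %~dp0 keeps the paths relative to wherever the DAC folder was copied,
:: so nothing needs to be installed on the production server.
"%~dp0sqlpackage.exe" /Action:Publish ^
  /SourceFile:"%~dp0MyDatabase.dacpac" ^
  /TargetServerName:"ProdServer" ^
  /TargetDatabaseName:"MyDatabase"
```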
I hope this helps, if you have any more questions feel free to post here or ping me :)
ed

DB2: How to backup a DB2 database?

DB2 v10.1 database on WINDOWS 7.
Can somebody share how to create a backup of a DB2 database? I could not find detailed instructions.
Thanks in advance for any help in this matter
Have you tried looking at the documentation? Perhaps the "Data Recovery Reference"?
http://pic.dhe.ibm.com/infocenter/db2luw/v10r1/topic/com.ibm.db2.luw.admin.ha.doc/doc/c0006150.html
In a db2cmd window, type DB2 HELP BACKUP for more complete command syntax. The simplest form of the command is
DB2 BACKUP DATABASE <database name>
Optim Studio in 9.7 and 10.1, and Control Center in 9.7, have GUIs to assist with these tasks as well.
For a local backup you can use a simple command line command also provided in the other answers:
db2 backup database <name>
If you want a more automated, "enterprise" solution, then you should look into IBM Tivoli Storage Manager (TSM), for example. DB2 supports making incremental backups to networked TSM storage on the fly, without disrupting the local database; i.e., you can run queries while the backup is running.
For TSM you need log archiving enabled on the database, which you can do with the following command:
db2 update db cfg using LOGARCHMETH1 TSM
After you have enabled log archiving you can create a backup script and schedule it:
set DB2INSTANCE=DB2
"C:\IBM\ProductName\db2\BIN\db2cmd.exe" /c DB2.EXE backup db WPSDB user <DOMAINUSERNAME> using <DOMAINUSERPASSWORD> online use tsm include logs
Here's a link to a full tutorial: http://www.codeyouneed.com/db2-backup-using-tsm/
For detailed step by step guide to configure DB2 backup, you can refer:
DB2 v9.7 on AIX(x64) backup configuration for TSM v7.1
Every step, from planning and preparation to execution, is explained with diagrams.
Basic steps are:
Download the appropriate TSM API (32/64-bit, based on db2level) from Passport Advantage
Extract TSMCLI_AIX.tar
Log in as root and enter "smitty install"
Select required components:
tivoli.tsm.client.ba.64bit,
tivoli.tsm.client.api.64bit etc.
If you are not using the TSM client GUI then there is no need to install:
tivoli.tsm.client.jbb.64bit
tivoli.tsm.filepath
Now apply the steps mentioned in the link as an example to configure file-level and DB2-level backup for your environment.

How do I update SSIS package on the server

I have updated the package in BIDS 2005 (I changed the backup routine to save to a different drive) and now I'm trying to get it back onto the server (2005). I tried File > Save Copy As..., then ran the job that executes the package, and it's still saving to the old drive; thus, my package didn't get saved.
In my opinion, always create a deployment utility with your SSIS project. This is configured under the Project Properties (see below). Once you have configured the project deployment utility, go to your project, find the "bin" folder, and double-click the deployment utility. It will walk you through getting your package(s) onto the server really easily.
Good Luck!
The quick and dirty answer is to use dtutil
dtutil /file C:\Src\MyPackage.dtsx /destserver thatDatabase /COPY SQL;MyPackage
I too am a fan of the manifest files but, while probably overkill for your problem, I prefer tools that allow for unattended use. I combine the ssisdeploymanifest with a PowerShell script to handle all of my SSIS deployments.
Powershell SSIS Deployment and maintenance
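As a sketch of the unattended idea (the source folder and server name are placeholders, and this uses plain cmd with dtutil rather than the linked PowerShell approach):

```shell
:: Deploy every .dtsx package in a folder to the server's MSDB store,
:: naming each stored package after its file name (%%~nF).
for %%F in ("C:\Src\*.dtsx") do dtutil /FILE "%%F" /DestServer thatDatabase /COPY SQL;%%~nF
```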
