I am working with Spock and Groovy to test my application. I need to run a DDL script before every test.
To execute the script from Groovy I am using the following code:
def scriptToExecute = './src/test/groovy/com/sql/createTable.sql'
def sqlScriptToExecuteString = new File(scriptToExecute).text
sql.execute(sqlScriptToExecuteString)
The createTable.sql is a complex, multi-line script that performs several drop and create operations. When I try to execute it I get the following exception:
java.sql.SQLSyntaxErrorException: ORA-00911: invalid character
Note that the DDL itself is correct: it has been verified by running it directly against the same database I am connecting to from Groovy.
Any idea how to resolve the problem?
I don't think JDBC supports executing a multi-statement script in one call, but there are tools/libraries that can help; see this answer for Java.
In Groovy, using this JDBC script runner would be something like:
Connection con = ....
def runner = new ScriptRunner(con, [booleanAutoCommit], [booleanStopOnError])
def scriptFile = new File("createTable.ddl")
scriptFile.withReader { reader ->
    runner.runScript(reader)
}
Or, if your script is "simple enough" (i.e. no comments, no semicolons other than those separating statements...), you can load the text, split it on ;, and execute the statements using sql.withBatch, something like this:
def scriptText = new File("createTable.ddl").text
sql.withBatch { stmt ->
    scriptText.split(';').each { order ->
        stmt.addBatch order.trim()
    }
}
If you can't get it done in JDBC (see Hugues' answer), consider executing sqlplus from your Groovy program:
["sqlplus", CREDENTIALS, "@" + scriptToExecute].execute()
I'm trying to create a schema using the DbContext.
Database.ExecuteSqlCommand("CREATE SCHEMA #p0", <schemaNameParameter>)
It is giving the below error:
42601: Syntax error at or near $1
I tried other solutions given for similar questions about DML, but they did not work in my case.
I found a solution; it may not be very elegant, but it works:
var con = _context.Database.GetDbConnection();
con.Open();
try
{
    using (var cmd = con.CreateCommand())
    {
        // schemaName is validated with a regex beforehand (see the note below)
        cmd.CommandText = $"CREATE SCHEMA IF NOT EXISTS \"{schemaName}\"";
        await cmd.ExecuteNonQueryAsync();
    }
}
finally
{
    con.Close();
}
PS: I'm still trying to find a solution that takes a parameter rather than an inline value for the schema name. (Note: I validate the schema name with a regex before executing this command to ensure there is no injection attempt.)
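For what it's worth, identifier names (such as schema names) generally cannot be bound as query parameters, which is why the parameterized CREATE SCHEMA fails. A rough sketch of the kind of validation mentioned above, where the ValidSchemaName helper and its regex are my own assumptions rather than part of the original code:
using System.Text.RegularExpressions;

// Hypothetical helper: only allow simple identifiers before inlining them into DDL
static string ValidSchemaName(string name)
{
    if (!Regex.IsMatch(name, @"^[A-Za-z_][A-Za-z0-9_]*$"))
        throw new ArgumentException($"Invalid schema name: {name}");
    return name;
}

// Usage:
// cmd.CommandText = $"CREATE SCHEMA IF NOT EXISTS \"{ValidSchemaName(schemaName)}\"";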
I have literally no experience with VB script or C#. I created this SSIS package using an online tutorial; it serves my purpose, but I have to fine-tune it to fit my requirements.
Current Scenario:
I'm trying to run an SSIS package that has a for-each loop container importing all files with the *.txt extension from a directory, since the file names are not constant. The for-each loop container is followed by some other SQL tasks.
The package executes successfully even when there are no files in the directory (maybe I did something wrong while creating the container, the data flow tasks, or the file system tasks). As a result, the SQL script after the for-each loop container executes anyway, which produces wrong data.
Requirement:
The package should fail if there is no file in the directory. I have to implement a script task before the for-each loop container but am not sure how to do it. Any leads would be appreciated!
I did something like this, but I'm not sure how to search by extension rather than by file name:
Public Sub Main()
    '
    ' Add your code here
    '
    Dim fileName As String
    fileName = "filename.txt"
    If System.IO.File.Exists(fileName) Then
        Dts.Variables("User::bolFileExists").Value = True
    Else
        Dts.Variables("User::bolFileExists").Value = False
    End If
    Dts.TaskResult = ScriptResults.Success
End Sub
You should use the System.IO.Directory.GetFiles() function.
If System.IO.Directory.GetFiles(<your path goes here>, "*.txt", SearchOption.AllDirectories).Length = 0 Then
    Dts.Variables("User::bolFileExists").Value = False
Else
    Dts.Variables("User::bolFileExists").Value = True
End If
My suggestion would be to use the event handlers section, as I did for one of my requirements: if the underlying DFT does not run, a script on the event handler page raises an error. The point to note is that the DFT runs at least once if there is any file in the directory, so raising an error when it does not run at all is simpler than writing a complex script.
Thanks,
Srinivas
I am attempting to use multiple datasources in a Grails 2.4.4 project. According to the docs, this should be possible:
http://www.grails.org/doc/2.4.4/guide/conf.html#multipleDatasources
My primary dataSource (the one I want to use for all domain classes) is using H2 at the moment, as configured by the default DataSource.groovy configuration. My second, read-only datasource is SQL Server, and I tried to declare it as follows at the top level of my DataSource.groovy config (shared by all environments):
ds {
    pooled = true
    dialect = "org.hibernate.dialect.SQLServer2008Dialect"
    driverClassName = "net.sourceforge.jtds.jdbc.Driver"
    url = "jdbc:jtds:sqlserver://myserver:1433/mydb;domain=mydomain;useNTLMv2=true;user=myuser"
    dbCreate = "none"
}
(Don't let the URL throw you off - I'm just having to use Windows Auth with JTDS. I've tested this via third-party clients as well.)
I inject this into my service class and use it, and everything appears to hook up well:
def dataSource_ds

def serviceMethod() {
    Sql ds = new Sql(dataSource_ds)
    String query = "SELECT ... "
    def results = ds.rows(query)
    println "Results are ${results.size()}"
    return "Some value"
}
But when I try to access this from an IntegrationSpec-backed integration test, I get "schema not found" errors for valid schemas referenced in my query string, such as "dbo". The stack trace of any error from this setup looks like this:
org.h2.jdbc.JdbcSQLException: Schema "DBO" not found; SQL statement:
...
at org.h2.message.DbException.getJdbcSQLException(DbException.java:329)
at org.h2.message.DbException.get(DbException.java:169)
at org.h2.message.DbException.get(DbException.java:146)
at org.h2.command.Parser.readTableOrView(Parser.java:4774)
at org.h2.command.Parser.readTableFilter(Parser.java:1083)
at org.h2.command.Parser.parseSelectSimpleFromPart(Parser.java:1689)
at org.h2.command.Parser.parseSelectSimple(Parser.java:1796)
at org.h2.command.Parser.parseSelectSub(Parser.java:1683)
at org.h2.command.Parser.parseSelectUnion(Parser.java:1526)
at org.h2.command.Parser.parseSelect(Parser.java:1514)
at org.h2.command.Parser.parsePrepared(Parser.java:404)
at org.h2.command.Parser.parse(Parser.java:278)
at org.h2.command.Parser.parse(Parser.java:250)
at org.h2.command.Parser.prepareCommand(Parser.java:217)
at org.h2.engine.Session.prepareLocal(Session.java:414)
at org.h2.engine.Session.prepareCommand(Session.java:363)
...
Now why would THIS datasource be trying to use the H2 driver?
In case it's relevant, my Integration test looks like this:
void "serviceMethod" () {
when: "service method is called"
String response = myService.serviceMethod()
then: "we should get the appropriate text back"
response.equals("Some value")
}
If, in the service class, I hard-code the connection using a constructor of the Groovy Sql object, the integration test works fine, and any stack traces go through the JTDS driver. But when I try to use the injected datasource, things are strange.
Any idea what I'm doing wrong here?
Just to close the loop on this and hopefully save someone pain on this oversight in the future:
Grails uses an in-memory database when running tests. Make sure to read up on the other differences between integration tests and production here:
http://www.grails.org/doc/latest/guide/testing.html#integrationTesting
This behavior makes using external (read-only) datasources during tests tricky, but some of that is to be expected (a test that depends on an external datasource is not a very good test in the long run). I hope to refactor my app and its testing approach at some point (e.g., to use a simple DAO and mock it during the test), because I don't really care about asserting the contents of the external datasource from my app's tests.
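For reference, the DAO-and-mock approach might look roughly like this in Spock; the ExternalReportingDao class, its findSomething method, and the externalReportingDao property on the service are hypothetical names, not from the original code:
// Hypothetical DAO that wraps all access to the read-only SQL Server datasource
class ExternalReportingDao {
    def dataSource_ds

    List findSomething() {
        new groovy.sql.Sql(dataSource_ds).rows("SELECT ... ")
    }
}

// In the Spock spec, stub the DAO so the test never touches the external database
void "serviceMethod uses the DAO"() {
    given: "a stubbed DAO"
    myService.externalReportingDao = Stub(ExternalReportingDao) {
        findSomething() >> [[someColumn: 'someValue']]
    }

    when: "service method is called"
    String response = myService.serviceMethod()

    then: "we should get the appropriate text back"
    response == "Some value"
}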
I'm trying to run a PowerShell script from a batch job. I used the following code, which works fine in a job:
System.Diagnostics.Process process;
System.Diagnostics.ProcessStartInfo startInfo;
;
process = new System.Diagnostics.Process();
startInfo = new System.Diagnostics.ProcessStartInfo();
startInfo.set_FileName("powershell.exe");
startInfo.set_Arguments("D:\\Documents\\OP3_FTP_Upload.ps1");
startInfo.set_UseShellExecute(false);
startInfo.set_RedirectStandardError(true);
process.set_StartInfo(startInfo);
process.Start();
When I use this code in a RunBaseBatch class, I get the following errors:
Failed to request the permission of type 'InteropPermission'.
Unable to create object 'CLRObject'
So I tried the following to solve my permission problem:
Set permissionSet;
InteropPermission interopPermission;
;
interopPermission = new InteropPermission(InteropKind::ClrInterop);
permissionSet = new Set(Types::Class);
permissionSet.add(interopPermission);
CodeAccessPermission::assertMultiple(permissionSet);
...my first code example
CodeAccessPermission::revertAssert();
When I execute my batch job, I get no error message, but nothing happens. The path is correct, and so is the script (the parameters are correct for the AOS).
I think the problem is the way I use the permissionSet and interopPermission classes. I know how to use them for CRUD operations on files, but how do I use them for script execution? Can anyone explain how (if possible) to manage these classes in my case?
Any other ideas to solve my problem are welcome.
This should be enough (in the method doing the CLR calls):
new InteropPermission(InteropKind::ClrInterop).assert();
Otherwise try to debug.
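Putting it together with the original code, a minimal sketch of the method run in batch might look like this; the -ExecutionPolicy/-File arguments and the WaitForExit call are assumptions added for robustness, not part of the original code:
void runPowerShellScript()
{
    System.Diagnostics.Process process;
    System.Diagnostics.ProcessStartInfo startInfo;
    ;
    // Assert CLR interop permission in the same method that performs the CLR calls
    new InteropPermission(InteropKind::ClrInterop).assert();

    process = new System.Diagnostics.Process();
    startInfo = new System.Diagnostics.ProcessStartInfo();
    startInfo.set_FileName("powershell.exe");
    // -ExecutionPolicy Bypass and -File are assumptions; the original passed only the script path
    startInfo.set_Arguments("-ExecutionPolicy Bypass -File D:\\Documents\\OP3_FTP_Upload.ps1");
    startInfo.set_UseShellExecute(false);
    startInfo.set_RedirectStandardError(true);
    process.set_StartInfo(startInfo);
    process.Start();
    // Optionally wait so the batch task does not finish before the script does
    process.WaitForExit();

    CodeAccessPermission::revertAssert();
}
Also keep in mind that in batch this code runs on the AOS server under the AOS service account, so the script path, execution policy, and file permissions have to be valid on the server, not on your client machine.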
I should preface by saying my experience with scripting or programming in OOP languages is limited.
I'm working on a method for programmatically creating and executing SSIS packages using PowerShell. Unfortunately, most of the resources available for PowerShell and SSIS cover calling PowerShell from SSIS, not the other way around.
I have, however, found a number of resources for VB/C# for creating SSIS packages.
Example resource here.
I've succeeded in converting most of the code by calling the DTS/SSIS assemblies, but it now fails when converting the TaskHost object to a MainPipe.
Sample code:
[Void][Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.ManagedDTS')
[Void][Reflection.Assembly]::LoadWithPartialName('Microsoft.Sqlserver.DTSPipelineWrap')
# Create the Package and application, set its generic attributes
$Package = New-Object Microsoft.SqlServer.Dts.Runtime.Package
$Package.CreatorName = $CreatorName
$App = New-Object Microsoft.SqlServer.Dts.Runtime.Application
# Set connection info for our package
$SourceConn = $package.Connections.Add("OLEDB")
$SourceConn.Name = "Source Connection"
$SourceConn.set_ConnectionString("Data Source=$SourceServer;Integrated Security=True")
$TargetConn = $package.Connections.Add("OLEDB")
$TargetConn.Name = "Target Connection"
$TargetConn.set_ConnectionString("Data Source=$TargetServer;Integrated Security=True")
# Build the tasks
# Data Flow Task - actually move the table
[Microsoft.SQLServer.DTS.Runtime.Executable]$XferTask = $Package.Executables.Add("STOCK:PipelineTask")
$XferTaskTH = [Microsoft.SqlServer.Dts.Runtime.TaskHost]$XferTask
$XferTaskTH.Name = "DataFlow"
$XferTaskTH.Description = "Dataflow Task Host"
$DataPipe = [Microsoft.SQLServer.DTS.pipeline.Wrapper.MainPipeClass]($XferTaskTH.InnerObject)
Everything works fine until the last line, when I get the error:
Cannot convert the "System.__ComObject" value of type "System.__ComObject#{}" to type "Microsoft.SqlServer.Dts.Pipeline.Wrapper.MainPipeClass"
Any assistance or ideas are welcome!
Microsoft.SqlServer.DTSPipelineWrap makes heavy use of COM instances.
This forum post suggested using the CreateWrapperOfType method:
http://social.technet.microsoft.com/Forums/en-US/ITCG/thread/0f493a31-fbf0-46ac-a6a5-8a10af8822cf/
You could try this:
$DataPipe = [System.Runtime.InteropServices.Marshal]::CreateWrapperOfType($XferTaskTH.InnerObject, [Microsoft.SQLServer.DTS.pipeline.Wrapper.MainPipeClass])
This doesn't error out and produces an object, though I'm not sure of what type.
You could always just compile the working .NET version you referenced above into an exe and allow it to accept parameters as needed in order to create the SSIS packages. Then use PowerShell to call the executable with the parameters as needed.
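The call from PowerShell might then look something like this; CreateSsisPackage.exe and its parameter names are hypothetical:
# Hypothetical compiled package builder; the parameters are whatever your exe accepts
$exe = 'C:\Tools\CreateSsisPackage.exe'
& $exe -SourceServer 'SRC01' -TargetServer 'TGT01' -OutputPath 'C:\Packages\MyPackage.dtsx'

if ($LASTEXITCODE -ne 0) {
    throw "Package creation failed with exit code $LASTEXITCODE"
}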