Trouble invoking a Lua stored procedure using EVALSHA in Redis

I am trying to use Lua scripts stored in Redis as stored procedures.
I would like to be able to store these scripts in Redis once, and look them up and invoke them when needed.
I have been able to add these functions to the :function: keyspace using redis-cli, as follows,
redis-cli
> SET :function:f1 "redis.call('SELECT', 0);local data=redis.call('HGETALL','key:{'..ARGV[1]..'}'); print('f1'); print(ARGV[1]); return data;"
> SET :function:f2 "redis.call('SELECT', 0); local data=redis.call('HGETALL','key:{'..ARGV[1]..'}'); print('f2'); print(ARGV[1]); return data;"
> SET :function:f3 "redis.call('SELECT', 0);local data=redis.call('HGETALL','key:{'..ARGV[1]..'}'); print('f3'); print(ARGV[1]); return data;"
I have also been able to use the following SCRIPT LOAD command to build a script that can look up and run these stored functions,
SCRIPT LOAD "local f=loadstring(redis.call('get',':function:' .. KEYS[1]));return f()"
This SCRIPT LOAD command returns a SHA1 digest which I can then use to invoke one of these stored functions from the command line, like so,
redis-cli SCRIPT LOAD "local f=loadstring(redis.call('get',':function:' .. KEYS[1]));return f()"
#returns:
"31b98f9ad6a416c27e5af91ff4af12235d4da385"
Then I can call one of the functions from redis-cli,
redis-cli
> evalsha 31b98f9ad6a416c27e5af91ff4af12235d4da385 1 f3 1234567890
But I keep getting an error,
(error) ERR Error running script (call to f_ae7d0c88e2be3f907cc9a4f5943817bc380bf68e): #user_script:1: user_script:1: bad argument #1 to 'loadstring' (string expected, got boolean)
Any ideas or suggestions?

You'll have to mangle the KEYS or the redis. namespace.
Josiah Carlson just released a Python package for this.
See: here and here.
Josiah also added the package to PyPI.
Hope this helps, TW
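Incidentally, the error message pins down the immediate failure: Redis converts a missing key's nil reply into Lua false, so redis.call('get', ':function:' .. KEYS[1]) returned a boolean, which loadstring rejects. In other words, the :function:f3 key was not visible to the script when it ran (for example, if a different database was selected). A defensive variant of the loader (a sketch, using the same key scheme as above) turns that into a readable error:
redis-cli SCRIPT LOAD "local src = redis.call('GET', ':function:' .. KEYS[1]); if src == false then return redis.error_reply('unknown function: ' .. KEYS[1]) end; return loadstring(src)()"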

Related

SqlPackage seems to escape right square bracket ( ] ) in variable value passed to .dacpac

I'm passing a variable to my .dacpac but the text received is not what I passed. Example command:
sqlpackage /v:TextTest="abc]123" /Action:Publish /SourceFile:"my.dacpac" /TargetDatabaseName:MyDb /TargetServerName:"."
My variable $(TextTest) comes out as "abc]]123" instead of the original "abc]123".
Is there anything I can do to prevent SqlPackage from corrupting my input variables before they are passed to the .dacpac scripts?
Unfortunately, I don't think there is a good answer. This appears to be a very old bug. I'm seeing references to this issue going back 10 years.
Example A: https://web.archive.org/web/20220831180208/https://social.msdn.microsoft.com/forums/azure/en-US/f1d153c2-8f42-4148-b313-3449075c612f/sql-server-database-project-sqlcmd-variables-with-closing-square-brackets
They mention a "workaround" in the post, but they link to a Microsoft Connect issue which no longer exists and is not available on archive.org.
My best guess is that the "workaround" is to generate the deploy script rather than publishing, and then manually modify the variable value in the script...which is not really a workaround if you are working on a build/release pipeline or any sort of automation.
I tested whether calling Microsoft.SqlServer.Dac.DacServices.Publish() directly (via the dbatools PowerShell module) would make any difference, but unfortunately the problem exists there as well.
I also tested every keyboard-accessible symbol, and the right square bracket is the only character it seems to have a problem with.
Another option, though still not great, is to generate the deployment script, then execute it using SQLCMD.EXE.
So, for example, this would work:
sqlpackage /Action:Script `
/DeployScriptPath:script.sql `
/SourceFile:foobar.dacpac `
/TargetConnectionString:'Server=localhost;Database=foobar;Uid=sa;Password=yourStrong(!)Password' `
/p:CommentOutSetVarDeclarations=True
SQLCMD -S 'localhost' -d 'foobar' -U 'sa' -P 'yourStrong(!)Password' `
-i .\script.sql `
-v TextTest = "abc]123" `
-v DatabaseName = "foobar"
/p:CommentOutSetVarDeclarations=True - This setting is key; without it, the :setvar declarations baked into the generated script override whatever you pass to SQLCMD. Just make sure you supply ALL of the variables, not just the one you need, so open the file, look at what is commented out, and make sure you are supplying everything that is needed.
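For reference, with CommentOutSetVarDeclarations=True the generated script begins with the variable declarations wrapped in a comment block, roughly like this (values illustrative; the actual defaults come from the dacpac/publish profile):
/*
:setvar TextTest "abc]]123"
:setvar DatabaseName "foobar"
*/
Every :setvar in that block needs a matching -v argument on the SQLCMD call, which is why checking the full list in the generated file matters.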
It's not a great option...but it's at least scriptable and doesn't require manual intervention.

Write a File from Jenkins Groovy Script-Console

I'm trying to find a way to write some content to a file using Jenkins Groovy Script-Console.
The use-case: our CI manages some state machine using a volume shared between all the nodes (which is in turn mapped to EFS). However, following the discovery of a bug in our CI Groovy shared libs, I found that some state files had gone corrupt, and I needed to write the corrected values to them, together with fixing the bug.
I could do that over an SSH connection; however, as we're in the process of abstracting away the workers, we're trying to back off from that and manage things only from the script console and/or CI jobs.
I tried all these forms, all of which failed:
"echo 'the text' > /mnt/efs-ci-state/path/to/the-state-file.txt".execute().text
"""
cat <<<EOF > /mnt/efs-ci-state/path/to/the-state-file.txt
the text
EOF
""".execute().text
"bash -c 'echo the text > /mnt/efs-ci-state/path/to/the-state-file.txt'".execute().text
"echo 'the text' | tee /mnt/efs-ci-state/path/to/the-state-file.txt"
Can anybody show me the way to do that?
I'd also appreciate an explanation of why the forms above don't work, and/or a hint on how to execute commands that include piping and/or stdio redirection from the script console.
Thanks :)
["bash", "echo the text > /mnt/efs-ci-state/path/to/the-state-file.txt"].execute().text
or use plain groovy:
new File('/mnt/efs-ci-state/path/to/the-state-file.txt').text = "the text"
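The plain-Groovy route also covers multi-line content, which fits the corrupted-state-file use case (a sketch reusing the path from the question):
new File('/mnt/efs-ci-state/path/to/the-state-file.txt').text = '''first corrected line
second corrected line
'''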
Why the others don't work:
Options 1, 2 and 4: echo, here-docs and piping are shell features; they do nothing unless a shell is actually interpreting the string (and option 4, as written, never even calls execute()).
Option 3: String.execute() splits the command on whitespace without honoring the single quotes, so bash receives 'echo as the argument to -c, and 'echo is not a valid command.
Use an array to execute complex commands and to keep the bash invocation separate from the script part.
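You can see that splitting from the script console itself: String.execute() tokenizes the command on whitespace, ignoring quotes, just like tokenize() does (a quick illustration):
// The single-quoted bash script is broken into separate tokens:
println "bash -c 'echo the text > /tmp/out.txt'".tokenize()
// => [bash, -c, 'echo, the, text, >, /tmp/out.txt']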
I suggest you use this kind of code if you want to capture and validate stderr:
["bash", 'echo my text > /222/12345.txt'].execute().with{proc->
def out=new StringBuilder(), err=new StringBuilder()
proc.waitForProcessOutput(out, err)
assert !err.toString().trim()
return out.toString()
}

How to set the SplitMasterWorker value to false in Giraph

I'm trying to execute custom Giraph code from the Eclipse IDE, and when I run it I get: Exception in thread "main" java.lang.IllegalArgumentException: checkLocalJobRunnerConfiguration: When using LocalJobRunner, must have only one worker since only 1 task at a time!
So I want to set giraph.SplitMasterWorker=false. How and where do I set it?
Pass -ca giraph.SplitMasterWorker=false to your application as an argument.
If you are running Giraph on a single-node cluster, then passing "-ca giraph.SplitMasterWorker=false" will help. However, if you are running Giraph on a multi-node cluster, such as AWS EC2 with Hadoop 2.x, then I recommend modifying the mapred-site.xml file instead, adding a parameter such as mapred.job.tracker to it.
giraph.SplitMasterWorker=false is the variable you have to set when calling the Giraph runner; it can be passed in as a custom argument under -ca. Also, I think you are using the -w parameter; if you are running on your local machine it should not be more than 1, since there are no slave nodes to act as workers.
E.g. hadoop jar /usr/local/giraph1.0/giraph-examples/target/giraph-examples-1.1.0-for-hadoop-2.7.0-jar-with-dependencies.jar org.apache.giraph.GiraphRunner org.apache.giraph.examples.ConnectedComponentsComputation -vif org.apache.giraph.io.formats.IntIntNullTextInputFormat -vip -vof org.apache.giraph.io.formats.IdWithValueTextOutputFormat -op -w 5 -ca giraph.SplitMasterWorker=false
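Since the question is about running from Eclipse, the same settings can also be applied in code before the job is launched. A minimal sketch, assuming the Giraph 1.1.x API (method names may differ between versions):
import org.apache.giraph.conf.GiraphConfiguration;
import org.apache.giraph.job.GiraphJob;

// Hypothetical local-mode launcher, mirroring the -ca flag above.
public class LocalGiraphLauncher {
    public static void main(String[] args) throws Exception {
        GiraphConfiguration conf = new GiraphConfiguration();
        conf.setBoolean("giraph.SplitMasterWorker", false); // same as -ca giraph.SplitMasterWorker=false
        conf.setLocalTestMode(true);                        // run under LocalJobRunner
        conf.setWorkerConfiguration(1, 1, 100f);            // exactly one worker, like -w 1
        // set the computation and input/output format classes here, as in the example above
        GiraphJob job = new GiraphJob(conf, "local-test");
        System.exit(job.run(true) ? 0 : 1);
    }
}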

How to query Maya in script for supported file translator plugins?

I'm trying to specify an FBX file in MEL using the command
file -f -pmt 0 -options "v=0;" -typ "FBX" -o
On one computer this works great. On another, it fails, but DOES work if I use
-typ "Fbx"
I think I'd like to query for the supported translators in my script, then either select the correct one or report an error. Is this possible? Am I mis-diagnosing the problem?
MEL has a command called pluginInfo. You could write a simple function that returns the proper spelling based on what it reports; pluginInfo -q -version "fbxmaya"; will give you the version of the FBX plug-in. I haven't used MEL in a while, so I won't try to make this perfect, but the idea is something like: if the query against "fbxmaya" succeeds, set string $fbxType = "FBX", otherwise set it to "Fbx", then plug that variable into file -f -pmt 0 -options "v=0;" -typ $fbxType -o.
It might be a different version of FBX on the two machines. You'd have to add another line that determines the version of FBX on that particular machine and pipes in the correct spelling.
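If you want to query rather than guess, pluginInfo can also report which file translators a plug-in registered, which is exactly the spelling file -typ expects. A hedged MEL sketch (it assumes the FBX plug-in is named "fbxmaya" and that your Maya version supports the -dataFormats query flag):
// Ensure the FBX plug-in is loaded, then ask which file
// translators it registered on this machine.
if (!`pluginInfo -q -loaded "fbxmaya"`)
    loadPlugin "fbxmaya";
string $formats[] = `pluginInfo -q -dataFormats "fbxmaya"`;
string $fbxType = "";
for ($f in $formats) {
    if (tolower($f) == "fbx")
        $fbxType = $f;   // keep whichever capitalization is registered
}
if ($fbxType == "")
    error("No FBX translator found; registered formats: " + stringArrayToString($formats, ", "));
// then: file -f -pmt 0 -options "v=0;" -typ $fbxType -o <yourFile>;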

How do I test a manual check in check_mk / Nagios

My organization is using Nagios with the check_mk plugin to monitor our nodes. My question is: is it possible to run a manual check from the command line? It is important, process-wise, to be able to test a configuration change before deploying it.
For example, I've prepared a configuration change which uses the ps.perf check type to check the number of httpd processes on our web servers. The check looks like this:
checks = [
( ["web"], ALL_HOSTS, "ps.perf", "Number of httpd processes", ( "/usr/sbin/httpd", 1, 2, 80, 100 ) )
]
I would like to test this configuration change before committing and deploying it.
Is it possible to run this check via the command line, without first adding it to main.mk? I'm envisioning something like:
useful_program -H my.web.node -c ps.perf -A /usr/sbin/httpd,1,2,80,100
I don't see any way to do something like this in the check_mk documentation, but am hoping there is a way to achieve something like this.
Thanks!
That is easy to check. Just make your config changes and then run:
cmk -nv HOSTNAME
The -n flag executes all the checks without submitting the results to the monitoring core, and -v returns verbose output, so you can see the same results you would later see in the GUI.
List the check
$check_mk -L | grep ps.perf
if it listing ps.perf then run following command,
$check_mk --checks=ps.perf -I Hostname
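Putting the two answers together, a full test pass for the example rule might look like this (my.web.node is the placeholder hostname from the question; depending on the installation the binary is check_mk or cmk):
$ check_mk -L | grep ps.perf                  # confirm this installation knows the check type
$ check_mk --checks=ps.perf -I my.web.node    # re-inventory the host for that check type
$ cmk -nv my.web.node                         # dry run: execute the checks, print results, submit nothing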
