Steps to generate a Windows script to create user concurrency? - snowflake-cloud-data-platform

Question: what would be the steps to write/create a Windows (bash?) script that will allow me to create a user concurrency of 3 or 4? Would I need to write a script that modifies the config file in the .snowsql folder and then run it from snowsql? Any guidance or high-level steps would be helpful. Thank you!

Each independent invocation of snowsql creates its own connection, so you can certainly fire them up in parallel from separate CMD or PowerShell windows. If you also want to change the user or other aspects of each connection, snowsql accepts command-line parameters as overrides (such as --username) instead of relying only on the config file.
If your goal is to load-test concurrency, a common approach is to use Apache JMeter with a DB test plan built on Snowflake's JDBC driver. Alternatively, you can use a multi-threaded Python script like the one presented in this concurrency/scaling testing article.
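As a minimal sketch of that multi-threaded approach in Python (assuming snowsql is on your PATH and that a named connection is defined in the .snowsql config file; the connection name my_conn and the query are placeholders):

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    CONCURRENCY = 4  # number of simultaneous snowsql sessions
    QUERY = "select current_user(), current_session();"  # placeholder query

    def run_session(i):
        # Each invocation opens its own Snowflake connection.
        # -c selects a named connection from the .snowsql config file.
        result = subprocess.run(
            ["snowsql", "-c", "my_conn", "-q", QUERY],
            capture_output=True, text=True,
        )
        return i, result.returncode, result.stdout

    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        for i, rc, out in pool.map(run_session, range(CONCURRENCY)):
            print(f"session {i} exited with {rc}")
            print(out)

The same effect is possible from a plain .bat file with a loop of start snowsql ... commands; the Python version just makes it easier to collect output and exit codes in one place.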

Related

Passing custom parameters to docker when running Flink on Mesos/Marathon

My team is trying to set up an Apache Flink (v1.4) cluster on Mesos/Marathon. We are using the Docker image provided by Mesosphere. It works really well!
Because of a new requirement, the task managers have to be launched with extended runtime privileges. We can easily enable these runtime privileges for the app manager via the Marathon web UI. However, we cannot find a way to enable the privileges for the task managers.
In Apache Spark, we can set spark.mesos.executor.docker.parameters privileged=true in Spark's configuration file, so Spark can pass this parameter to the docker run command. I am wondering if Apache Flink allows us to pass a custom parameter to docker run when launching task managers. If not, how can we start task managers with extended runtime privileges?
Thanks
There is a new parameter, mesos.resourcemanager.tasks.container.docker.parameters, introduced in this commit, which allows passing arbitrary parameters to Docker.
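For example, in flink-conf.yaml (the value is a comma-separated list of key=value pairs handed to docker run; privileged=true here just mirrors the Spark setting above and is an assumption about your use case):

    mesos.resourcemanager.tasks.container.docker.parameters: privileged=true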
Unfortunately, this is not possible as of right now (or only for the framework scheduler as Tobi pointed out).
I went ahead and created a Jira for this feature so you can keep track/add details/contribute it yourself: https://issues.apache.org/jira/browse/FLINK-8490
You should be able to tweak the setting for the parameters in the ContainerInfo of https://github.com/mesoshq/flink-framework/blob/master/index.js to support this. I’ll eventually update the Flink version in the Docker image...

Azure Automation DSC - Permission and Module Issues

Are there any Azure Automation DSC gurus who can help with some guidance and know-how for pushing through a couple of impasses I am currently encountering?
The task at hand is: use an Azure Automation Runbook to provision a VM. That VM should immediately be associated with a DSC configuration, which will adjust Windows features and settings and install SQL Server according to a specific configuration. All tasks need to be written in PowerShell and should require no manual input via the Azure portal at any point.
At this time, the Runbook provisioning the VM is working perfectly. Associating the new node with a DSC configuration is still a manual process, which also works (with the exception of the next issue mentioned below), but it needs to be automated as well. How is this done? Via DSC resources as children of the VM resource in the ARM template?
Getting SQL Server installed is the next step. The xSQLServer DSC module seemed perfect for achieving this, but it currently has a bug in Azure Automation, which means that the xSQLServerSetup resource is not available, even when using older versions of xSQLServer. So, there appear to be two possible workarounds to this…
Workaround 1: Don't use xSQLServer; just run a PS script that is local on the newly provisioned VM to install SQL Server via a command-line installation using an INI file. The PS script to install SQL works, but only when run manually. When attempting to have DSC run this script, Azure throws an error that the script is not digitally signed. So, there appears to be a permissions scoping issue at play, and the DSC credential is not able to run the local PS script even though the local admin credential is being passed in. How does one get around this?
Workaround 2: Apparently, it is supposed to be possible to provision a VM, compile the DSC MOF locally on that machine (with the full version of xSQLServer), and then push that registration back to Azure Automation. Though it is unclear how exactly this would be done, as it appears to also require the execution of a local PS script, thus presenting the same impasse as the first workaround. Is this perhaps done via a Custom Script extension in the ARM template, or…?
I can see all of the parts in play, and I’ve found several helpful resources online that give breadcrumbs to the solution. But, the breadcrumbs are too far apart, and the proper way of wiring everything together is proving to be elusive. So, I’m here humbly asking for help and guidance in getting this worked out.
Any help would be greatly appreciated.
Thanks!
First of all, that's a lot of questions instead of one.
Unless this is some kind of homework, there is no point in installing SQL Server on a VM yourself; there are plenty of VM + SQL images in Azure, and provisioning one of those would take 5 minutes instead of 60.
"When attempting to have DSC run this script, Azure is throwing an error that the script is not digitally signed." - this means your script is not signed (it is not related to rights/permissions). Look at the execution policy: you need to set it to Unrestricted (Set-ExecutionPolicy Unrestricted) before running your script (but you don't need to, because of the first point).
You compile the MOF or upload it, and then you can "tie" a VM to that MOF; both parts can be automated with PowerShell, and there are a lot of guides on how to do that. Like this
As a general rule, use an ARM template to do the whole thing; again, there are lots of examples of how to achieve that (just browse this repo). Provisioning infrastructure with PowerShell (on Azure) is not the best way of doing things.
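To illustrate the ARM-template route for the DSC part of the question: below is a trimmed sketch of a DSC extension declared as a child of the VM resource, in the shape used by the onboarding samples in that repo. The apiVersion, typeHandlerVersion, parameter names, and settings keys are illustrative assumptions; check the repo for current values.

    {
      "type": "Microsoft.Compute/virtualMachines/extensions",
      "name": "[concat(parameters('vmName'), '/Microsoft.Powershell.DSC')]",
      "apiVersion": "2017-03-30",
      "location": "[resourceGroup().location]",
      "dependsOn": [
        "[resourceId('Microsoft.Compute/virtualMachines', parameters('vmName'))]"
      ],
      "properties": {
        "publisher": "Microsoft.Powershell",
        "type": "DSC",
        "typeHandlerVersion": "2.26",
        "autoUpgradeMinorVersion": true,
        "settings": {
          "configurationArguments": {
            "RegistrationUrl": "[parameters('registrationUrl')]",
            "NodeConfigurationName": "[parameters('nodeConfigurationName')]"
          }
        },
        "protectedSettings": {
          "configurationArguments": {
            "RegistrationKey": {
              "UserName": "notused",
              "Password": "[parameters('registrationKey')]"
            }
          }
        }
      }
    }

Registered this way, the VM onboards itself to Azure Automation DSC at deployment time, with no portal interaction.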

Need to make a .bat which logs in on several websites and checks if they are working fine or not?

I need to automate a manual process in which a user needs to log in to several websites with a specific username and password, and also check some links (shared locations). If everything is normal, the user then sends a mail to a specific group in table format.
Can anyone give me an idea of how to start the automation process? I am thinking of using .bat and .vbs.
Thanks in advance.
Here's what I would do:
Set up a headless Ubuntu machine (either on-premise or in the cloud)
Install Jenkins
Get familiar with creating Jenkins jobs and configuring them to email your group upon success or failure
Install the PhantomJS web browser on the Ubuntu box
Use the Python Selenium bindings to write a web automation script that uses PhantomJS to perform your web tests (a minimal sketch follows below)
Finally, create a Jenkins job that runs that test script and emails you the results
With that infrastructure in place, you'll have a solution for the task at hand, and a foundation on which you can build many more tests and processes.
If you have a hard requirement to use Windows and .bat files, you can do that too - just install Jenkins on a Windows machine and configure your Jenkins jobs to execute batch commands. I still recommend using Python even if running on Windows, though.
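A minimal sketch of such a test script (this assumes an older Selenium 2/3 release, where the PhantomJS driver is still available; the URL, credentials, form-field names, and the "Logout" marker are all placeholders you would adapt per site):

    from selenium import webdriver

    SITES = [
        # (login page URL, username, password) - placeholders
        ("https://example.com/login", "user1", "secret1"),
    ]

    def check_site(url, username, password):
        driver = webdriver.PhantomJS()  # headless browser, no display needed
        try:
            driver.get(url)
            # Field names are hypothetical; inspect each site's login form.
            driver.find_element_by_name("username").send_keys(username)
            driver.find_element_by_name("password").send_keys(password)
            driver.find_element_by_name("password").submit()
            # Crude health check: look for something that only appears
            # after a successful login, e.g. a logout link.
            return "Logout" in driver.page_source
        finally:
            driver.quit()

    for url, user, pwd in SITES:
        print(url, "OK" if check_site(url, user, pwd) else "FAILED")

Have the Jenkins job exit non-zero when any site fails, and Jenkins' email notification takes care of the rest.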
I think a .bat won't help you, because it can't actually interact with a browser, e.g. to enter a username and password.
I'd go with AutoIt. It's simple and powerful, and has a nice framework for browser interaction.
If you want something more complicated but really powerful (C#), check out the White framework by TestStack.

Is PAA a good candidate for automating WCM library deployment and setup in Portal?

I have created a Web Content Management library for use in WebSphere Portal. At the moment I'm using import-wcm-data to import the library; then I need to add some additional properties to 2-3 files on the server under Resource Environment Providers, and then restart particular services so those changes are detected.
Can anyone explain the benefits of using a PAA over writing a simple bash (or similar) script to automate this process?
I don't see whether I get any advantages from using a PAA; is a PAA even capable of updating properties files and restarting services?
I have been working intensively with PAA files, and I must say that it is a very stable way of deploying an app requiring multiple deployment steps and components.
It does need a startup process, but it is well worth it in a multi-server environment.
You can do all the tasks that you can do in an Ant file, as well as use the wsadmin scripting interface. I only update resource environment settings and the like in WAS, and do not touch any properties files, for that reason: all settings are stored in WAS.
In my experience, a PAA is not a good method if you're merely importing a content library.
I don't think I understand why you are doing the import manually rather than syndicating, but even if there's a good reason not to syndicate, the PAA process was too involved and required too many precursor actions (deleting libraries, removing the PAA, deploying the PAA, and then activating the portlets) to be a viable option for something as simple as importing a WCM library.
Since activating the portlets I was importing with the PAA was an extra step, I don't believe you can restart applications either.

Can we schedule Selenium test cases?

I have created Selenium WebDriver test cases and am running them with Maven.
Can we schedule Selenium test cases to run at a user-given date/time?
I googled and found a few options, like (1) creating a batch file and then adding it to the Windows scheduler, or (2) using Jenkins.
Somewhere, Quartz Scheduler was also suggested.
Is there any other, better method, or which is the best among these options?
Thanks !!!
We use Jenkins, since it offers a variety of options to connect to different source control systems and allows building your requirement in a variety of ways using scripts, Maven commands, Ant, etc. You can create dependent jobs, use a variety of plugins that add additional functionality to Jenkins, share reports, send emails... and you can schedule jobs on a variety of triggers as well, such as a point in time, a commit in your repo, or your build being deployed. I haven't used Quartz Scheduler, so no opinion on that.
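For completeness, option (1) from the question is only a few lines as well: a batch file that runs the suite, registered with the Windows scheduler. The paths, task name, and time below are placeholders:

    rem run-tests.bat - runs the Maven test suite
    cd /d C:\projects\selenium-suite
    call mvn clean test

and, from a command prompt, a one-time registration:

    schtasks /create /tn "SeleniumNightly" /tr "C:\scripts\run-tests.bat" /sc daily /st 02:00

Jenkins remains the better option once you need reporting, history, and triggers beyond a fixed time.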
