What is included in the ‘standalone’ vs ‘cluster’ Milvus deployments?
Standalone makes it sound like a single instance, but as far as I can tell it still requires deploying etcd and MinIO.
A description of embedded Milvus is given here: Using Embedded Milvus to Instantly Install and Run Milvus with Python
A summary of the differences between Embedded, Standalone, Cluster and Cloud is also given.
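For a quick feel of the embedded flavor, here is a minimal sketch based on the embedded Milvus tutorial linked above (it assumes the `milvus` and `pymilvus` pip packages; the exact API varies by version):

```python
# pip install milvus pymilvus
from milvus import default_server
from pymilvus import connections

# Start an embedded Milvus server in this process; unlike standalone or
# cluster deployments, no separate etcd or MinIO services are required.
default_server.start()

# Connect the regular client to the embedded server's port.
connections.connect(host="127.0.0.1", port=default_server.listen_port)

# ...create collections, insert vectors, and search as usual...

default_server.stop()
```

Standalone and cluster, by contrast, both deploy etcd (metadata) and MinIO (object storage) alongside Milvus; the practical difference is that standalone runs all Milvus components in a single process/container, while cluster splits them into separately scalable services.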
I am a bit confused about installing Solr 5.5. The 5.5 manual says it is not recommended to deploy the Solr WAR on any other web/application servlet container, and that Solr should be deployed as a standalone server. What does this mean? Is Solr running on port 8983 no longer running inside Jetty? Is Solr itself a web server now? How stable will it be in a production environment?
I followed all the instructions on https://cwiki.apache.org/confluence/display/solr/Installing+Solr
https://cwiki.apache.org/confluence/display/solr/Taking+Solr+to+Production
I'd appreciate it if someone could share their experience.
It's using Jetty, but it might not in the future (as there are quite a few things that get easier to implement when you don't have to consider the webapp framework around your service). That's why 5.0 dropped explicit support for running in an existing container (it can still be done, but you're on your own, and features may break). At least now the environment for the application can be assumed to be a certain container with a specific feature set.
I have started to try to use Google Cloud Datalab. While I understand it is a Beta product, I find the docs very frustrating, to say the least.
The questions here and the lack of responses, as well as the lack of new revisions or docs over the several months the project has been available, make me wonder whether there is any commitment to the product.
A good start would be a notebook that shows data ingestion from external sources into both the Datastore system and the BigQuery system. That is a common use case. I'd like to use my own data, and it would be great to have a notebook to ingest it. It seems that should be doable without huge effort. It would also get me (and others) out of this mess of trying to link the terse docs from various products and workspaces together...
...in addition to a better explanation of the GitHub connection process (prior question).
For BigQuery, see here: https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/tutorials/BigQuery/Importing%20and%20Exporting%20Data.ipynb
For GCS, see here: https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/tutorials/Storage/Storage%20Commands.ipynb
Those are the only two storage options currently supported in Datalab (which should not, in any event, be used for large-scale data transfers; these are for small-scale transfers that can fit in memory in the Datalab VM).
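As a concrete taste of the small, in-memory pattern those notebooks show, something along these lines works inside a Datalab cell (module paths changed across releases; earlier builds used `gcp.bigquery`, and the table name here is made up):

```python
import datalab.bigquery as bq

# Pull a small query result into a pandas DataFrame. This is fine for
# data that fits in the Datalab VM's memory, not for bulk transfers.
df = bq.Query('SELECT * FROM [my-project:my_dataset.my_table] LIMIT 1000').to_dataframe()
df.head()
```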
For Git support, see https://github.com/GoogleCloudPlatform/datalab/blob/master/content/datalab/intro/Using%20Datalab%20-%20Managing%20Notebooks%20with%20Git.ipynb. Note that this has nothing to do with GitHub, however.
As for the low level of activity recently, that is because we have been heads down getting ready for GCP Next (which happens this coming week). Once that is over we should be able to migrate a number of new features over to Datalab and get a new public release out soon.
Datalab isn't running on your local machine; just the presentation part is in your browser. So if you mean the browser client machine, that wouldn't be a good solution: you'd be moving data from the local machine to the VM running the Datalab Python code (and that VM has limited storage space), and then moving it again to the real destination. Instead, you should use the cloud console or (preferably) the gcloud command line on your local machine for this.
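If you want to script that last step rather than click through the console, a sketch with the Cloud Storage Python client run on your local machine would look roughly like this (an alternative to the recommended gcloud CLI; the bucket and file names are made up):

```python
# pip install google-cloud-storage
from google.cloud import storage

# Uses your local gcloud credentials; uploads go straight to GCS
# without taking a detour through the Datalab VM.
client = storage.Client()
bucket = client.bucket("my-bucket")
bucket.blob("ingest/data.csv").upload_from_filename("data.csv")
```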
I am looking for a scripting language like Lua to use with a Go application running on GAE. I found the golua and luar projects and planned to use them as the solution.
However, once I ran them in the GAE environment, I encountered:
"go-app-builder: Failed parsing input: parser: bad import "unsafe" in github.com/stevedonovan/luar/luar.go"
I was confused at first, but finally found that GAE apparently strips out the unsafe package for a reason (the sandbox can't allow arbitrary memory access). Since luar and golua need that package, I think I have to find another solution.
Is there any way to use luar and golua on GAE? If not, are there any alternative scripting languages that will run in the GAE environment?
The just-announced Managed VMs will let you run App Engine applications on Compute Engine virtual machines, with access to the full range of libraries, filesystems, and sockets. As of today (April 20th, 2014), Managed VMs are in Limited Preview, so you'll need to fill out the form here to get access.
I am about to install Solr on a production box. It will be the only Java application running, and it will be on the same box as the web server (nginx).
It seems there are two options:
1. Install Jetty separately and configure it for use with Solr
2. Set Solr's embedded Jetty server to start as a service and just use that
Is there any performance benefit in having them separate?
I am a big fan of KISS; the less setup the better.
Thanks
If you want KISS there is no question: option 2. Stick with the vanilla Solr distribution and its included Jetty.
Doing the work of installing an external servlet engine would make sense if you needed Tomcat, for example, but just to run the same thing (Jetty) that Solr already includes... no way.
Solr still ships with Jetty 6, so there could be some benefit if you can get the Solr application to run in a recent Jetty distribution. For example, you could use Jetty 9 and features like SPDY to improve your application's response times.
However, I have no idea or experience as to whether it's possible to run the Solr application in a separate servlet engine like that.
Another option for running Solr while keeping it simple is Solr-Undertow, a high-performance, small-footprint server for Solr. It is easy to use on local machines for development and also in production. It supports simple config files for running instances with different data directories, ports, and more. It can also run by pointing it at a distribution .zip file, without needing to unpack it.
(note, I am the author of Solr-Undertow)
Link here: https://github.com/bremeld/solr-undertow with releases under the "Releases" tab.
We are developing some testing infrastructure and I have hit a coder's block (lack of sleep?)... this seems like it would be a solved problem, but I haven't found what I'm looking for via Google.
I would like to automatically push builds from our CI server (TeamCity) to a number of machines (growing, but currently 30). These are several WinForms apps and a number of DLLs. Once deployed, I would like to kick off tests (NUnit, for both unit and integration tests) and report all results (back to CI? or somewhere else? not sure).
The target machines span a number of platforms (Win7, Vista, XP, Server 2k8, Server 2k3, Ubuntu, Fedora, SUSE; x64 and x86; maybe Macs down the line).
This gets me part way there (the actual push), but I can't find existing solutions for 'push starting' the tests and reporting back. So far I am thinking of combining the link (or similar) with custom code running on each client machine that watches the deploy directory, runs the tests, and reports the results; a rough sketch of that idea follows below.
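To make the per-machine agent idea concrete (Python for brevity, though a .NET service would fit our stack better; every path, command, and endpoint here is a placeholder):

```python
import subprocess
import time
import urllib.request
from pathlib import Path

DEPLOY_DIR = Path(r"C:\deploy")
MARKER = DEPLOY_DIR / "build.complete"       # written last by the push step
RESULTS = DEPLOY_DIR / "results.xml"
COLLECTOR = "http://ci-server:8080/results"  # hypothetical collector endpoint

while True:
    if MARKER.exists():
        MARKER.unlink()  # consume the marker so each build runs once
        # NUnit 2.x console runner; adjust path and switches to your install.
        subprocess.run(
            [r"C:\nunit\nunit-console.exe",
             str(DEPLOY_DIR / "Tests.dll"),
             f"/xml={RESULTS}"],
            check=False,
        )
        # Report the raw XML back to whatever collects results.
        req = urllib.request.Request(
            COLLECTOR,
            data=RESULTS.read_bytes(),
            headers={"Content-Type": "application/xml"},
        )
        urllib.request.urlopen(req)
    time.sleep(5)  # poll interval
```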
Does anyone know of existing solutions?
Links?
Done something similar and care to share?
Edit
If possible, we'd prefer .NET-based solutions, but it isn't strictly necessary. I would have tagged the question as such, but I ran out of tags :)
You could use KwateeSDCM to both push and start on all the platforms you mention, including Mac. However, you'll have to do some coding to get reports out. I'm not familiar with TeamCity, but maybe you could push a script along with your application which could then transfer the test results via FTP to a server accessible by TeamCity.
Have a look at STAF (Software Testing Automation Framework):
The Software Testing Automation Framework (STAF) is an open source, multi-platform, multi-language framework designed around the idea of reusable components, called services (such as process invocation, resource management, logging, and monitoring).
Which includes STAX:
STAX is an execution engine which can help you thoroughly automate the distribution, execution, and results analysis of your testcases.
And there's an article here:
http://agiletesting.blogspot.com/2004/12/stafstax-tutorial.html
Assuming you have the push part done already, and you don't mind using a TeamCity license, you can create a TeamCity Command Line Runner build configuration or NUnit test configuration that kicks off the tests on a properly configured agent. The build trigger for this test config would be successful completion of the application build.
So far I have ended up using a separate build step in TeamCity that executes a bat script, which in turn fires off tasks to the list of machines using PsExec. In my trial runs so far it is working OK, though I now need to parallelize the copying of build output...
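For what it's worth, the parallel fan-out could look roughly like this Python sketch (the bat-script equivalent would use `start` or PowerShell jobs; machine names, shares, and paths are all made up):

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

MACHINES = ["testbox01", "testbox02", "testbox03"]  # ...currently ~30
BUILD_OUTPUT = r"C:\ci\build-output"

def deploy_and_test(machine: str) -> int:
    # Mirror the build output to the machine's deploy share.
    subprocess.run(
        ["robocopy", BUILD_OUTPUT, rf"\\{machine}\deploy", "/MIR"],
        check=False,  # robocopy uses exit codes below 8 for success
    )
    # PsExec runs the test script remotely and waits for it to finish.
    result = subprocess.run(
        ["psexec", rf"\\{machine}", r"C:\deploy\run-tests.bat"],
        check=False,
    )
    return result.returncode

# The work is I/O bound, so a thread pool parallelizes it cheaply.
with ThreadPoolExecutor(max_workers=10) as pool:
    for machine, rc in zip(MACHINES, pool.map(deploy_and_test, MACHINES)):
        print(machine, "exit code", rc)
```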
Thanks for the input to those who have provided it.