NpgsqlFuzzyStringMatchDbFunctionsExtensions - npgsql

Can someone please tell me which version of Npgsql added NpgsqlFuzzyStringMatchDbFunctionsExtensions?
I am using Npgsql.EntityFrameworkCore.PostgreSQL 3.1.18 and don't have access to this class.
There are no results for "NpgsqlFuzzyStringMatchDbFunctionsExtensions" on Stack Overflow.
Thank you.

Fuzzy string matching exists in 3.1, but it requires referencing the Npgsql.EntityFrameworkCore.PostgreSQL.FuzzyStringMatch NuGet package (it was merged into the main package for 6.0; see this tracking issue).
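For 3.1 the plugin can be added next to the main provider; a minimal sketch (the 3.1.18 version pin mirrors the question, adjust to whatever you actually target):

```shell
# Add the fuzzy-string-match plugin alongside the main EF Core provider (versions illustrative)
dotnet add package Npgsql.EntityFrameworkCore.PostgreSQL --version 3.1.18
dotnet add package Npgsql.EntityFrameworkCore.PostgreSQL.FuzzyStringMatch --version 3.1.18
```

With the 3.x plugins, the extension also typically has to be enabled when configuring the provider, via an extension method on the Npgsql options builder; check the plugin's README for the exact call.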


How to update statsmodels to 0.13.0.dev0 version (to use OrderedModel module)?

What was I trying to do?
I was trying to analyze data using ordinal logistic regression. For that, I tried to import OrderedModel from statsmodels.miscmodels.ordinal_model, as suggested by this doc.
Then, what is the problem?
After executing the above import statement, I got the following error.
No module named 'statsmodels.miscmodels.ordinal_model'
How did I try to solve the problem?
First of all, I checked which statsmodels version I am using and found that it is the latest version available in Anaconda (0.12.1). From this doc, I gather that I need the 0.13.0.dev0 version to get the OrderedModel module, as in v0.12.1 there is no folder/file named OrderedModel. However, I cannot find any way to update statsmodels to the 0.13.0.dev0 version.
Then, my question
How can I update statsmodels to the 0.13.0.dev0 version so that I can use the OrderedModel module?
Note: I know that in Python there are other ways to do ordinal logit regression. However, I want to use statsmodels because of its nice summary of the analysis.
Thanks in advance!
You can install a recent build from the nightly wheel repository hosted on Anaconda.org.
Run pip install -i https://pypi.anaconda.org/scipy-wheels-nightly/simple statsmodels.
It looks like you will need to build from the GitHub source. See this prior related question:
How to update to the developer version of statsmodels using Conda?

F# SQL Server Type Provider .NET SDK tools not found. Windows 10

Running Windows 10, Visual Studio Community 2015, and SQL Server 2014 Express. I also have .NET 3.5, 4.0, and 4.5 installed.
My SqlDataConnection is throwing the compile-time error "The type provider 'Microsoft.FSharp...' ... Error reading schema. The .NET SDK 4.0 or 4.5 tools could not be found". Searching for solutions, I was directed to some registry keys.
In
HKLM\SOFTWARE\Wow6432Node\Microsoft\Microsoft SDKs\Windows
I have two keys: \v8.0A and \v8.1A. (I didn't have a key for any v7.x.) I got to these keys from one online answer to this issue. Each of those keys has three subkeys of the form WinSDK-NetFx...Tools: for v8.0A the ellipsis is "35" and for v8.1A it is "40", and each subkey is repeated with "-x64" and "-x86" appended.
These six keys all have string values of "InstallationFolder" (as well as "ComponentName" and "ProductVersion").
I went to the installation folders, and each one has ResGen.exe and SqlMetal.exe (which are the files other answers said to look for). So it seems I have the requisite registry keys pointing to the requisite exes.
Next to the installation folders for v7.0A and v8.1A, I also have one for v10.0A. So I tried creating some additional registry keys named v10.0A. In those v10.0A keys I tried setting InstallationFolder to the v10.0A folder, and I also tried the v8.1A folder. (One reason I tried this permutation is that the error message asks for SDK 4.0 or 4.5, whereas the v10.0A folder has NETFX 4.6 Tools. So I tried to resolve a possible inconsistency between v4.5 and v4.6 by varying the registry keys under v10.0A and using the path to v4.5 in the v8.1A folder.)
I've probably gone on too long trying to give the pertinent info. But I have the most current software installed and updated, I'm trying to follow previously given solutions, and I'm stuck. Maybe the newest versions haven't all been tied together with each other yet?
Any help much appreciated.
Edit: Doing some more web searching, I'm now finding more fixes with more details. This one in particular looks very promising:
Need clarification regarding Microsoft.FSharp.Data.TypeProviders
But I won't get to work on it until the end of the day, so while that link may be my answer, I can't confirm it for a bit.
It's worth checking that you are using the NuGet version of the F# type providers. The version bundled with the framework does not have the fix that extends the list of registry keys searched to cope with later versions of the SDK (as would be installed on Windows 10).
https://github.com/fsprojects/FSharp.Data.TypeProviders/issues/21#issuecomment-337444919
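Switching from the framework-bundled Microsoft.FSharp.Data.TypeProviders assembly to the maintained package is a one-line install; this assumes the package name from the linked repository and that nuget.exe is on the path:

```shell
# Install the maintained type providers package instead of relying on the framework assembly
nuget install FSharp.Data.TypeProviders
```

Then reference the installed assembly from the project instead of the framework one.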

What are the new changes in Apache POI 3.9 ? Memory leakage issue in 3.9?

The Apache POI 3.9 release notes say that the memory leak from creating temp files is fixed (bug 53493). But how do I take advantage of that? Are there any changes to the imported packages in 3.9 compared to 3.8? If so, what are they?
The change log for Apache POI is available online. To see the changes between 3.8 and 3.9, look between here and here.
Unless otherwise detailed in the release notes included in the download, you should be fine to just drop the new jars in place of the old ones. Make sure you really remove the old ones, though! All sorts of odd things go wrong if you have both old and new POI jars on your classpath.
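The "remove the old jars" step can be sketched as follows; the lib/ layout and jar names are illustrative, and the touch lines just simulate a project that has both versions on the classpath:

```shell
# Simulate a lib/ directory that ended up holding both old and new POI jars
mkdir -p lib
touch lib/poi-3.8.jar lib/poi-3.9.jar

# Delete every 3.8-era jar before deploying, so only the 3.9 jars remain on the classpath
rm lib/*-3.8*.jar
ls lib
```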

Jackrabbit locks up with many open ACEs

I am running into an issue where many processes block because more than 1000 access control entries are active at a time. This is a known issue in Jackrabbit; a workaround has been identified and rolled out in 2.4.1, but CQ 5.5 / CRX 2.3 uses Jackrabbit 2.4.0. Are there any workarounds available under 2.4.0?
I ran into this article, which refers to CRX 2.2: http://helpx.adobe.com/crx/kb/cacheentrycollector-cache-size-is-too-small.html
The resolution says to install CRX hotfix pack 2.2.0.56, which makes CachingEntryCollector configurable via a JVM parameter:
-Dorg.apache.jackrabbit.core.security.authorization.acl.CachingEntryCollector.maxsize=10000
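One way to apply this without touching any jars is to append the flag to the JVM options the CRX/CQ start script already exports; the variable name CQ_JVM_OPTS is an assumption here, real start scripts differ:

```shell
# Append the Jackrabbit ACL cache-size override to the existing JVM options
# (CQ_JVM_OPTS is a hypothetical variable name; use whatever your start script exports)
CQ_JVM_OPTS="${CQ_JVM_OPTS:-} -Dorg.apache.jackrabbit.core.security.authorization.acl.CachingEntryCollector.maxsize=10000"
export CQ_JVM_OPTS
echo "$CQ_JVM_OPTS"
```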
I have not been able to locate hotfix 2.2.0.56, but the fix shows up in 2.2.0.68.
This has been addressed before. The question is whether it made it into CRX 2.3. I am still digging through CQ, looking for org.apache.jackrabbit.core, to see if this fix made it into the new version.
Update:
Sadly, this change did not make it into 2.3.

Integrating Katta and Solr using SOLR-1395

I am trying to use the patch provided for the integration of Katta and Solr (SOLR-1395).
Can anybody help me figure out which versions of the Katta patch and Katta trunk to use?
Currently I have used katta-80-1.patch with the Katta 0.6 trunk.
I am unable to apply the above patch; it gives me an error.
If anyone has worked on this, please help me resolve it.
Thanks,
Vipin
The patch KATTA-80 already seems to be applied: the issue has a fix version set and its status is resolved.
So I suppose the changes should already be included in the trunk version.
Also, a comment mentions that katta-80-1.patch "works against trunk as of the date of the patch", which was way back in 2009! So I am not sure it can still be applied.
You can verify this by opening the patch and applying it manually, checking whether the files already match with no differences.
SOLR-1395 should be applicable to the trunk.
You can check the revision number in the patch file and match it against the trunk revision number.
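Checking whether an old patch still applies is safest with a dry run before touching the tree; here is a self-contained sketch using a toy file and patch (the real patch file names come from the JIRA issues above):

```shell
# Create a toy file and a unified diff that rewrites it
mkdir -p demo
printf 'hello\n' > demo/greeting.txt
printf -- '--- greeting.txt\n+++ greeting.txt\n@@ -1 +1 @@\n-hello\n+hello world\n' > demo/fix.patch

# Dry-run first: reports whether every hunk applies, without modifying anything
patch -p0 -d demo --dry-run < demo/fix.patch

# Then apply for real
patch -p0 -d demo < demo/fix.patch
cat demo/greeting.txt   # prints "hello world"
```

If the dry run reports failed hunks against the current trunk, the patch predates too many changes and has to be reworked by hand.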
