I have been following the article below to set up Solr and I am stuck at Step 1:
http://www.bugdebugzone.com/2016/11/part-2-sitecore-82-with-solr-62.html
I have installed solr-7.2.1-0 with the latest Bitnami stack ( https://bitnami.com/stack/solr ) and am now looking for the basic_configs folder, but I can't find it. The configsets folder contains only two folders:
C:\Bitnami\solr-7.2.1-0\apache-solr\server\solr\configsets
_default
sample_techproducts_configs
How can I complete Step 1 now?
The installation process will create several sub-directories under the installdir (C:\Bitnami\solr-7.2.1-0\) directory:
Servers and related tools: apache2\, mysql\, postgresql\, apache-tomcat\, etc.
Languages: php\, python\, ruby\, tcl\, etc.
Application files: apps\phpMyAdmin\, apps\drupal\, apps\joomla\, apps\solr\, etc.
Common libraries: common\
Licenses of the components included in the stack: licenses\
Application files are stored in the C:\Bitnami\solr-7.2.1-0\apps\solr\htdocs directory. The configuration file for the Apache Web server is stored in the C:\Bitnami\solr-7.2.1-0\apps\solr\conf\ directory.
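If what you need is a configset that you can copy and customize for a new core, one of the two bundled configsets can be duplicated and edited in place. This is only a sketch using the paths from the question, and the target name sitecore_configs is a made-up example:
cd C:\Bitnami\solr-7.2.1-0\apache-solr\server\solr\configsets
xcopy /E /I _default sitecore_configs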
I have deployed a ReactJS application with a Neo4j database on a CentOS 7 server. The Neo4j version is 4.4.2. The application also uses the APOC library, so I added the apoc-4.4.0.1-all.jar file to the /var/lib/neo4j/plugins directory on the server. Then I did the following:
chown neo4j:neo4j apoc-4.4.0.1-all.jar
chmod 755 apoc-4.4.0.1-all.jar
Modified the /etc/neo4j/neo4j.conf file to set:
dbms.security.procedures.whitelist=apoc.coll.*,apoc.load.*,apoc.*
dbms.security.procedures.unrestricted=apoc.*
and uncommented dbms.directories.plugins=/var/lib/neo4j/plugins
systemctl restart neo4j
After deploying the project, when I open the application in the browser and insert some values in a form, it shows the following error:
Unknown function 'apoc.map.submap' (line 3, column 14 (offset: 56)) " WHERE apoc.map.submap(properties(n), keys(obj), [], false) = obj" ^
Did I miss anything in the APOC configuration?
Try changing the plugin directory from /var/lib/neo4j/plugins to /var/lib/neo4j/graph.db/plugins, then restart the Neo4j server. You need to create the folder if it does not exist.
If you installed Neo4j using an installer rather than by downloading and extracting a zip/tar archive, then the Neo4j server looks for APOC under the default.graphdb folder.
related: APOC is only partially installing its extension in neo4j (one procedure)
The ownership of the /var/lib/neo4j/data folder has to be neo4j, not root.
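Whichever plugins directory ends up being used, one way to confirm that the APOC jar was actually picked up after the restart is to list the registered apoc functions. This is only a sketch, assuming cypher-shell is available on the server; the password is a placeholder:
cypher-shell -u neo4j -p <password> "CALL dbms.functions() YIELD name WHERE name STARTS WITH 'apoc.map' RETURN name"
If apoc.map.submap is not listed, the jar is still not being loaded from the configured plugins directory.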
I'm new to Azure DevOps pipelines.
I have three C projects in Azure Repos Git and I've configured a Linux self-hosted agent.
C_project_3 depends on .h and .a files from C_project_2, which in turn depends on .h and .a files from C_project_1.
C_project_1 also needs a non-versioned file stored on the agent in order to build.
Is it possible to configure the YAML file of each project so that the builds run in cascade, resolving the dependencies on the .h and .a files and the external file?
I have found the solution for the part of the question about C_project_1 needing a non-versioned file stored on the agent.
The sources of the project are checked out on the agent into the folder _work/1/s.
The non-versioned file must be stored there.
I found the answer here: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops
In detail:
Single repository: Your source code is checked out into a directory called s located as a subfolder of (Agent.BuildDirectory).
If (Agent.BuildDirectory) is C:\agent\_work\1, then your code is checked out to C:\agent\_work\1\s.
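In other words, the non-versioned file just has to be copied into that checkout folder on the self-hosted agent before the build runs. A rough sketch, where both paths are made-up examples that depend on where the agent and the file actually live:
# run on the Linux self-hosted agent; paths are examples only
cp /opt/build-deps/external_input.dat /home/azureagent/myagent/_work/1/s/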
In a SQL Server Data Tools (SSDT) project in Visual Studio, we have a "core" set of SQL objects that are included in each SQL project we do - kind of like a class library. We keep these "core" SQL objects in a separate Git repo, and then include them in other projects as a Git submodule.
Once the "core" submodule is linked to the main project, we include the submodule files in our .SQLPROJ file like so:
<Content Include="..\CoreSubmodule\ProjectFolder\Scripts\**\*.*">
<Link>Scripts\%(RecursiveDir)%(FileName)%(Extension)</Link>
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
This works great for regular .sql files in the project - they show up with a special icon in Visual Studio indicating that they're referenced files, and the build engine is able to resolve references/dependencies just fine. Where we're running into a snag, though, is with our Pre- and Post-Deployment scripts.
We have a series of "core" Pre- and Post-Deployment master scripts that are common among projects, which we've just introduced into our "core" submodule. Here's how the directory structure looks at a high-level:
/Scripts/
    /PostDeploy/
        _PostDeployMaster.sql
        /ReferenceData/
            ReferenceDataScript1.sql
In the above structure:
_PostDeploymentMaster.sql is a local file in the project and is set to Build Action = "PostDeploy". It has a reference to the ReferenceDataScript1.sql file.
ReferenceDataScript1.sql is a reference to a file that physically exists in the submodule directory (one level up from our project) and is set to Build Action = "None". Note that Visual Studio displays it in the /ReferenceData/ folder as a linked file.
The _PostDeploymentMaster script references other sub-scripts via a SQLCMD reference:
:r .\ReferenceData\ReferenceDataScript1.sql
go
Trying to build the project in this manner produces a SQL72001 error in Visual Studio ("The included file does not exist"). Obviously if we physically place the ReferenceDataScript1.sql file in the directory (without having a reference), it builds just fine.
Options we've explored include having a non-Build "buffer" script between the PostDeploy master and the core subscripts (same error), and having pre and post build actions set to physically copy files back and forth from the submodule to the project to satisfy the build engine (a little too hacky for our taste).
Has anyone run into this issue, or have a serviceable workaround?
We wound up working around this issue by using Peter Schott's suggested fix in the original question's comments - using relative paths back to the submodule on disk instead of the "virtual" link inside of the actual Visual Studio SQL project.
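In practice that means the SQLCMD reference in the post-deployment master points straight at the submodule path on disk, presumably something like the line below; the exact number of ..\ segments depends on where the post-deploy script sits relative to the submodule:
:r ..\..\CoreSubmodule\ProjectFolder\Scripts\ReferenceData\ReferenceDataScript1.sql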
I was searching for how to organize an SSDT project with submodules and found your question.
I got it working, but I used the "virtual" link inside the actual Visual Studio SQL project. Here are my test projects:
https://github.com/gnevashev/Core
https://github.com/gnevashev/Installation01
In .sqlproj, I added:
<Build Include="Core\**\*.sql" Exclude="Core\**\*PostDepl*.sql" />
And in the project's PostDeploy script, I added a link to the Core PostDeploy script:
:r .\Core\Script.PostDeploymentPopulateData.sql
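For the Build Include above to resolve, the Core repository presumably has to be added to the consuming project as a submodule in a Core folder, roughly like this (the folder name Core is inferred from the paths used above):
git submodule add https://github.com/gnevashev/Core Core
git submodule update --init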
I am trying to manually install the JasperReports Server community edition WAR file on a Tomcat server running on Unix. I am following the steps from the documentation, skipping the sample databases:
js-ant create-js-db
js-ant init-js-db-ce
js-ant import-minimal-ce
js-ant deploy-webapp-ce
I am able to successfully run the first two steps; however, step 3 fails. Here is the complete log: https://gist.github.com/shruti-palshikar/d3f75f1157028e963a3c
Are there any configurations that are needed prior to this build that I am missing? Any help is appreciated!
I have done the WAR installation on a Windows machine; see if this helps you.
Download the following files:
The WAR file distribution comes in a file named jasperreports-server-cp-4.5.0-bin.zip in compressed ZIP format. Download the WAR distribution file from
http://sourceforge.net/projects/jasperserver/files/JasperServer/JasperServer%204.5.0/ .
Download the JDBC driver, mysql-connector-java-5.1.18-bin.jar, from:
http://dev.mysql.com/downloads/connector/j/
After downloading the WAR file, download the Apache Tomcat installer and install it on your system.
There are two ways to install JasperReports Server, manually or via the installer; here we are installing it manually.
Before starting the installation, you need to set the JAVA_HOME environment variable on your system:
JAVA_HOME="path of JDK folder"
e.g. JAVA_HOME=C:\Program Files\jdk1.7.0_01
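For example, on Windows the variable can be set persistently from a command prompt (a sketch; adjust the path to your actual JDK location, and open a new command prompt afterwards so the change is picked up):
setx JAVA_HOME "C:\Program Files\jdk1.7.0_01"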
To install the WAR file distribution using the manual buildomatic steps:
1- If you’re using MySQL, place a MySQL JDBC driver in
<js-install>/buildomatic/conf_source/db/mysql/jdbc.
2- Start the database (MySQL) server.
3- Stop your application server (Apache Tomcat).
4- Copy the mysql_master.properties file from <js-install>/buildomatic/sample_conf to <js-install>/buildomatic and rename it to default_master.properties.
5- Edit the default_master.properties file and change the database server and application server settings to match your system (see the sample values after these steps).
6- Open a Command Prompt as Administrator on Windows, go to the buildomatic directory of the JasperReports Server distribution, and run these commands:
a:- js-ant create-js-db (Creates the JasperReports Server repository database)
b:- js-ant create-sugarcrm-db (Optional) Creates the sample databases
js-ant create-foodmart-db
c:- js-ant load-sugarcrm-db (Optional) Loads sample data into the sample databases
js-ant load-foodmart-db
js-ant update-foodmart-db
d:- js-ant init-js-db-ce
js-ant import-minimal-ce
(Initializes the jasperserver database, loads core application data. Running js-ant import-minimal-ce is mandatory. The server cannot function without this data)
e:- js-ant import-sample-data-ce (Optional) Loads the demos that use the sample data
f:- js-ant deploy-webapp-ce Configures and deploys the WAR file to Tomcat
Start the application server.
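As a sample for step 5, the settings that usually need attention in default_master.properties are the application server location and the database connection details. The exact property names can differ between JasperReports Server versions, so treat these as illustrative values only:
appServerType=tomcat7
appServerDir=C:\Apache Software Foundation\Tomcat 7
dbHost=localhost
dbUsername=root
dbPassword=password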
You can see the post Here
I'm having trouble finding documentation regarding this. After some googling I found that bin, conf, logs, temp, webapps and work are directories that should exist in CATALINA_BASE.
temp, logs, webapps, bin and work I don't have any trouble understanding.
bin, I suppose, is just another bin folder; if for some reason both CATALINA_HOME and CATALINA_BASE are in PATH, then the scripts in both folders will be available for execution.
But how about conf? Will the content of CATALINA_HOME/conf be totally ignored if CATALINA_BASE is set? Suppose I only need to customize a few config files per CATALINA_BASE instance; would I still need to keep a complete set of config files in CATALINA_BASE/conf, or could the standard config files in CATALINA_HOME/conf be shared?
And ditto for CATALINA_BASE/lib ... would this work as a "global" lib folder per instance?
You can find the answer in the Tomcat documentation:
http://tomcat.apache.org/tomcat-6.0-doc/RUNNING.txt
Advanced Configuration - Multiple Tomcat Instances
In many circumstances, it is desirable to have a single copy of a
Tomcat binary distribution shared among multiple users on the same
server. To make this possible, you can set the $CATALINA_BASE
environment variable to the directory that contains the files for your
'personal' Tomcat instance.
When you use $CATALINA_BASE, Tomcat will calculate all relative
references for files in the following directories based on the value
of $CATALINA_BASE instead of $CATALINA_HOME:
bin - Only setenv.sh (*nix), setenv.bat (windows) and tomcat-juli.jar
conf - Server configuration files (including server.xml)
logs - Log and output files
webapps - Automatically loaded web applications
work - Temporary working directories for web applications
temp - Directory used by the JVM for temporary files (java.io.tmpdir)
Note that by default Tomcat will first try to load classes and JARs
from $CATALINA_BASE/lib and then $CATALINA_HOME/lib. You can place
instance specific JARs and classes (e.g. JDBC drivers) in
$CATALINA_BASE/lib whilst keeping the standard Tomcat JARs in
$CATALINA_HOME/lib.
If you do not set $CATALINA_BASE, $CATALINA_BASE will default to the
same value as $CATALINA_HOME, which means that the same directory is
used for all relative path resolutions.
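As an illustration of that layout, a minimal per-user instance can be created by copying just the configuration files from the shared installation and pointing CATALINA_BASE at the new directory (a sketch for *nix; the paths are examples only):
export CATALINA_HOME=/opt/tomcat
export CATALINA_BASE=$HOME/tomcat-instance
# create the per-instance directories listed above
mkdir -p $CATALINA_BASE/conf $CATALINA_BASE/logs $CATALINA_BASE/temp $CATALINA_BASE/webapps $CATALINA_BASE/work
# at minimum, server.xml and web.xml need to be present in CATALINA_BASE/conf
cp $CATALINA_HOME/conf/server.xml $CATALINA_HOME/conf/web.xml $CATALINA_BASE/conf/
$CATALINA_HOME/bin/catalina.sh start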