I'm just trying to connect to a Neo4j database (running in the Desktop app) from a C# console app.
My username is 'neo4j' and my password is 'root'
Most sites say to do:
var client = new GraphClient(new Uri("http://localhost:7474"), "neo4j", "root");
client.Connect();
I then check with:
bool amIConnected = client.IsConnected; // false
I was expecting to connect and then try some Cypher queries in the code.
So it doesn't work: amIConnected is false, and I get either a parsing error or a 404. I've tried replacing http with neo4j or bolt, but no joy.
I've also tried using var client = new BoltGraphClient... with bolt in the URI, but also no joy.
I've also tried using the Driver, which may or may not be necessary (I don't know), but that didn't work either.
I've also tried tweaking the URI to localhost:7474/data/db, and I've also tried different ports.
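For reference, the Bolt variant looked roughly like this (a sketch of what I mean rather than exact code; as far as I know Bolt listens on 7687 rather than 7474, and newer Neo4jClient versions expose ConnectAsync instead of Connect):
var boltClient = new BoltGraphClient(new Uri("bolt://localhost:7687"), "neo4j", "root");
await boltClient.ConnectAsync(); // or boltClient.Connect() on older Neo4jClient versions
bool boltConnected = boltClient.IsConnected; // still false for me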
Any help, much appreciated.
Related
For my application I need an open source calendar server. After some research I selected the Bedework Server for this task. Basically, I want to use this server to handle my application's calendar events. Even though I have set up a local server using the quickstart package, I'm still a bit confused about how to use it. I can create events using its web UI, but I want to use it as a service from my server (something like a REST service). I read their documentation but could not find anything that helps. I would be really grateful if you could help me with this. Thanks in advance.
You can access the server using the CalDAV protocol. This is a standard REST protocol which specifies how you create/query/delete events and todos. It is the same protocol the Calendar or Reminders apps on OS X and iOS use to talk to the server.
The CalConnect CalDAV website is a good entry point to learn more about this.
If you are still looking into this, you can try any of the CalDAV client libraries -
CalDAV-libraries
I tried the CalDAV4j library. For all basic use cases, it works fine.
There is also a demo GitHub project using this library, developed to list the events on the server -
list-events-caldav4j-example
You can make use of ListCalendarTest.java in the project and supply the appropriate endpoints in the host configuration. For example (for Bedework) -
HttpClient httpClient = new HttpClient();
// I tried it with zimbra - but I had no luck using google calendar
httpClient.getHostConfiguration().setHost("localhost", 8080, "http");
String username = "vbede";
UsernamePasswordCredentials httpCredentials = new UsernamePasswordCredentials(username, "bedework");
...
...
CalDAVCollection collection = new CalDAVCollection("/ucaldav/user/" + username + "/calendar",
(HostConfiguration) httpClient.getHostConfiguration().clone(), new CalDAV4JMethodFactory(),
CalDAVConstants.PROC_ID_DEFAULT);
...
...
GenerateQuery gq = new GenerateQuery();
// TODO you might want to adjust the date
gq.setFilter("VEVENT [20131001T000000Z;20131010T000000Z] : STATUS!=CANCELLED");
CalendarQuery calendarQuery = gq.generate();
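If I remember correctly, the query is then run against the collection roughly like this (queryCalendars is the call the demo project uses, as far as I recall - double-check the exact signature there):
// Execute the query; each result is an iCal4j Calendar containing the matching VEVENTs
List<Calendar> calendars = collection.queryCalendars(httpClient, calendarQuery);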
I have been developing a node.js app that connects to a SQL Server database using the mssql module, but I have run into a wall.
Basically, mssql seems to have some kind of bug where it simply crashes the app if the results of a query of any kind exceed a certain number of records. Nothing too heavy: I'm talking about 50 to 100 records!
This is not query specific either. It is happening on ALL my queries, no matter what the results are.
The queries run fine if I limit them to return 10, 20, 40 records (using "SELECT TOP x ..."), but as soon as I increase the limit to a larger number of records, the app simply crashes without a single error message. No exceptions. Nothing.
The actual number of records where this starts to happen varies from query to query. It looks as if mssql has either a bug or a by-design limitation that affects the amount of data that it can return.
Am I missing something? Is there a setting I should be changing to avoid this? Alternatively, is there any other npm package that I could use to connect to SQL Server?
Needless to say, this is a show-stopper for me. Should I abandon node.js altogether?
The point is that if I cannot find a proper way to connect to SQL Server, I will not be able to use node.js for this app and will have to switch to something else.
Thank you!
UPDATE 1
Here is part of the code that is causing this issue:
// Basic modules
var express = require("express");
var bodyParser = require("body-parser");
// Custom modules
var settings = require("./lib/settings.js").GetSettings();
var app = express();
app.use(bodyParser.json());
app.use('/', express.static(__dirname + "/public"));
/***************************************************************************************************************/
// Routes
app.get("/GetBrands", function(req, res) {
    var sql = require('mssql');

    var config = {
        user: settings.DatabaseConfiguration.user,
        password: settings.DatabaseConfiguration.password,
        server: settings.DatabaseConfiguration.server,
        database: settings.DatabaseConfiguration.database
    };

    var cmd = "SELECT TOP 5 * FROM Brands WHERE Status = 'a'";

    var connection = new sql.Connection(config, function(err) {
        // ... error checks
        if (err) {
            console.log(err);
        }

        // Query
        var request = new sql.Request(connection); // or: var request = connection.request();
        request.verbose = true;

        request.query(cmd, function(err, recordset) {
            // ... error checks
            if (err) {
                console.log(err);
            }
            console.log(recordset);
            connection.close();
        });
    });
});
/***************************************************************************************************************/
// Enable the port listening.
app.listen(process.env.PORT || 8050);
If I change the SQL statement that says "SELECT TOP 5 * ..." to a bigger number, like 60, 80 or 100, the app crashes. Also, the response is simply the name of each brand and an ID - nothing too complicated or heavy.
UPDATE 2:
These are the steps I am following which always crash the app:
Run the app by typing in command-line: node app.js
In a web browser, go to http://localhost:8050/GetBrands. The very first time, I get the results just fine. No crashes.
Run it a second time. The app crashes. Every time.
I also discovered something else. I am using WebStorm for editing the code. If I start the debugger from there, I get no crashes and no issues whatsoever - the app works just as it should. It only crashes when I run it directly from the command line, or from WebStorm without the debugger listening... how crazy is this?
I tried applying the same command-line parameters that the WebStorm debugger uses but it made no difference.
I hope somebody can shed some light soon, because I am very close to dropping node.js altogether for this project thanks to this.
I am OK with switching to a different SQL Server npm package, but which one then? I already tried mssql, node-sqlserver-unofficial and tedious; they all have the same issue, so I am guessing it is a problem with TDS.
By using the streaming interface, you can reduce your overhead significantly and allow for better handling of very large query results. I use the streaming interface myself, for example piping to export files (CSV and XML).
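For what it's worth, a minimal sketch of a streaming variant of the route in the question might look like the following, reusing the same config object and the mssql Connection/Request API shown above (the stream flag and the row/error/done events are from my memory of the mssql documentation for that version, so treat this as an illustration rather than drop-in code):
var sql = require('mssql');

var connection = new sql.Connection(config, function (err) {
    if (err) { console.log(err); return; }

    var request = new sql.Request(connection);
    request.stream = true; // emit rows as events instead of buffering the whole recordset in memory

    request.on('row', function (row) {
        // handle one row at a time, e.g. write it to the HTTP response or pipe it to a file
    });

    request.on('error', function (err) {
        console.log(err);
    });

    request.on('done', function () {
        // all rows have been delivered
        connection.close();
    });

    request.query("SELECT * FROM Brands WHERE Status = 'a'");
});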
With a relatively simple test, I'm able to crash node itself by loading a very large array with 36-character strings (generated with uuid.v4()); it happens for me at around 1 GB of use. My guess is there's a hard limit on the number of references allowed in a running instance.
It used to work for me following this tutorial. It stopped working for a couple of days, worked again, and has now completely stopped working. I desperately need a program to help me manage my CloudSQL database and have tried multiple times, following multiple tutorials and multiple setups of phpMyAdmin, with no luck. I then tried phpMiniAdmin, SQL Buddy, and Adminer. I also tried this tutorial with no luck. To be clear, I am able to access the database using the following code in my own PHP scripts:
new mysqli('localhost','<username>','<password>','<db>',0,'/cloudsql/<app-id>:db');
It seems like the main problem is something to do with "socket" vs "host" in the Google App Engine setup. My "app.yaml" matches the tutorial, excluding references to my app, and my "config.inc.php" is the following:
<?php
$cfg['blowfish_secret'] = '<#>';
$i = 0;
$host = '/cloudsql/<app-id>:db';
$type = 'socket';
$i++;
$cfg['Servers'][$i]['auth_type'] = 'cookie';
$cfg['Servers'][$i]['socket'] = $host;
$cfg['Servers'][$i]['connect_type'] = $type;
$cfg['Servers'][$i]['compress'] = false;
$cfg['Servers'][$i]['extension'] = 'mysqli';
$cfg['Servers'][$i]['AllowNoPassword'] = true;
$cfg['McryptDisableWarning'] = true;
$cfg['PmaNoRelation_DisableWarning'] = true;
$cfg['ExecTimeLimit'] = 60;
$cfg['CheckConfigurationPermissions'] = false;
$cfg['UploadDir'] = '';
$cfg['SaveDir'] = '';
?>
I'm currently using 4.0.10, but tried 4.1.7 with the same results. I am greeted by a blank screen; there is page source, but the screen is blank. In the App Engine error logs I don't have any errors, just warnings referencing the inability to find images:
Static file referenced by handler not found: phpMyAdmin/pma_logo.png
I would love to know if anyone is currently using phpMyAdmin or any other PHP software to access CloudSQL; if so, maybe they have a tip or two as to what is going on. I'm willing to share anything relevant as well - just let me know what and I'll post it.
EDIT: Got it working. I'm not sure why it initially stopped working, started working, and then stopped working again. During my trial and error trying to get it to work, I started using different tools, including Codenvy as opposed to the Google App Engine client for Windows. I went back to using the GAE Windows client and was finally successful in accessing phpMyAdmin on CloudSQL. I don't know for sure, but I suppose it's possible that the strange errors I struggled with during some of my testing were related to using Codenvy rather than Google's own tools. I'm not saying Codenvy was the problem, just that if a person is having phpMyAdmin/CloudSQL issues, they might consider trying the GAE client before pulling all their hair out.
The problem seemed to resolve itself. It's possible Google was doing some maintenance, I have no other explanation.
I am using Play framework 1.2.7, the gae module 1.6.0 and the siena module 2.0.7 (also tested 2.0.6). This is a simple project that should run on Play, deployed on App Engine, and connect to a MySQL database in Google Cloud SQL. My project runs fine locally but fails to connect to the database in production. Looking at the logs, it looks like it is using the postgresql driver instead of the mysql one.
Application.conf
# db=mem
db.url=jdbc:google:mysql://PROJECT_ID:sienatest/sienatest
db.driver=com.mysql.jdbc.GoogleDriver
db.user=root
db.pass=root
This is the crash stack trace
play.Logger niceThrowable: Cannot connected to the database : null
java.lang.NullPointerException
at com.google.appengine.runtime.Request.process-a3b6145d1dbbd04d(Request.java)
at java.util.Hashtable.put(Hashtable.java:432)
at java.util.Properties.setProperty(Properties.java:161)
at org.postgresql.Driver.loadDefaultProperties(Driver.java:121)
at org.postgresql.Driver.access$000(Driver.java:47)
at org.postgresql.Driver$1.run(Driver.java:88)
at java.security.AccessController.doPrivileged(AccessController.java:63)
at org.postgresql.Driver.getDefaultProperties(Driver.java:85)
at org.postgresql.Driver.connect(Driver.java:231)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at play.modules.siena.GoogleSqlDBPlugin.onApplicationStart(GoogleSqlDBPlugin.java:103)
at play.plugins.PluginCollection.onApplicationStart(PluginCollection.java:525)
at play.Play.start(Play.java:533)
at play.Play.init(Play.java:305)
What is going on here? I am specifying the correct driver and URL scheme, and yet it's using the postgresql driver. Google Cloud SQL API access is enabled, the app is allowed to connect to the MySQL instance, I am not using db=mem, ... I am stuck and can't figure out how to move forward! :-((
UPDATE: I thought I had found the solution, but that was not the case. If I keep the %prod. prefix and create a war normally (or just don't define any DB properties), then the application uses the Google DataStore instead of Cloud SQL. If I create the war file adding --%prod at the end (or just delete the %prod. prefix in application.conf), then it keeps failing to connect to the database, showing the same initial error.
Any ideas please?
After being stuck for so long on this I just found the solution in no time after posting the question. Quite stupid actually.
The production environment properties in the application.conf file must be prefixed with %prod., so the database config should read:
%prod.db.url=jdbc:google:mysql://PROJECT_ID:sienatest/sienatest
%prod.db.driver=com.mysql.jdbc.GoogleDriver
%prod.db.user=root
%prod.db.pass=root
And everything runs fine.
EDIT: This is NOT the solution. The problem went away, but the app is using the DataStore instead of the Cloud SQL
In the end I made a slight modification to the Play siena module source code and recompiled it.
In case anyone is interested, you will need to remove, comment out, or catch the exception around this code at line 97 of the GoogleSqlDBPlugin class:
// Try the connection
Connection fake = null;
try {
    if (p.getProperty("db.user") == null) {
        fake = DriverManager.getConnection(p.getProperty("db.url"));
    } else {
        fake = DriverManager.getConnection(p.getProperty("db.url"), p.getProperty("db.user"), p.getProperty("db.pass"));
    }
} finally {
    if (fake != null) {
        fake.close();
    }
}
For some reason the connection fails when initiated with DriverManager.getConnection(), but it works when initiated with basicDatasource.getConnection(), which is apparently what the module uses in the rest of the code. So if you delete the above block and recompile the module, everything will work as expected. If you are compiling with JDK 7, you will also need to implement public Logger getParentLogger() throws SQLFeatureNotSupportedException in the ProxyDriver inner class at the end of the GoogleSqlDBPlugin file.
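A minimal stub for that method (getParentLogger was added to java.sql.Driver in JDK 7, and a driver that does not use java.util.logging is expected to throw) would be something along these lines:
// Required by java.sql.Driver as of JDK 7; this proxy driver does not expose a java.util.logging logger
public java.util.logging.Logger getParentLogger() throws SQLFeatureNotSupportedException {
    throw new SQLFeatureNotSupportedException();
}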
Strangely, I dug into DriverManager.getConnection() and it looks like some postgresql driver gets registered somehow, because otherwise I can't see why DriverManager.getConnection() would end up calling org.postgresql.Driver.connect().
I am using ReportViewer to show reports in my Windows WPF application using .NET 4.0. The reports are deployed on a separate SSRS 2008 report server, not the local machine. Right now, I am passing the server credentials in the following manner:
string userName = configClient.Settings.Get("UserName").Value.ValueXml.InnerText;
string password = configClient.Settings.Get("Password").Value.ValueXml.InnerText;
string domain = configClient.Settings.Get("Domain").Value.ValueXml.InnerText;
IReportServerCredentials irsc = new ReportViewerCredentials(userName, password, domain);
_reportViewer.ServerReport.ReportServerCredentials.NetworkCredentials = irsc.NetworkCredentials;
Also, I am using the following settings with the ReportViewer if it is of any use:
_reportViewer.ProcessingMode = ProcessingMode.Remote;
_reportViewer.ShowParameterPrompts = false;
_reportViewer.ServerReport.ReportServerUrl = new Uri(Properties.Settings.Default.ReportServer);
_reportViewer.ServerReport.ReportPath = Properties.Settings.Default.Reports;
I am using the config file to save and retrieve the credentials for the server access, but I do not think this is a secure way of doing it. I would like to implement this in a secure way where I do not need to take the credentials from the user or from the config file. Both the local machine and the server would be on the same network.
I am not sure how to do it. Can this be done through impersonation? I am just guessing, as I do not have much of an idea about security and impersonation. Also, if it can be done, could I get a sample, or maybe a link to an article, through which I can get this done?
The core idea is to avoid storing the username and password on the client. I searched for a solution, but what I found was very vague.
Please do not close this thread as an open-ended question; it is important for me, as I am approaching a deadline and working on too many things at once. Sorry for any inconvenience caused.
I have found a way to not pass the credentials at all and still be able to retrieve the reports from the Reports Server, but not without help.
I configured a new role assignment in Report Manager using the URL http://localhost/Reports/Pages/Folder.aspx. Go to the Properties tab, add a New Role Assignment for Everyone, and grant it the required rights.
This way, I do not need to pass any credentials from the client for the ReportViewer. Alternatively, the same approach can be used to configure access for selected users only.
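As an aside, if the client machines and the report server are on the same domain, I believe you could also avoid stored credentials entirely and hand the viewer the caller's Windows identity, something along these lines (an untested sketch; CredentialCache.DefaultNetworkCredentials is the standard .NET way to get the current account):
// Use the logged-in Windows account instead of values read from the config file
_reportViewer.ServerReport.ReportServerCredentials.NetworkCredentials =
    System.Net.CredentialCache.DefaultNetworkCredentials;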