I might be missing something, but there doesn't seem to be anything about this in the official documentation, so I'm stuck asking here.
I'm trying to set up some stuff asynchronously (connections to databases, etc) when the application starts. I want this to complete before the app becomes ready to accept connections (for obvious reasons).
Is there a way to do this in Express 4.x?
Here is an example of basically what I want; however, it's unsupported and four years out of date at this point.
Any help is appreciated.
Edit: I should point out that I used express-generator to set up my application, so all the server listening is done inside bin/www. Should I just modify this file, or can I control it from app.js?
For example you can use promises, something like this:
var express = require('express');
var app = express();
var mongoose = require('mongoose');
var Q = require('q');

// Create a deferred that is resolved once the async setup is done
var d = Q.defer();

mongoose.connect('mongodb://localhost/db', function () {
    d.resolve();
});

// Only start listening once the connection is ready
d.promise.then(function () {
    app.listen(8080);
});
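If you generated the project with express-generator, the same idea can be split between app.js and bin/www. Here is a rough sketch, assuming you hang the setup promise off the app object (the ready property is just an illustrative name, not anything Express provides):

// app.js - sketch only
var express = require('express');
var mongoose = require('mongoose');
var Q = require('q');

var app = express();
var d = Q.defer();

mongoose.connect('mongodb://localhost/db', function (err) {
    if (err) { return d.reject(err); }
    d.resolve();
});

// "ready" is an arbitrary property name, not an Express API
app.ready = d.promise;

module.exports = app;

// bin/www - only the listen call changes
var app = require('../app');
var http = require('http');
var server = http.createServer(app);

app.ready.then(function () {
    server.listen(process.env.PORT || 3000);
}, function (err) {
    console.error('Startup failed:', err);
    process.exit(1);
});

That way all the async setup stays in app.js, and bin/www only decides when to call listen.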
I am trying to make a Facebook-like "link sharing" module, i.e. when anyone includes a link while writing a new post, it will automatically show some basic data from that website, like Facebook does...
I tried simple scraping using $http.get, and it works only if I install a CORS extension in Google Chrome, so the main issue with this approach is making it work without any plugin...
I also tried adding headers in the config file, but still no luck.
$httpProvider.defaults.headers.common = {};
$httpProvider.defaults.headers.post = {};
$httpProvider.defaults.headers.put = {};
$httpProvider.defaults.headers.patch = {};
$httpProvider.defaults.useXDomain = true;
delete $httpProvider.defaults.headers.common['X-Requested-With'];
Please share the best approach to build this feature, or let me know if there is any way I can solve the CORS issue.
Thanks
Zeshan
This is not possible. CORS exists for a reason: to STOP you from accessing HTTP resources from other domains without those other domains explicitly allowing you to.
Again: this is not possible due to security restrictions imposed by browsers.
The only way you can accomplish this, and the way Facebook does it, is to move those cross-domain requests to a server, where there are no cross-domain restrictions.
So use $http.post('/some-script-on-my-server'), where that script does the actual HTTP request for the remote page, scrapes the necessary information, and returns it to the browser.
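A minimal sketch of such a server-side proxy, assuming an Express backend with the request and cheerio packages (the /scrape route and all names here are illustrative, not anything from the question):

// server.js - hypothetical scraping proxy
var express = require('express');
var request = require('request');   // performs the server-side HTTP request
var cheerio = require('cheerio');   // parses the returned HTML

var app = express();

app.get('/scrape', function (req, res) {
    var url = req.query.url;
    request(url, function (err, response, body) {
        if (err) { return res.status(500).send(err.message); }
        var $ = cheerio.load(body);
        // Pull a few Open Graph tags as "link preview" data
        res.json({
            title: $('meta[property="og:title"]').attr('content') || $('title').text(),
            description: $('meta[property="og:description"]').attr('content'),
            image: $('meta[property="og:image"]').attr('content')
        });
    });
});

app.listen(3000);

The Angular client then calls $http.get('/scrape?url=' + encodeURIComponent(link)) against its own origin, so no CORS headers are needed.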
There is a workaround for this if you want a browser-only JavaScript solution without configuring anything on the server (maybe useful in some particular situations) that "avoids" CORS.
You could use YQL (Yahoo Query Language). This way you only have to play around in their console a little with the URL you need to scrape, and then use the query URL they provide as the URL for your request.
For example (extracted from YQL website):
select * from html where url='http://finance.yahoo.com/q?s=yhoo' and xpath='//div[@id="yfi_headlines"]/div[2]/ul/li/a'
This gets the headlines from Yahoo Finance, and you also get a query URL that you can use in your request:
https://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20html%20where%20url%3D'http%3A%2F%2Ffinance.yahoo.com%2Fq%3Fs%3Dyhoo'%20and%20xpath%3D'%2F%2Fdiv%5B%40id%3D%22yfi_headlines%22%5D%2Fdiv%5B2%5D%2Ful%2Fli%2Fa'&format=json&diagnostics=true&callback=
They have other examples and how to integrate it in their documentation.
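For instance, from the AngularJS side the query URL above could be consumed roughly like this (a sketch; the query.results shape follows YQL's documented JSON envelope, but check the actual response for your query):

// Somewhere in an Angular controller or service with $http injected
var yqlUrl = "https://query.yahooapis.com/v1/public/yql" +
    "?q=" + encodeURIComponent("select * from html where url='http://finance.yahoo.com/q?s=yhoo'" +
    " and xpath='//div[@id=\"yfi_headlines\"]/div[2]/ul/li/a'") +
    "&format=json";

$http.get(yqlUrl).then(function (response) {
    // YQL wraps the scraped nodes under query.results
    console.log(response.data.query.results);
}, function (err) {
    console.log('YQL request failed', err);
});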
You don't need to configure anything on the server side, but of course every request has to go through Yahoo's servers, which is not optimal, and performance is directly affected...
Still, in some particular situations (dev, tests, etc.) this can be useful, and it is worth a try.
I have a node.js server and I am using socket.io for real-time communication between the server and clients. I have observed that if a mobile client (using the Ionic Framework) disconnects suddenly, without letting the server know about it, the sockets stay alive for hours (or forever). I have read through their documentation, and they have options like
pingInterval, pingTimeout, heartbeat interval, heartbeat timeout, close timeout.
How do I configure these values on my server?
Which of these values have been deprecated?
Here is my code.
var express = require('express');
var app = express();
var server = require('http').createServer(app);
var io = require('socket.io').listen(server);
io.set('heartbeat interval', 5000);
io.set('heartbeat timeout', 8000);
io.set('timeout', 5000);
io.on('connection', function(socket){ ... });
None of these seem to work. Any help or guidance is highly appreciated.
PS: I splice sockets out of my collection when a client disconnects, and it
works just fine when clients tell the server that they want to disconnect
gracefully.
You have to use pingTimeout as mentioned here:
https://github.com/Automattic/socket.io/issues/1900
Also make sure to set your options like the following, since io.set is gone.
var io = require('socket.io')(server, {
    pingTimeout: 5000
});
More at: http://socket.io/docs/migrating-from-0-9/#configuration-differences
However, if this doesn't work, chances are Ionic is actually keeping the connection alive in the background. From a project around a year ago, I remember having multiple issues like this and ended up making Ionic disconnect the socket forcefully when the app went into the background.
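A minimal sketch of that forced disconnect, assuming a Cordova/Ionic client where socket is the connected socket.io-client instance (pause and resume are the standard Cordova lifecycle events):

// Register after the 'deviceready' event has fired
document.addEventListener('pause', function () {
    // App went to the background: close the socket explicitly
    socket.disconnect();
}, false);

document.addEventListener('resume', function () {
    // App came back to the foreground: reconnect
    socket.connect();
}, false);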
For my application I need to use an open-source calendar server. After some research I selected the Bedework server for this task. Basically, what I want is to use this server to handle my application's calendar events. Even though I have set up a local server using the quickstart package, I'm still confused about how to use it. I can create events using its web UI, but I want to use it as a service from my server (something like a REST service). I read their documentation but could not find anything that helps. I would be really grateful if you could help me with this. Thanks in advance.
You can access the server using the CalDAV protocol. This is a standard REST protocol which specifies how you create/query/delete events and todos. It is the same protocol the Calendar or Reminders apps on OS X and iOS use to talk to the server.
The CalConnect CalDAV website is a good entry point to learn more about this.
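To give an idea of what that looks like on the wire, here is a rough sketch of a CalDAV calendar-query REPORT issued from Node. The host, port, path and credentials here are assumptions matching the Bedework quickstart defaults used in the next answer; the XML body follows RFC 4791:

var http = require('http');

// RFC 4791 calendar-query asking for all VEVENTs in the collection
var body =
    '<?xml version="1.0" encoding="utf-8"?>' +
    '<C:calendar-query xmlns:D="DAV:" xmlns:C="urn:ietf:params:xml:ns:caldav">' +
    '  <D:prop><C:calendar-data/></D:prop>' +
    '  <C:filter><C:comp-filter name="VCALENDAR">' +
    '    <C:comp-filter name="VEVENT"/>' +
    '  </C:comp-filter></C:filter>' +
    '</C:calendar-query>';

var req = http.request({
    host: 'localhost',
    port: 8080,
    path: '/ucaldav/user/vbede/calendar',
    method: 'REPORT',
    auth: 'vbede:bedework',
    headers: { 'Content-Type': 'application/xml', 'Depth': '1' }
}, function (res) {
    var data = '';
    res.on('data', function (chunk) { data += chunk; });
    // The response is a multistatus XML document containing iCalendar data
    res.on('end', function () { console.log(data); });
});
req.write(body);
req.end();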
If you are still looking into this, you can try using any of the CalDAV client libraries -
CalDAV-libraries
I tried the CalDAV4j library. For all basic use cases, it works fine.
There is also a demo GitHub project for this library that lists the events on the server -
list-events-caldav4j-example
You can make use of ListCalendarTest.java in the project and point the host configuration at the appropriate endpoints. For example (for Bedework) -
HttpClient httpClient = new HttpClient();
// I tried it with zimbra - but I had no luck using google calendar
httpClient.getHostConfiguration().setHost("localhost", 8080, "http");
String username = "vbede";
UsernamePasswordCredentials httpCredentials = new UsernamePasswordCredentials(username, "bedework");
...
...
CalDAVCollection collection = new CalDAVCollection("/ucaldav/user/" + username + "/calendar",
(HostConfiguration) httpClient.getHostConfiguration().clone(), new CalDAV4JMethodFactory(),
CalDAVConstants.PROC_ID_DEFAULT);
...
...
GenerateQuery gq = new GenerateQuery();
// TODO you might want to adjust the date
gq.setFilter("VEVENT [20131001T000000Z;20131010T000000Z] : STATUS!=CANCELLED");
CalendarQuery calendarQuery = gq.generate();
I have been developing a node.js app that connects to a SQL Server database using the mssql module but I have run into a wall.
Basically, mssql seems to have some kind of bug where it simply crashes the app if the results of a query of any kind returns a certain number of records. Nothing too heavy. I'm talking about 50 to 100 records!
This is not query specific either. It is happening on ALL my queries, no matter what the results are.
The queries run fine if I limit them to return 10, 20, 40 records (using "SELECT TOP x ..."), but as soon as I increase the limit to a larger number of records, the app simply crashes without a single error message. No exceptions. Nothing.
The actual number of records where this starts to happen varies from query to query. It looks as if mssql has either a bug or a by-design limitation that affects the amount of data that it can return.
Am I missing something? Is there a setting I should be changing to avoid this? Alternatively, is there any other npm package that I could use to connect to SQL Server?
Needless to say, this is a show-stopper for me. Should I abandon node.js altogether?
The point is that if I cannot find a proper way to connect to SQL Server, I will not be able to use node.js for this app and will have to switch to something else.
Thank you!
UPDATE 1
Here is part of the code that is causing this issue:
// Basic modules
var express = require("express");
var bodyParser = require("body-parser");
// Custom modules
var settings = require("./lib/settings.js").GetSettings();
var app = express();
app.use(bodyParser.json());
app.use('/', express.static(__dirname + "/public"));
/***************************************************************************************************************/
// Routes
app.get("/GetBrands", function(req, res) {
var sql = require('mssql');
var config = {
user: settings.DatabaseConfiguration.user,
password: settings.DatabaseConfiguration.password,
server: settings.DatabaseConfiguration.server,
database: settings.DatabaseConfiguration.database
};
var cmd = "SELECT TOP 5 * FROM Brands WHERE Status = 'a'";
var connection = new sql.Connection(config, function(err) {
// ... error checks
if (err) {
console.log(err);
}
// Query
var request = new sql.Request(connection); // or: var request = connection.request();
request.verbose = true;
request.query(cmd, function(err, recordset) {
// ... error checks
if (err) {
console.log(err);
}
console.log(recordset);
connection.close();
});
});
});
/***************************************************************************************************************/
// Enable the port listening.
app.listen(process.env.PORT || 8050);
If I change the SQL statement that says "SELECT TOP 5 * ..." to a bigger number, like 60, 80 or 100, the app crashes. Also, the response is simply the name of each brand and an ID. Nothing too complicated or heavy.
UPDATE 2:
These are the steps I am following which always crash the app:
Run the app by typing in command-line: node app.js
In a web browser, go to http://localhost:8050/GetBrands. The very first time, I get the results just fine. No crashes.
Run it a second time. The app crashes. Every time.
I also discovered something else. I am using WebStorm to edit the code. If I start the debugger from there, I get no crashes, no issues whatsoever. The app works just as it should. It only crashes when I run it directly from the command line, or from WebStorm without the debugger attached... how crazy is this?
I tried applying the same command-line parameters that the WebStorm debugger uses but it made no difference.
I hope somebody can shed some light on this soon, because I am very close to dropping node.js altogether for this project because of it.
I am OK with switching to a different SQL Server npm package, but which one then? I already tried mssql, node-sqlserver-unofficial and tedious; they all have the same issue, so I am guessing that it is a problem with TDS.
By using the streaming interface, you can reduce your overhead significantly and handle very large query results much better. I use the streaming interface, for example, with piping to export files (CSV and XML).
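A minimal sketch of that streaming mode, adapted to the /GetBrands route above (based on the mssql package's documented stream option; event details may differ slightly between versions):

// Inside the route handler, instead of buffering the whole recordset
var request = new sql.Request(connection);
request.stream = true;              // emit rows as events instead of one big array
request.query("SELECT * FROM Brands WHERE Status = 'a'");

var rows = [];

request.on('row', function (row) {
    rows.push(row);                 // or write each row straight to the response/file
});

request.on('error', function (err) {
    console.log(err);
});

request.on('done', function () {
    res.json(rows);
    connection.close();
});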
With a relatively simple test, I'm able to crash node itself by loading a very large array with 36-character strings (generated with uuid.v4()); it happens for me at around 1 GB of memory use. My guess is there's a hard limit on the number of references allowed in a running instance.
I have previously used tools such as Google Website Optimizer to run multivariate or A/B tests. However, right now I am looking for a solution that works for a larger site (400,000-500,000 unique visitors per month) with a very locked-down source code environment. Basically:
The site is balanced over several servers
All code that is to be released on any of those servers must go through version control, unit testing and acceptance testing. All releases must be signed off by a developer, a sysadmin and a test executive.
This means that I am not allowed to (or at least it's hard to) add "new code" (even if it's tested and verified) via Google Website Optimizer or any other GUI-paste-your-new-variation-here type of solution.
We can, however, decide on the server side which users get which variation. Basically, we can push the new version to X of the servers, making 10-30% of the users see it for their entire session. The question is: which tools do we use to measure "success" (i.e. improved conversion rate)? My ideas so far have been:
Tag the new version in Google Analytics using a session variable (and then build reports based on that segment), similar to what is described at http://searchengineland.com/how-to-analyze-ab-tests-using-google-analytics-67404
Use Optimizely, which has API support:
window.optimizely = window.optimizely || [];
window.optimizely.push(['bucketUser', EXPERIMENT_ID, VARIATION_ID])
What solutions have you tried for locked-down environments? Am I missing some obvious solution?
The site is in .NET/Episerver on IIS.
Regards,
Niklas
You could use the A/B testing capability built into EPiServer CMO.
We ended up going with Google Analytics, adding a session variable such as "abtest" with the value "variation-4" and publishing it on certain nodes. It worked fairly well, with some limitations, namely that Google Analytics funnels don't have segment support.
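For reference, here is a sketch of what that tagging could look like with classic Google Analytics (ga.js). The "abtest"/"variation-4" names come from the description above; the slot number and the session scope (2) are illustrative choices, and the property ID is a placeholder:

// Rendered by the server only on the nodes serving the new variation
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
// slot 1, name "abtest", value "variation-4", scope 2 = session
_gaq.push(['_setCustomVar', 1, 'abtest', 'variation-4', 2]);
_gaq.push(['_trackPageview']);

On Universal Analytics the equivalent would be a session-scoped custom dimension, which is roughly what the next answer does with gtag.js.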
We did something similar, and we found the Google Analytics documentation confusing. In the end, the following code (generated by the server) got the job done for us:
<script>
window.dataLayer = window.dataLayer || [];
function gtag(){dataLayer.push(arguments);}
gtag('js', new Date());
gtag('config', 'UA-xxxx-xxxx', {
'custom_map': {'dimension1': 'abTestDesign'}
});
gtag('event', 'abTestDesign_dimension', {'abTestDesign': 0, 'non_interaction': true});
</script>
This code is generated by the server, where the last JS line is either the one above or:
gtag('event', 'abTestDesign_dimension', {'abTestDesign': 1, 'non_interaction': true});
It seems to be working fairly well on Numbeo.com.