Geographic Map Hitcounter - maps

I am looking for a standalone solution similar to clustermaps, etc.
I want to be able to plot visits to my page by geographic location, much like the map overview statistics in Google Analytics.
My application already makes use of the free MaxMind GeoLite database, but I have been unable to find examples of using the MaxMind results to plot coordinates on a map.
PHP is not my strong point either, so ideally I am looking for example code or a pre-made solution.
Everything I have found so far, like clustermaps, requires you to link back to their own service; I need a 100% standalone solution.
Does anybody have some example code that uses MaxMind to plot visits on a map or something similar?
Thanks in advance.

The exact solution will depend on the projection[1] of the map you choose: Mercator vs. Robinson will require different formulas. Mercator is incredibly common, so it's a good place to start, and searching around turns up plenty of posts on the web about turning latitude and longitude into x and y for exactly this purpose (a small PHP sketch follows the footnote below).
The easiest method is probably some sort of embed; you'll save yourself a lot of math, as well as potentially having to buy a map image. Google Maps lets you embed a map in your own site and still add points as required.
[1] Projection - The earth is roundish, maps are flat. The various projections are different attempts to represent this round earth on a flat surface.
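
For the hit-counter case, here is a minimal PHP sketch of that conversion, assuming the latitude/longitude come from the MaxMind GeoLite lookup the question already has, and that the map image covers the full (square) Web-Mercator world; the file names and marker style are placeholders:

    <?php
    // Minimal sketch: convert a latitude/longitude pair (e.g. from the MaxMind
    // GeoLite lookup mentioned in the question) into pixel coordinates on a
    // Mercator map image, then draw a dot with GD. File names are placeholders.

    function mercatorToPixels($lat, $lng, $mapWidth, $mapHeight)
    {
        // x is linear in longitude
        $x = ($lng + 180.0) / 360.0 * $mapWidth;

        // y uses the Mercator stretch; assumes the image covers the full
        // Web-Mercator world (square, roughly lat -85 .. +85)
        $latRad = deg2rad($lat);
        $mercN  = log(tan(M_PI / 4 + $latRad / 2));
        $y      = $mapHeight / 2 - $mapWidth * $mercN / (2 * M_PI);

        return array((int) round($x), (int) round($y));
    }

    // Plot one visit on world_mercator.png (placeholder map image)
    $map = imagecreatefrompng('world_mercator.png');
    list($x, $y) = mercatorToPixels(52.37, 4.90, imagesx($map), imagesy($map));
    $red = imagecolorallocate($map, 255, 0, 0);
    imagefilledellipse($map, $x, $y, 6, 6, $red);
    imagepng($map, 'visits.png');
    imagedestroy($map);

If your map image uses a simple equirectangular projection instead, the y coordinate is linear in latitude, so the Mercator stretch in the middle of the function can be dropped.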

Related

How do apps like Mapbox, AllTrails and Maps.me use and display ALL of the OSM data, when all the resources say it's a huge amount of data?

I started exploring Overpass Turbo and Mapbox with hopes of building my travel app. I can query some data in OT and get towns or islands, no problem, I understand the whole process of querying and exporting as Geojson.
But for learning purposes, I always do queries within a small area so I don't get too much data back.
Also, various resources mention that OSM data for the whole planet is huge. For example, https://wiki.openstreetmap.org/wiki/Downloading_data says: "The entire planet is a huge amount of data. Start with a regional extract to make sure your setup works properly. Common tools like Osmosis or various import tools for database imports and converters take hours or days to import data, depending largely on disk speed."
But when I go to apps like AllTrails, Maps.me or Mapbox, they seem to be showing a huge amount of data, definitely the main POIs.
Here's an example screenshot from AllTrails.
Can someone briefly explain how this is done, then? Do they actually download all of the data, or fetch it little by little depending on the current bounding box? Any info I can research further would be appreciated!
Thanks
P.S. I am hoping to build my app with Node.js, if that makes a difference.
Several reasons:
They don't always display everything. You will only ever see a limited region, never the whole world in full detail. If you zoom in, you will see a smaller region but with more detail. If you zoom out, you will see a larger region but with reduced detail (fewer or no POIs, smaller roads and waterways disappear, etc.). See the tile-coordinate sketch after this answer for how a viewer requests only what is inside the current bounding box.
They don't contain all the available data. OSM data is very diverse: it includes roads, buildings, landuse, addresses, POI information and much more, and for each of these elements there is additional information available. Roads, for instance, have maxspeed information, lane count, surface information, whether they are lit and whether they have sidewalks or cycleways; buildings may have information about the number of building levels, the building color, roof shape and color, and so on. Not all of this information is required for the apps you listed, so it can be removed from the data.
They perform simplifications. It isn't always necessary to show roads, buildings, waterways and landuse in full detail. Instead, special algorithms reduce the polygon count so that the data becomes smaller while keeping sufficient details. This is often coupled with the zoom level, i.e. roads and lakes will become less detailed if zoomed out.
They never ship the whole world offline. Depending on the app, the map is available either online or offline, or both. If online, the server has to store the huge amount of data, not the client device. If offline, the map is split into smaller regions that the client can handle. This usually means that a given map only covers a state, a region or sometimes a city, but rarely a whole country (except for smaller countries). If you want to store whole countries offline you will need a significant amount of storage.
They never access OSM directly. Apps and websites that display OSM maps don't obtain this information live from OSM. Instead, they either keep their own local database containing the required data, which is periodically updated from the main OSM database via planet dumps, or they use a third-party map provider (such as Mapbox in your screenshot) to display a base map with their own layers on top. In the latter case they don't have to store much information on their servers, just the things they want to show on top of OSM.
None of the above is specifically for OSM. You will find similar mechanisms in other map apps and for other map sources.
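
To make the zoom/bounding-box point concrete, here is a small sketch of the standard "slippy map" XYZ tile scheme that most viewers and tile servers use (written in PHP here only for consistency with the rest of the page; the arithmetic is identical in Node.js). The tile URL is a placeholder:

    <?php
    // Sketch of the standard "slippy map" XYZ tile scheme: the viewer only
    // requests the tiles covering the current bounding box at the current
    // zoom level, never the whole planet. The tile URL below is a placeholder.

    function lonLatToTile($lon, $lat, $zoom)
    {
        $n      = pow(2, $zoom);
        $latRad = deg2rad($lat);

        $x = (int) floor(($lon + 180.0) / 360.0 * $n);
        $y = (int) floor((1.0 - log(tan($latRad) + 1.0 / cos($latRad)) / M_PI) / 2.0 * $n);

        return array($x, $y);
    }

    // Tiles needed for a small bounding box around central Amsterdam at zoom 14
    list($x1, $y1) = lonLatToTile(4.85, 52.40, 14); // north-west corner
    list($x2, $y2) = lonLatToTile(4.95, 52.35, 14); // south-east corner

    for ($x = $x1; $x <= $x2; $x++) {
        for ($y = $y1; $y <= $y2; $y++) {
            // e.g. https://tiles.example.com/14/{x}/{y}.png (placeholder host)
            echo "need tile 14/$x/$y\n";
        }
    }

At zoom 14 the whole world is 2^14 x 2^14 tiles, but a city-sized viewport only needs a few dozen of them, which is why clients never have to touch the full planet file.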

What is the best approach to build a Google App Engine store locator type of application?

I am trying to build a store-locator type of application that will automatically display the nearby stores as markers on a Google Map. Because of the limitations of GQL, the traditional way of doing geo queries is not possible. I came across three options and want to ask if anyone has experience with them and which one works better. Thanks!
Geomodel
Mutiny project
Geodatastore
I don't have any experience doing geo queries, but it is always helpful to use the activity meters that Google Code has.
At a glance it looks like the first one is your best bet.
The Search API now supports Geosearch.
It can retrieve results within a given radius and sort them by distance, so it should work for what you want to do.
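This is just a conceptual note rather than App Engine code, but if it helps to see what "within a given radius, sorted by distance" boils down to, here is a plain PHP sketch using the haversine great-circle distance; the store list, coordinates and radius are made up for illustration:

    <?php
    // Conceptual sketch (plain PHP, not the App Engine Search API): filter
    // points to a radius around the user and sort them by great-circle
    // (haversine) distance. The store data below is made up for illustration.

    function haversineKm($lat1, $lon1, $lat2, $lon2)
    {
        $r    = 6371.0; // mean Earth radius in km
        $dLat = deg2rad($lat2 - $lat1);
        $dLon = deg2rad($lon2 - $lon1);
        $a    = pow(sin($dLat / 2), 2)
              + cos(deg2rad($lat1)) * cos(deg2rad($lat2)) * pow(sin($dLon / 2), 2);
        return 2 * $r * asin(sqrt($a));
    }

    $user   = array('lat' => 40.7580, 'lng' => -73.9855);
    $stores = array(
        array('name' => 'Store A', 'lat' => 40.7484, 'lng' => -73.9857),
        array('name' => 'Store B', 'lat' => 40.6892, 'lng' => -74.0445),
        array('name' => 'Store C', 'lat' => 41.0000, 'lng' => -75.0000),
    );

    $radiusKm = 10;
    $nearby   = array();
    foreach ($stores as $store) {
        $d = haversineKm($user['lat'], $user['lng'], $store['lat'], $store['lng']);
        if ($d <= $radiusKm) {
            $store['distance_km'] = $d;
            $nearby[]             = $store;
        }
    }

    usort($nearby, function ($a, $b) {
        if ($a['distance_km'] == $b['distance_km']) return 0;
        return ($a['distance_km'] < $b['distance_km']) ? -1 : 1;
    });

    foreach ($nearby as $store) {
        printf("%s: %.1f km\n", $store['name'], $store['distance_km']);
    }

The options listed in the question exist precisely because the datastore can't run this kind of filter-and-sort efficiently over large sets; the Search API's geosearch does it for you on the server side.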

Finding lat/lng based on postal code

I have a dataset of thousands of full business addresses (specifically in the Netherlands, but I guess the question applies everywhere).
I want to find the lat/lng so I can do distance calculations, but because of the size of the dataset I'm worried it's not a wise idea to do this using, for example, Google Maps.
Is there a webservice I could query to find all this info?
The Google Geocoder web service is available for this:
http://code.google.com/apis/maps/documentation/geocoding/index.html
It's free (unless you abuse it, or volumes get too big), and returns JSON or XML.
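For illustration, a short PHP sketch of calling that web service and pulling the lat/lng out of the JSON response (newer versions of the service require an API key and enforce usage limits, so check the current terms before batch-geocoding thousands of addresses):

    <?php
    // Sketch: geocode one address with the Google Geocoding web service and
    // read the lat/lng out of the JSON response. Replace the address with an
    // address or postcode from your own dataset.

    $address = '11 W 53rd St New York';
    $url     = 'http://maps.googleapis.com/maps/api/geocode/json?address='
             . urlencode($address) . '&sensor=false';

    $response = json_decode(file_get_contents($url), true);

    if ($response !== null && $response['status'] === 'OK') {
        $location = $response['results'][0]['geometry']['location'];
        printf("lat: %f, lng: %f\n", $location['lat'], $location['lng']);
    } else {
        echo "Geocoding failed\n";
    }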
I've been using Google, but it misses many (Scandinavian) addresses which are caught by Yahoo. See http://developer.yahoo.com/maps/rest/V1/geocode.html and at least compare the two for your needs. If I were you, I would have every address Google misses geocoded by Yahoo as a fallback (or the other way around).
Accurate postcode information is owned by someone in most jurisdictions and they charge for supplying the lat/lng information. In the UK it is the Post Office, I don't know about the Netherlands, but this looks quite promising. Even Google's geocoder is not that accurate outside the US.
One thing I should mention is that the lat/lng will not be sufficient for you to calculate distances (unless you are going everywhere by crow). One of the real advantages of Google's service is that GDirections uses knowledge of the road system and estimates journey time. If you are solving some sort of travelling salesman problem, lat/lng alone is not going to give you a very good estimate of actual distance, especially in cities.
HTH
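
To make the "by crow" point concrete: the GDirections class mentioned above also has a REST counterpart, the Directions web service, which can be called server-side. A rough PHP sketch (again, newer versions require an API key; the origin and destination are just examples):

    <?php
    // Sketch: ask the Directions web service (the REST counterpart of the
    // GDirections JavaScript class mentioned above) for road distance and
    // travel time between two points. Origin and destination are examples.

    $url = 'http://maps.googleapis.com/maps/api/directions/json'
         . '?origin=' . urlencode('Amsterdam')
         . '&destination=' . urlencode('Rotterdam')
         . '&sensor=false';

    $response = json_decode(file_get_contents($url), true);

    if ($response !== null && $response['status'] === 'OK') {
        $leg = $response['routes'][0]['legs'][0];
        echo "Road distance: " . $leg['distance']['text'] . "\n";
        echo "Travel time:   " . $leg['duration']['text'] . "\n";
    }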
Not sure of the quality/accuracy of the geocoding, but this could be an option: http://www.opengeocoding.org/geocoding/geocod.html

Where can I find a city/neighborhood database?

Where can I find a database of cities and neighborhoods using MySQL? I'm only interested in US areas. Price doesn't matter.
The database must help identify locations by ZIP code. I've already got a database showing cities and states, but I need to find surrounding neighborhoods as well.
I saw a good example on http://www.oodle.com/.
The Zillow Neighborhood data has a CC-sharealike license and it is pretty comprehensive. It is widely used in the Geospatial world nowadays.
Cheers
For a fee... you can subscribe to Maponics' Neighborhood dataset
While Maponics provides mostly GIS data (e.g. allowing one to pinpoint the boundaries of neighborhoods on a map), I think the simple neighborhood list is also available.
Another commercial offering is Urban Mapping's
If you target particular cities/counties, there are plenty of free resources to be found, often on .gov / .us sites for specific cities and counties. Unfortunately, aside from the difficulty of locating such resources (there doesn't seem to be any practical directory of these local government-managed databases), there is no standard for the format in which the data is stored or for the specific semantics of the data collected. Luckily, the ZIP code is rather unambiguous, and the neighborhood concept is relatively general (even though the neighborhoods themselves can be quite dynamic, with both the introduction of new neighborhood names and some minor shifting of boundaries).
The overall complexity of the task of compiling such databases, the long half-life of the data, and the potentially lucrative uses of such data, seem to explain why it is hard to find non-commercial sources.
This is an old question, but as of June 2015 there is a far better and easier way of doing it:
http://maps.googleapis.com/maps/api/geocode/json?address=YOUR_ADDRESS&sensor=false
Example:
http://maps.googleapis.com/maps/api/geocode/json?address=11%20W%2053rd%20St%20New%20York&sensor=false
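If you go this route, the useful part of the JSON response for neighborhoods is the address_components array. A rough PHP sketch that pulls out the neighborhood and locality entries, where present (not every address returns a neighborhood component, so treat it as optional):

    <?php
    // Rough sketch: call the geocode URL from the example above and pull the
    // "neighborhood" and "locality" entries out of address_components.
    // Not every address returns a neighborhood component.

    $url = 'http://maps.googleapis.com/maps/api/geocode/json'
         . '?address=' . urlencode('11 W 53rd St New York') . '&sensor=false';

    $response = json_decode(file_get_contents($url), true);

    if ($response !== null && $response['status'] === 'OK') {
        foreach ($response['results'][0]['address_components'] as $component) {
            if (in_array('neighborhood', $component['types'])) {
                echo "Neighborhood: " . $component['long_name'] . "\n";
            }
            if (in_array('locality', $component['types'])) {
                echo "City: " . $component['long_name'] . "\n";
            }
        }
    }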
Here's a great site offering free databases for both cities and countries:
http://ipinfodb.com/ip_database.php
Yelp has a neighborhood API.
http://www.yelp.com/developers/documentation/technical_overview
It might be worth checking out some of the links in this article. There are several where you might find the data you're after.
Infochimps has the Zillow Neighborhoods API:
http://www.infochimps.com/datasets/zillow-neighborhoods
Maponics has over 150,000 neighborhoods worldwide available in MySQL and other formats, as well as an API.
Urban Mapping has an API to find neighborhoods by address, City/State, and as you need in your case, Zip Code (called the getNeighborhoodsByPostalCode method).
Here is a link to their demo apps which show how it works:
URBANWARE API Demo Applications
Edit:
Urban Mapping doesn't exist anymore, and the Demo link has linkrot; here's what it did look like, via Wayback Machine
While this isn't a database per se, you could quickly populate your own database by calling their API for every Zip code you'd be interested in seeing.
Note that this is part of their Premium API. If you have the long/lat coordinates of each city, you can use their free API to get a list of neighborhoods whose boundaries contain the long/lat coordinates.

Extracting surveillance camera positions from streetview images

Related to my previous question: is there a realistic chance of extracting surveillance camera positions from Google Street View pictures by means of computer vision algorithms? I'm no expert in that area, but it should be easier than face detection and the like.
I think you're wrong about it being an easier problem than face recognition (though I suspect you mean face detection).
Consider that faces are of a reasonably regular shape and generally have two eyes, a nose and a mouth in a specific configuration, whilst surveillance cameras from one manufacturer will look different from those of another, and will look different from different angles.
With faces, if you can't see the person's face you're not interested in it, but in your scenario you're interested in detecting the camera regardless of its position relative to you.
Whilst not impossible (i.e. humans can do it!), I don't think computer science is quite up to the task just yet...
This sounds like the class of problem for which Amazon's Mechanical Turk was invented. I don't believe that an image processing or image recognition algorithm is within our current understanding and hardware/software capabilities.
I definitely agree with Rob that extracting the camera locations is going to be more difficult than face detection (or even recognition).
How about a different tack on your question: how to find the location of the camera taking surveillance images.
There are standard (if complicated) photogrammetry techniques to map 2D or 3D coordinates of objects using photographs from multiple cameras or a single camera at multiple angles. What you're looking for would be "reverse photogrammetry" which I haven't seen before, but this interesting legal anecdote suggests it's feasible.

Resources