I downloaded these databases for US and CA from GeoNames. The data looks like this:
5881639 100 Mile House 100 Mile House 51.64982 -121.28594 P PPL CA 02 0 917 America/Vancouver 2006-01-18
5881640 101 Mile Lake 101 Mile Lake 51.66652 -121.30264 H LK CA 02 0 917 America/Vancouver 2006-01-18
5881641 101 Ponds 101 Ponds 47.811 -53.97733 H PNDS CA 05 0 18 America/St_Johns 2006-01-18
I want to use this data for a city-picker, but I want to display the province or state beside each city. It doesn't look like this data contains that information. Is there some way to retrieve it? Or is there a better DB that includes it?
Use the admin code columns: they are actually IDs that link to the admin codes table (separate data sets for the admin codes are available for download), so the join is straightforward; a sketch follows below.
Check the Geonames forums for more info.
http://forum.geonames.org/
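For example, a rough sketch in Node.js. It assumes you've also downloaded admin1CodesASCII.txt from the GeoNames dump, and that the column positions match the readme (country code in column 8, admin1 code in column 10, 0-based):

var fs = require('fs');

// admin1CodesASCII.txt lines look like: "CA.02<TAB>British Columbia<TAB>British Columbia<TAB>5909050"
var admin1 = {};
fs.readFileSync('admin1CodesASCII.txt', 'utf8').split('\n').filter(Boolean).forEach(function (line) {
  var cols = line.split('\t');
  admin1[cols[0]] = cols[1];                        // e.g. admin1['CA.02'] = 'British Columbia'
});

// Join each row of the country dump to its admin1 (state/province) name
fs.readFileSync('CA.txt', 'utf8').split('\n').filter(Boolean).forEach(function (line) {
  var cols = line.split('\t');
  var city = cols[1];
  var province = admin1[cols[8] + '.' + cols[10]] || '';
  console.log(city + ', ' + province);              // "100 Mile House, British Columbia"
});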
Use the datasets at geocoder.ca, which include the city name and state/province name in the same file.
If you want to stick with your data, you can use Google's Geocoding API, as in the first answer here:
Google Maps: how to get country, state/province/region, city given a lat/long value?
to get information based on latitude and longitude. This will be a lot of work, though, especially for a city-picker.
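As a rough sketch of that approach (the endpoint and the administrative_area_level_1 component come from Google's documentation; you need your own API key, and jQuery is used here only for brevity):

function getProvince(lat, lng, apiKey, callback) {
  var url = 'https://maps.googleapis.com/maps/api/geocode/json?latlng=' + lat + ',' + lng + '&key=' + apiKey;
  $.getJSON(url, function (data) {
    var results = data.results || [];
    for (var i = 0; i < results.length; i++) {
      var comps = results[i].address_components;
      for (var j = 0; j < comps.length; j++) {
        // administrative_area_level_1 is the state/province component
        if (comps[j].types.indexOf('administrative_area_level_1') !== -1) {
          return callback(comps[j].long_name);      // e.g. "British Columbia"
        }
      }
    }
    callback(null);
  });
}

// getProvince(51.64982, -121.28594, 'YOUR_KEY', function (p) { console.log(p); });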
We have a data dilemma involving our products and how they're set up in our ERP system.
When a product is revised, we end up creating a new part number and changing the name slightly.
Here is how it shows up:
Name                           QtySold
Product 1                      10
Product 1 V2                   08
Product 1 V3                   06
Product 1 V3 With Accessories  12
All of these product names vary in length, otherwise I'd use the LEFT() function to group them together.
How can I write a query that will take anything that is obviously related (to the naked eye) and group and summarize them together?
The ideal output would be something like this:
Name       QtySold
Product 1  36
I want to show the results of a MySQL query on my website using angularjs. For now, I'm showing them in a simple table with ng-repeat and it works with no problem. But because there is a lot of data, I wanted to ask if it is possible to create multiple panels or tables per specific field.
To be more specific, I have 4 fields returned from the query: name, address, occupation, department. Right now I have a table such as:
George Smith Nikis 10 Project Manager Finance
Maria Bexley Lincoln 20 Project Manager Research
Chris Liggs Forks 123 Programmer Computer Science
etc. I want to know if I can create as many panels or tables as there are unique values of the "occupation" field, and then show the results for each unique value inside its own panel/table. So instead of the above table I would have something like:
Project Manager
George Smith Nikis 10 Finance
Maria Bexley Lincoln 20 Research
Programmer
Chris Liggs Forks 123 Computer Science
I think you need to use the groupBy filter.
Check this fiddle by Darryl Rubarth; it contains the answer you need:
http://jsfiddle.net/drubarth/R8YZh/
<div ng-repeat="item in MyList | orderBy:'groupfield' | groupBy:'groupfield'" >
You can use the groupBy filter in ng-repeat with the field you need to group by.
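As a rough sketch, assuming the angular.filter module (which provides groupBy) is loaded and your query results are in a scope array called people with the fields from your table, the markup could look something like this:

<div ng-repeat="(occupation, group) in people | groupBy:'occupation'">
  <h3>{{ occupation }}</h3>
  <table>
    <tr ng-repeat="person in group">
      <td>{{ person.name }}</td>
      <td>{{ person.address }}</td>
      <td>{{ person.department }}</td>
    </tr>
  </table>
</div>

Each unique occupation becomes its own heading and table, with the matching rows repeated inside it.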
Date        State       City      Zip   Water    Weight
----------  ----------  --------  ----  -------  ------
01/01/2016  Arizona     Chandler  1011  10 ltr   40 kg
01/04/2016  Arizona     Mesa      1012  20 ltr   50 kg
06/05/2015  Washington  Spokane   1013  30 ltrs  44 kg
06/08/2015  Washington  Spokane   1013  30 ltrs  44 kg
What I want are complex queries: for example, the average water and weight for a given city, state or zip, over a date range or month, or by any field or all fields.
I am not sure how to go about this. I have read about MapReduce, but I can't figure out how to get the above output.
If you have links to examples that cover the above scenarios, that would also help.
Thanks in advance
So first we need to model your structured data in JSON. Something like this would work:
{
  "date": "2016-01-01",
  "location": "Arizona Chandler",
  "pressure": 1101,
  "water": 10,
  "weight": 40
}
Here's your data in a Cloudant database:
https://reader.cloudant.com/so37613808/_all_docs?include_docs=true
Next we'd need to create a MapReduce view to aggregate a specific field by date. A map function to create an index whose key is the date and whose value is the water would look like this:
function(doc) {
  emit(doc.date, doc.water);
}
Every key/value pair emitted from the map function is added to an index which can be queried later in its entirety or by a range of keys (keys which in this case represent a date).
And if an average is required we would use the built-in _stats reducer. The Map and Reduce portions are expressed in a Design Document like this one: https://reader.cloudant.com/so37613808/_design/aggregate
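As a sketch (using the field names modelled above and the view name from the URLs below), the design document pairs that map function with the built-in _stats reducer:

{
  "_id": "_design/aggregate",
  "views": {
    "waterbydate": {
      "map": "function(doc) { emit(doc.date, doc.water); }",
      "reduce": "_stats"
    }
  }
}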
The subsequent index allows us to get an aggregate across the whole data set with:
https://reader.cloudant.com/so37613808/_design/aggregate/_view/waterbydate
Dividing the sum by the count gives us an average.
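For example, with the four sample rows above, querying the reduced view would return something like this, giving an average water value of 90 / 4 = 22.5:

{"rows":[
  {"key": null, "value": {"sum": 90, "count": 4, "min": 10, "max": 30, "sumsqr": 2300}}
]}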
We can use the same index to provide data grouped by keys too:
https://reader.cloudant.com/so37613808/_design/aggregate/_view/waterbydate?group=true
Or we can select a portion of the data by supplying startkey and endkey parameters:
https://reader.cloudant.com/so37613808/_design/aggregate/_view/waterbydate?startkey=%222016-01-01%22&endkey=%222016-06-03%22
See https://docs.cloudant.com/creating_views.html for more details.
How do I get the carrier name from the Orders API of Amazon MWS for an MFN order that has already shipped? Is it possible to do this using the Orders API?
I don't think there currently is any API to retrieve tracking information (or even just the carrier name) through MWS.
For the sake of completeness: To submit shipping information including carrier names and tracking numbers, you can use the SubmitFeed API with FeedType=_POST_ORDER_FULFILLMENT_DATA_. The corresponding XSD (OrderFulfillment.xsd) defines the following values as valid CarrierCodes: USPS, UPS, FedEx, DHL, Fastway, GLS, GO!, Hermes Logistik Gruppe, Royal Mail, Parcelforce, City Link, TNT, Target, SagawaExpress, NipponExpress, YamatoTransport . All other carriers must use the CarrierName field.
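For illustration only (the order ID, date and tracking number here are made up), a fulfillment feed message with a carrier code looks roughly like this:

<?xml version="1.0" encoding="UTF-8"?>
<AmazonEnvelope>
  <Header>
    <DocumentVersion>1.01</DocumentVersion>
    <MerchantIdentifier>YOUR_MERCHANT_ID</MerchantIdentifier>
  </Header>
  <MessageType>OrderFulfillment</MessageType>
  <Message>
    <MessageID>1</MessageID>
    <OrderFulfillment>
      <AmazonOrderID>123-1234567-1234567</AmazonOrderID>
      <FulfillmentDate>2016-06-01T12:00:00Z</FulfillmentDate>
      <FulfillmentData>
        <CarrierCode>UPS</CarrierCode>
        <ShippingMethod>Ground</ShippingMethod>
        <ShipperTrackingNumber>1Z9999999999999999</ShipperTrackingNumber>
      </FulfillmentData>
    </OrderFulfillment>
  </Message>
</AmazonEnvelope>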
While I'm sure it's available in MWS, it would be much easier to use the carriers shipping number structure to determine the carrier.
For example:
FedEx can have 12- or 15-digit tracking numbers, and the barcode can be 22 digits.
UPS tracking numbers start with 1Z.
The USPS format is 20 digits (e.g. 9999 9999 9999 9999 9999), or a combination of 13 alphanumeric characters, usually starting with two letters, followed by nine digits, and ending with "US" (e.g. EA 999 999 999 US).
Build regular expressions to handle these easily.
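A rough JavaScript sketch, using only the simplified patterns described above (real tracking formats overlap, so treat it as a heuristic):

function guessCarrier(trackingNumber) {
  var t = trackingNumber.replace(/\s+/g, '').toUpperCase();
  if (/^1Z/.test(t)) return 'UPS';                         // UPS numbers start with 1Z
  if (/^(\d{12}|\d{15}|\d{22})$/.test(t)) return 'FedEx';  // 12/15-digit numbers, 22-digit barcodes
  if (/^\d{20}$/.test(t)) return 'USPS';                   // 20-digit USPS format
  if (/^[A-Z]{2}\d{9}US$/.test(t)) return 'USPS';          // e.g. EA999999999US
  return 'Unknown';
}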
I need to scrape Form 10-K reports (i.e. annual reports of US companies) from SEC website for a project.
The trouble is, companies do not use the exact same format for filing this data. So, for example, real estate data for two different companies could be displayed as below.
1st company
Property name  State  City     Ownership  Year  Occupancy  Total Area
-------------  -----  -------  ---------  ----  ---------  ----------
ABC Mall       TX     Dallas   Fee        2007  97%        1,347,377
XYZ Plaza      CA     Ontario  Fee        2008  85%        2,252,117
2nd company
Property       % Ownership  % Occupancy  Rent   Square Feet
-------------  -----------  -----------  -----  -----------
New York City
ABC Plaza      100.0%       89.0%        38.07  2,249,000
123 Stores     100.0%       50.0%        18.00  1,547,000
Washington DC Office
12th street .......
2001, J Drive .......
etc.
Likewise, the data layout could be entirely different for other companies.
I would like to know if there are better ways to scrape this type of heterogeneous data other than writing complex regex searches.
I have the liberty to use Java, Perl, Python or Groovy for this work.
I'd be inclined to keep a library of meta files that describe the layout of each page you want to scrape data from, and use them when extracting the data.
That way you don't need complex regexes, and if a site changes its design you only need to update a single one of your files.
How you decide to create the meta files is up to you, but things like pertinent class names or tags might be a good start, together with a description of how to extract the data from each tag.
I'm not sure if there is a tool out there that does all of that.
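As a rough sketch of the idea in JavaScript (the same approach ports to Java, Perl, Python or Groovy; cheerio is used here purely as an example HTML parser, and the selectors and column positions are made up):

var cheerio = require('cheerio');

// One small "meta file" per filer, describing where its property table lives
// and which column holds which field.
var metaFiles = {
  'company-abc': {
    tableSelector: 'table.property-summary',
    columns: { name: 0, state: 1, city: 2, occupancy: 5, totalArea: 6 }
  },
  'company-xyz': {
    tableSelector: 'table#real-estate',
    columns: { name: 0, ownership: 1, occupancy: 2, squareFeet: 4 }
  }
};

// A single generic extractor then uses the descriptor instead of per-site regexes.
function extractProperties(html, meta) {
  var $ = cheerio.load(html);
  var rows = [];
  $(meta.tableSelector).find('tr').each(function (i, tr) {
    var cells = $(tr).find('td').map(function (j, td) { return $(td).text().trim(); }).get();
    if (!cells.length) return;                        // skip header-only rows
    var record = {};
    Object.keys(meta.columns).forEach(function (field) {
      record[field] = cells[meta.columns[field]];
    });
    rows.push(record);
  });
  return rows;
}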
The other, nicer, way might be to contact the owners of these sites and see if they provide a feed in the form of a WebService or something that you can use to get the data. Saves a lot of heartache I should think.