Client wants a "disaster" weather tracker on their website... possible?

Hey guys, I have a client who wants their front-facing site to have a tracker for disaster weather, like hurricanes, when they occur. Essentially they want a way to manage this without getting a developer to do it every time a new storm develops.
I know there are plenty of ways to get weather updates on a website when you have a set location... that's easy. It's different when you don't know the storms in advance! The location would be different every time.
Does anyone have any idea on how to do this?

I would probably monitor an RSS feed.
TWC has some nice RSS feeds set up that look like they will meet your needs: http://www.weather.com/weather/rss/subscription/
The National Hurricane Center also has RSS feeds: http://www.nhc.noaa.gov/aboutrss.shtml
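For illustration (not part of the original answer), here is a minimal Python sketch of that approach: poll one of the feeds on a schedule and hand any new advisory to the site. The feed URL is an assumption picked from the NHC page, so swap in whichever feed fits, and feedparser is a third-party package you would need to install.

```python
# Minimal polling sketch: check an NHC RSS feed and surface any new storm advisories.
# Assumes the third-party `feedparser` package; the feed URL below is illustrative --
# pick the actual feed you want from the NHC/TWC pages linked above.
import time
import feedparser

FEED_URL = "https://www.nhc.noaa.gov/index-at.xml"  # assumed Atlantic-basin feed; verify on the NHC page
seen_ids = set()

def check_feed():
    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries:
        entry_id = entry.get("id", entry.get("link"))
        if entry_id in seen_ids:
            continue
        seen_ids.add(entry_id)
        # Push the new advisory wherever the site reads it from
        # (database row, JSON file, CMS entry, ...).
        print(entry.title, entry.link)

if __name__ == "__main__":
    while True:
        check_feed()
        time.sleep(15 * 60)  # poll every 15 minutes
```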

Fetch live sports data that changes from a website to Google Sheets

I have a pretty difficult question.
I want to fetch live data from a basketball game website. However, this is very tricky, because:
The live games change all the time, and I'm only interested when one team is playing: "BC Copenhagen". (However, if I get more data from different teams, that would work as well.)
When you click on a game in the list, you get taken to a detailed screen about the live game, and that is the data I'm interested in.
So I need to somehow get data from the live games. Is it even possible to do this with Excel commands, or do I need to work with tools like Zapier or Make? I have no idea how to do this.
Do you maybe have to supply the link to the game every time a game starts? That would really not be a reliable solution.
Note: I'm using this data for a Glide app.
Website: https://official.mvpapp.dk/live.php
Update:
I have found a better website for getting the data. If you follow the link below, you can see multiple upcoming games at the top of the screen. I would like to get details about the games that BC Copenhagen plays; the details should be live stats (points, players' points, ...). Is this possible?
Website: https://basketligaen.dk/
You could try IMPORTHTML, but the link provided above shows no data for me. Most likely this requires much more scripting than that. Ideally there would be an API data feed that could be used, but again, I see no data, so I can't say for sure.
A worst-case scenario would be using Google Apps Script to parse the listing of games, grab the game you want, and get the data from that page. That would be out of scope for a Stack Overflow question.
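Purely as an illustration of that last idea (the answer suggests Google Apps Script; this sketch uses Python, and every selector in it is an assumption): parse the listing page, find the game that mentions BC Copenhagen, then fetch its detail page. If the listing is rendered by JavaScript, which the empty IMPORTHTML result suggests, a plain HTTP fetch will not see the data and you would need the site's underlying JSON endpoint or a headless browser instead.

```python
# Parse-the-listing-then-fetch-the-game-page sketch. Selectors are assumptions --
# inspect the real page first. If the data is loaded by JavaScript, this plain
# fetch will come back empty.
import requests
from bs4 import BeautifulSoup

LISTING_URL = "https://official.mvpapp.dk/live.php"
TEAM = "BC Copenhagen"

html = requests.get(LISTING_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Assumption: each live game in the listing is an <a> whose text contains the team names.
game_links = [a["href"] for a in soup.find_all("a", href=True) if TEAM in a.get_text()]

for link in game_links:
    detail_url = requests.compat.urljoin(LISTING_URL, link)
    detail_html = requests.get(detail_url, timeout=10).text
    # Parse points / player stats out of detail_html here, then push them somewhere
    # the Glide app can read them (e.g. a Google Sheet via the Sheets API).
    print(detail_url)
```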

How to add first name and email before uploading a video?

Hi guys, I'm brand new and not a developer, but I need a way for users to upload their video when they visit my site, with an option for them to add their first name and email, so that when the video is uploaded the database keeps all the data together.
Ideally I want this to be as easy as possible for the user. The video would just go to our YouTube channel, although any video platform will work. Any advice would be great!
Please provide more information, like what platform you are using.
There's more than one way to skin a cat.
The simple way to achieve this with web technologies (PHP, Node, Java) is to keep the basic user information in the session and use it whenever necessary.
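For illustration only (not part of the answer above), here is a minimal sketch of the upload form idea in Python/Flask with SQLite; the same pattern applies in PHP, Node, or Java. It stores the file locally and keeps the name and email alongside it in the database; pushing the video on to YouTube or another platform would be a separate step through that platform's API.

```python
# Minimal upload form sketch: first name + email + video file, saved together.
# Illustrative only -- no validation, and files are stored next to the script.
import sqlite3
from flask import Flask, request

app = Flask(__name__)
DB = "uploads.db"

FORM = """
<form method="post" enctype="multipart/form-data">
  First name: <input name="first_name"><br>
  Email: <input name="email" type="email"><br>
  Video: <input name="video" type="file"><br>
  <button type="submit">Upload</button>
</form>
"""

def init_db():
    with sqlite3.connect(DB) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS uploads "
            "(id INTEGER PRIMARY KEY, first_name TEXT, email TEXT, filename TEXT)"
        )

@app.route("/", methods=["GET", "POST"])
def upload():
    if request.method == "POST":
        video = request.files["video"]
        video.save(video.filename)  # in production, sanitize the name and store it somewhere safer
        with sqlite3.connect(DB) as conn:
            conn.execute(
                "INSERT INTO uploads (first_name, email, filename) VALUES (?, ?, ?)",
                (request.form["first_name"], request.form["email"], video.filename),
            )
        return "Thanks, your video and details were saved."
    return FORM

if __name__ == "__main__":
    init_db()
    app.run(debug=True)
```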
You need to get some knowledge about the system you are using. You particularly need:
access to the server
to know the server type
access to the database
to know the database type
where the relevant files are
After you have gathered all this information, you at least know what you do not know. The next step is to gather information about how you can implement the feature you need. Look at it like a puzzle with many small pieces. If you are patient enough, in the end you will solve the puzzle.

Do you have to host your data with MapQuest?

From what I've read so far, it seems like the only way for me to map custom data points from my own dataset is to host that data with MapQuest. Am I correct in that or have I just not read deep enough?
And if it's possible, does anyone have a link to more information about how to go about it? Their API documentation is subpar.
Thanks :)
Disclaimer: I work at MapQuest
While the MapQuest Data Manager makes it easy to store custom data with MapQuest so that you can query it through the Search API, you don't have to store data with us in order to show custom points on a map.
Are you trying to do something along the lines of storing data in MySQL or PostgreSQL, then using something like PHP to query your own database, looping through the results, and showing them on a MapQuest map using the JavaScript API? Unfortunately I don't have any easy/quick examples that show how to do that, but it is possible.
The forums on the Developer Network are also a good place to look to see if others have had issues similar to the one that you are facing.
Also, let me know exactly which MapQuest APIs/tools you are using and I will do my best to provide more information depending on what you need.
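For what it's worth (this is not from the answer above), here is a generic sketch of the server side of that pattern: query your own database and return the points as JSON for the page's MapQuest JavaScript code to plot. The answer mentions PHP and MySQL; this uses Python/Flask and SQLite purely for illustration, and the MapQuest-specific marker calls are omitted because they depend on which version of the JavaScript API you are using.

```python
# Serve custom points from your own database; the map page would fetch /points
# and add one marker per row via the MapQuest JavaScript API (not shown here).
# The table name and columns are assumptions for the sketch.
import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/points")
def points():
    with sqlite3.connect("places.db") as conn:  # assumed schema: places(name, lat, lng)
        rows = conn.execute("SELECT name, lat, lng FROM places").fetchall()
    return jsonify([{"name": name, "lat": lat, "lng": lng} for name, lat, lng in rows])

if __name__ == "__main__":
    app.run(debug=True)
```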

Need ideas on retrieving data from a website

I'm stumped and need some ideas on how to do this or even whether it can be done at all.
I have a client who would like to build a website tailored to English-speaking travelers in a specific country (Thailand, in this case). The different modes of transportation (bus & train) have good websites for providing their respective information, and both are very static in terms of the data they present (the schedules rarely change). Here's one of the sites I would need to get info from: train schedules. The client wants to give users the ability to search for a beginning and end location and determine, using the external websites' information, how they can best get there, presenting them with a route and schedule times for the chosen modes of transport.
Now, in my limited experience, I would think the way to do that would be to retrieve the original schedule info from the external site's server (via an API or some other means) and retain the info in a database, which can be queried as needed. Our first thought was to contact the respective authorities to determine how/if this can be done, but this has proven problematic, mainly due to the language barrier.
My client suggested what is basically "screen scraping", but that sounds like it would be complicated at best: downloading the web page(s) and filtering through the HTML for the relevant/necessary data to put into the database. My worry is that the info on these mainly static sites is so static that the data isn't even kept in a database to build the page, and the web page itself is updated (hard-coded) when something changes.
I could really use some help and suggestions here. Thanks!
Screen scraping is always problematic IMO as you are at the mercy of the person who wrote the page. If the content is static, then I think it would be easier to copy the data manually to your database. If you wanted to keep up to date with changes, you could then snapshot the page when you transcribe the info and run a job to periodically check whether the page has changed from the snapshot. When it does, it sends an email for you to update it.
The above method could also be used in conjunction with some sort of screen scraper, which could fall back to a manual process if the page changes too drastically.
Ultimately, it is a case of how much effort (cost) your client is willing to bear for accuracy.
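A sketch of that snapshot-and-notify idea (not from the answer; the URL and mail settings are placeholders): store a hash of the schedule page, re-fetch it on a schedule (e.g. from cron), and email yourself when it no longer matches.

```python
# Snapshot-and-notify sketch: hash the schedule page and send an email when it changes.
# PAGE_URL and the SMTP details are placeholders.
import hashlib
import pathlib
import smtplib
from email.message import EmailMessage

import requests

PAGE_URL = "https://example.com/train-schedule"  # placeholder: the page you transcribed
SNAPSHOT = pathlib.Path("schedule.sha256")

def notify(url):
    msg = EmailMessage()
    msg["Subject"] = f"Schedule page changed: {url}"
    msg["From"] = "monitor@example.com"           # placeholder addresses / mail server
    msg["To"] = "you@example.com"
    msg.set_content("The page no longer matches the snapshot -- re-check the transcribed data.")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

def check():
    digest = hashlib.sha256(requests.get(PAGE_URL, timeout=30).content).hexdigest()
    if SNAPSHOT.exists() and SNAPSHOT.read_text() != digest:
        notify(PAGE_URL)
    SNAPSHOT.write_text(digest)

if __name__ == "__main__":
    check()  # run this from cron or a scheduled task
```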
I have done this for the following site: http://www.buscatchers.com/ so it's definitely more than doable! A key feature of a web scraping solution for travel sites is that it must send you emails if anything goes wrong during the scraping process. On the site, I use a two-day window so that I have two days to fix the code if the design changes. Only once or twice have I had to change my code, and it's very easy to do.
As for examples: there is some simplified source code here: http://www.buscatchers.com/about/guide. The full source code for the project is here: https://github.com/nicodjimenez/bus_catchers. This should give you some ideas on how to get started.
I can tell that the data is dynamic; it's too well structured. It's not hard for someone who is familiar with XPath to scrape this site.
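Purely for illustration (the linked repository has the real code), an XPath scrape with Python's lxml; the URL and the XPath expression here are placeholders, so inspect the actual schedule page to find the right ones.

```python
# XPath scraping sketch: pull rows out of a schedule table.
# The URL and the XPath expression are placeholders.
import requests
from lxml import html

page = requests.get("https://example.com/schedule", timeout=30)  # placeholder URL
tree = html.fromstring(page.content)

# Placeholder XPath: every row of the first table on the page.
for row in tree.xpath("//table[1]//tr"):
    cells = [cell.text_content().strip() for cell in row.xpath("./td")]
    if cells:
        print(cells)  # insert into your database instead of printing
```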

Where to get an updated list of video games?

I am currently designing a reviews site for video games, similar to GameSpot, and I am wondering whether there is an online database with an API that contains information such as name, publisher, release date, etc. I don't really want to have to enter each title manually or let users enter the title manually.
Where do these large sites get information like this? I wouldn't think it would be entered manually. I know IMDb exists for movies.
How would I go about adding it to my database?
Thanks
May I point you to web scraping?
Be sure to read the sections on legal issues and on well-behaved bots.
There's always Amazon and their Product Advertising API. Some older but interesting code snippets can be found on this page.
If you know Perl, there is an amazing module called WWW::Mechanize.
You can pretty much write a script to visit any website and grab any data you need.
So, for example, you can go to www.gamespot.com, get a list like the one below, and put it in your database.
http://www.gamespot.com/games.html?platform=1029&mode=all&sort=views&dlx_type=all&sortdir=asc&official=all&tag=games%3Bfooter%3Bmore
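The answer names Perl's WWW::Mechanize; just to illustrate the same "fetch the listing and pull out the titles" idea, here is a Python sketch (for consistency with the other examples). The selector is an assumption, and the legal/robots caveats from the web scraping answer above still apply.

```python
# Fetch a games listing and pull out the titles. The selector is an assumption --
# inspect the listing page's HTML -- and the listing URL is simplified from the one above.
import requests
from bs4 import BeautifulSoup

LISTING_URL = "http://www.gamespot.com/games.html"  # simplified; see the full query-string URL above

html = requests.get(LISTING_URL, timeout=30, headers={"User-Agent": "my-reviews-site-bot"}).text
soup = BeautifulSoup(html, "html.parser")

# Assumption: each game in the listing is a link whose text is the title.
titles = [a.get_text(strip=True) for a in soup.select("a") if a.get_text(strip=True)]

for title in titles:
    # Insert into your own database here; publisher, release date, etc. would come
    # from the game's detail page or another source.
    print(title)
```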
