I work for a small publishing company with an internal website that displays a static HTML table of our published products.
We need to be able to list and sort published products dynamically (about 1-2 items are published per day), with the data fed from an Excel spreadsheet. The Excel spreadsheet is what we currently use to maintain the data, and it sits on a shared network drive available to the whole company.
I am familiar with AngularJS, ReactJS, and VueJS2 for front-end development and was wondering whether I could use one of those tools to consume an Excel file, parse it to JSON, and then display it dynamically on the client side.
Is something like this possible?
When a user finishes editing the Excel sheet and saves it to the shared network drive, is there a script that could automatically save the data as JSON? I assume we would then simply have our JavaScript framework reference and consume the saved JSON to populate its list of published products.
Note: We are unable to use a relational database (e.g. MySQL) at this time.
Part 1 - generating JSON from Excel...
Front-end technologies are not the way to go here. You need to run a service that watches the folder for changes (Node.js or Python can do this). Saving as CSV instead of XLS might make things easier, as you may not need extra libraries to make sense of the XLS file.
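Here is a minimal sketch of such a watcher in Python, using the watchdog and pandas libraries (pandas needs openpyxl for .xlsx files); the network path, spreadsheet name, and output file are placeholders for your setup.

```python
import json
import pandas as pd
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

WATCH_DIR = r"\\shared-drive\publishing"        # placeholder network path
SOURCE_FILE = "published_products.xlsx"         # placeholder spreadsheet name
OUTPUT_JSON = "published_products.json"         # placeholder output file

class SpreadsheetHandler(FileSystemEventHandler):
    def on_modified(self, event):
        # Re-export the whole sheet as JSON whenever the spreadsheet is saved.
        if event.src_path.endswith(SOURCE_FILE):
            df = pd.read_excel(event.src_path)
            with open(OUTPUT_JSON, "w", encoding="utf-8") as f:
                json.dump(df.to_dict(orient="records"), f, indent=2, default=str)

if __name__ == "__main__":
    observer = Observer()
    observer.schedule(SpreadsheetHandler(), WATCH_DIR, recursive=False)
    observer.start()
    observer.join()   # block until the watcher is stopped (e.g. Ctrl+C)
```

Every save re-exports the whole table, which is fine at 1-2 new products per day.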
Part 2 - displaying JSON data...
Your browser, by default, cannot load a local JSON file, so you will need to run a server (again, Node.js and Python make this relatively easy) to host your JSON file.
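For something this small, Python's built-in http.server is enough (running python -m http.server 8000 from the folder that holds the JSON also works from the command line). A scripted sketch, with the port and directory as assumptions:

```python
import functools
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Serve everything in the dashboard folder, including the exported JSON file.
handler = functools.partial(SimpleHTTPRequestHandler, directory="/srv/dashboard")  # placeholder path
HTTPServer(("0.0.0.0", 8000), handler).serve_forever()
```

Your front-end framework can then fetch the JSON over HTTP and render the table.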
There are many ways of presenting data these days, but without knowing your particular requirements, and going only on the information you did share, it looks like you've got a steep learning curve to get something like this going.
Related
I have a Python script that fetches data twice a day from a server of mine. The script returns around 40 JSON files containing various data. The files aren't particularly big; the combined size of all the files is around 250 KB.
Alongside my script I am developing a dashboard in React that renders the data from each file into a table, giving me a visual representation of the data.
I have been looking at what would be the best way to store these files, something that allows me to upload and fetch them twice a day.
Someone suggested using MongoDB to store the files, but after some research I feel like Mongo is better at storing the contents of a file rather than the file itself. I tried to put a solution together, but I couldn't figure out how it would work when each object is stored as a document with no clear way (to me) to tell which document came from which file.
Other options I have considered are:
Storing the files on the server that is hosting my React project and rendering them locally as I am doing now during development
Storing the files using a provider such as AWS/Firebase
Storing them in a different database (I see SQL databases now support storing JSON)
Are there any other solutions that you think would work best for this scenario? If so, why?
Hello,
Consider using an FTP server.
We have clients that send us data every 10 minutes via FTP as XML files, and I have a Node.js back-end that reads those files.
You can use the same approach for your scenario with JSON files.
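As an illustration of the pickup side, here is a minimal sketch in Python (the same idea works in Node.js); the host, credentials, and directory are placeholders:

```python
import json
from ftplib import FTP

with FTP("ftp.example.com") as ftp:          # placeholder host
    ftp.login("dashboard", "secret")         # placeholder credentials
    ftp.cwd("/incoming")                     # placeholder directory
    for name in ftp.nlst():
        if name.endswith(".json"):
            chunks = []
            ftp.retrbinary(f"RETR {name}", chunks.append)
            data = json.loads(b"".join(chunks).decode("utf-8"))
            # ...store or render `data` here...
```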
My basic workflow is this: I check an FTP server for a specific file. If the file exists, I pick it up and send it to Blob Storage. My problem is this: I want to filter the file content, e.g. remove the first and last rows, since they don't contain any real data, before I send it to the blob. The first row consists of a time stamp and the last row contains a "row count". The file contains comma-separated fields. How do I accomplish this? Is it even possible?
Thanks
Ausgar
There is no simple solution for this problem. You can try converting the CSV to JSON, deleting the unnecessary data from it, and creating a blob based on that JSON, but this sounds harder than it should be.
Consider using Azure Functions:
Azure Functions allows you to run small pieces of code (called
"functions") without worrying about application infrastructure. With
Azure Functions, the cloud infrastructure provides all the up-to-date
servers you need to keep your application running at scale.
It will be much easier to do such file manipulation there.
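Whatever hosts the code, the trimming itself is tiny. A sketch in Python, assuming the file arrives as plain text with one record per line:

```python
def trim_csv(content: str) -> str:
    # Drop the first row (time stamp) and the last row (row count) before uploading.
    lines = content.splitlines()
    return "\n".join(lines[1:-1])
```

Inside an Azure Function you would run this between reading the file from the FTP server and writing the result to Blob Storage.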
I am a complete newbie when it comes to web page design. What I am trying to achieve is a web page that I can display on a wall-mounted screen as an office dashboard. I have data in Excel that is constantly being updated (on the server), and I want to be able to summarise this and display it (e.g. total orders) for staff to see. The web page therefore needs to connect to the data source and update itself every few minutes. I am hoping to use Ubuntu or even a Raspberry Pi to drive the dashboards.
Can anybody point me towards either some clear instructions on how to achieve this, or better still some sample files that will help me see how it's done?
Really appreciate any help!!
If you want to use PHP, you can use PHPExcel to read your Excel files. If you only want to display the information as-is, you can output it to HTML directly; there is also an API for manipulating the Excel file, so you can use that to summarise your data. However, if you need something cleaner, you may want to use Windows instead of Linux, because on Windows you can use an Excel file as a data source, and there are third-party products that can treat it as if it were a database (using SQL queries to retrieve data).
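Purely as an illustration of the same read-summarise-emit-HTML idea (sketched in Python with pandas rather than PHP; the workbook name and the columns used are assumptions):

```python
import pandas as pd

df = pd.read_excel("orders.xlsx")                      # hypothetical workbook on the server
summary = f"<h1>Total orders: {len(df)}</h1>\n"
summary += df.head(20).to_html(index=False)            # first rows rendered as an HTML table

with open("dashboard.html", "w", encoding="utf-8") as f:
    f.write(summary)
```

Regenerate the page on a schedule (for example with cron) and have the wall-mounted browser refresh itself every few minutes.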
I have a project as follows:
A user uploads an Excel file to the server, and the server returns it with 2 new columns added. The user wants us to check the prices being charged, and we have a file that holds the average standard pricing.
In the desktop application I have just finished, I use Microsoft.Office.Interop.Excel to manipulate the Excel file.
But that is not available in Silverlight. Reading is not the issue: the program reads the Excel file using OleDb, which is very light and is available on the web. The issue is adding the 2 new columns. For that I use the Microsoft.Office.Interop.Excel library that Microsoft provides, and it is not available on the web.
I need to work out how we can do this.
One possibility is to have a program on the server that waits for a file, processes it, and emails the result back to the user.
I just want to see if there is another way; I don't like this approach and it doesn't seem like the best option.
You have a few options for doing this with Silverlight. First, you can use the Excel XML format for the files, which means adding a column is just an XML exercise. Second, if that doesn't work, you can upload the file to the server and run the same code you have in your desktop app to update the file. Once it is updated, you can prompt the user to save the file back to their hard drive.
If you go the Excel XML route, you will need to create a web service to get the price data from your database out to the Silverlight client. OleDb won't work, since you don't want to expose your database via OleDb over the Internet.
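Purely as an illustration of how small the column-adding step is once the file is in the XML-based .xlsx format, here is a sketch using Python's openpyxl (not Silverlight code; the sheet layout, column positions, and pricing table are assumptions):

```python
from openpyxl import load_workbook

AVERAGE_PRICES = {"WIDGET-01": 9.50, "WIDGET-02": 12.00}   # placeholder standard pricing

wb = load_workbook("uploaded_prices.xlsx")                 # hypothetical uploaded file
ws = wb.active
avg_col = ws.max_column + 1                                # first new column
diff_col = avg_col + 1                                     # second new column
ws.cell(row=1, column=avg_col, value="Average price")
ws.cell(row=1, column=diff_col, value="Difference")

for row in range(2, ws.max_row + 1):
    sku = ws.cell(row=row, column=1).value                 # assumed: column 1 holds the product code
    charged = ws.cell(row=row, column=2).value             # assumed: column 2 holds the price charged
    average = AVERAGE_PRICES.get(sku)
    ws.cell(row=row, column=avg_col, value=average)
    if charged is not None and average is not None:
        ws.cell(row=row, column=diff_col, value=charged - average)

wb.save("uploaded_prices_checked.xlsx")
```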
I am interested in creating a video database. My goal is to have a folder where my videos are kept, and each time I copy or delete a video, the website that presents them should be updated too. The problem is I have no idea how to approach it.
Should I..
Use SQL and store a reference to each video's location?
Have a script that constantly checks whether anything has changed in that folder?
Use a package like Joomla?
I am using Ubuntu, by the way. I already have a simple HTML5 page, and I am presenting the videos using HTML5 video.
It depends on the size and the performance you want.
Way 1: use PHP to scan the folder and generate the links on the fly.
Way 2: use a database to store the file names, retrieve the names from the database, and generate the URLs.
Pros and cons:
Way 1 is simple to implement, requires no changes to the upload or download script, and needs no database.
Way 2 requires a database, plus a little coding for the upload and for generating the page.
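A minimal sketch of Way 1, shown in Python for illustration (the answer suggests PHP, and the folder path is an assumption):

```python
from pathlib import Path

VIDEO_DIR = Path("/var/www/html/videos")   # hypothetical folder served by the web server

def render_video_list() -> str:
    # Scan the folder and emit one <video> tag per file, on the fly.
    tags = []
    for path in sorted(VIDEO_DIR.glob("*.mp4")):
        tags.append(f'<video controls src="/videos/{path.name}"></video>')
    return "\n".join(tags)

print(render_video_list())
```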
You should make a DB (the format does not matter) and store in it only the file names of the videos; the videos themselves stay on the hard drive.
Any operation on the web site would first go to the DB to insert/update/delete the video records and then (perhaps in a transaction context) touch the file system.
This would be the standard approach to your question.
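A minimal sketch of this approach using SQLite from Python's standard library; the paths and the example file name are placeholders:

```python
import sqlite3

DB_PATH = "videos.db"   # placeholder database file

conn = sqlite3.connect(DB_PATH)
conn.execute("CREATE TABLE IF NOT EXISTS videos (id INTEGER PRIMARY KEY, filename TEXT UNIQUE)")

def add_video(filename: str) -> None:
    # Insert the record first, then copy the file into the videos folder.
    conn.execute("INSERT OR IGNORE INTO videos (filename) VALUES (?)", (filename,))
    conn.commit()

def list_videos() -> list[str]:
    # The page is rendered from these names, e.g. as HTML5 <video> tags.
    return [row[0] for row in conn.execute("SELECT filename FROM videos ORDER BY filename")]

add_video("holiday.mp4")
print(list_videos())
```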