Best way to serve uploaded multimedia content

I am currently developing a website that allows users to upload images and other files, and at this point I have a couple of questions.
First, where should those files be stored? I was thinking of storing them in the file system.
Second, what is the best way to serve uploaded multimedia files? If the file system is used as the storage medium, should static routes be configured on the server, or is there a better alternative?

Where should the files be stored?
If the number of files is small, you can store them in the filesystem. However, if the number is large, I would recommend a storage service like AWS S3. The process can work either way: store the uploaded file temporarily in the filesystem and then upload it to S3, or upload it directly to S3. It depends on the use case.
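A minimal sketch of the stage-then-upload approach, assuming boto3 is installed and AWS credentials are configured; the bucket name and paths are placeholders:

```python
import mimetypes


def guess_content_type(filename):
    """Best-effort Content-Type from the file extension."""
    ctype, _ = mimetypes.guess_type(filename)
    return ctype or "application/octet-stream"


def upload_to_s3(local_path, bucket, key):
    """Push a file that was staged on the local filesystem to S3."""
    # Imported here so the content-type helper works without boto3 installed.
    import boto3

    s3 = boto3.client("s3")
    s3.upload_file(
        local_path,
        bucket,
        key,
        ExtraArgs={"ContentType": guess_content_type(key)},
    )


# upload_to_s3("/tmp/upload-tmp-123.jpg", "my-uploads-bucket", "images/photo.jpg")
```

Setting the content type at upload time matters because S3 echoes it back as the `Content-Type` header when the file is later served.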
What is the most correct way to serve uploaded multimedia files?
If you are using a service like S3, you just have to set the content type, expiration, and other metadata while uploading, and S3 will take care of the rest. If you are storing the data on the filesystem, you can use nginx or Apache to serve the static files with the proper content type and other metadata.

Related

What is the best way to store data from my repository?

I'm building a website with a "repository" where users can download and upload several files of different sizes. My idea is to store the metadata (extension, some tags, and the path where the file is stored) in my database, and then let users access or search this data through the application. My problem is that I don't know the best way to store the files themselves. I thought of using Google Drive, S3, or Dropbox through their APIs, but I would like to know other ways to make this possible. I know it's a complex subject, but I would like you to point me to where I can study it.
For storing the metadata you can use MongoDB, keyed by the file ID, and for storing the large files themselves you can use the Google Drive API or some kind of file hosting service.
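To make the metadata idea concrete, here is a sketch of the kind of document you might keep per file. The field names are illustrative, not a fixed schema; with pymongo you would pass the dict to `collection.insert_one(...)`:

```python
import mimetypes
import os
from datetime import datetime, timezone


def build_file_metadata(path, storage_id, tags):
    """Metadata document to store alongside the externally hosted file."""
    ctype, _ = mimetypes.guess_type(path)
    return {
        "filename": os.path.basename(path),
        "extension": os.path.splitext(path)[1].lstrip("."),
        "content_type": ctype or "application/octet-stream",
        "tags": tags,
        "storage_id": storage_id,  # id returned by Drive/S3/the hosting API
        "uploaded_at": datetime.now(timezone.utc).isoformat(),
    }
```

Keeping the storage ID in the document is what lets you search in the database but fetch the bytes from the hosting service.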

How to store files uploaded from a website or an HTML form?

I am creating a digital library website. The website is required to store uploaded files in a database and retrieve them from it. I have found a way to store the uploaded files in Google Cloud. Is there a better way than this?

Creating and serving temporary HTML files in Azure

We have an application that we would like to migrate to Azure for scale. There is one thing that concerns me before starting, however:
We have a web page that the user is directed to. The code-behind on the page goes out to the database and generates an HTML report. The new HTML document is placed in a temporary file along with a bunch of charts and other images. The user is then redirected to this new page.
In Azure, we can never be sure that the user is going to be directed to the same machine for multiple reasons: the Azure load balancer may push the user out to a different machine based on capacity, or the machine may be deprovisioned because of a problem, or whatever.
Because these are only temporary files that get created and deleted very frequently, I would ideally like to just point my application's temp directory to some kind of shared drive that all the web roles have read/write access to, and then be able to map a URL to this shared drive. Is that possible? Or is this going to be more complicated than I would like?
I can still have every instance write to its own local temp directory as well. It only takes a second or two to serve them, so I'm OK with the risk of that instance going down during that window. The question in this regard is whether the redirect to the temp HTML file will use HTTP 1.1 and maintain the connection to that specific instance.
There are two things you might want to look at:
Use Windows Azure Web Sites, which supports a kind of distributed filesystem (based on blob storage). Files you store "locally" in your Windows Azure Web Site will be available from each server hosting that Web Site (if you use multiple instances).
Serve the files from Blob Storage. Instead of saving the HTML files locally on each instance (or trying to make users stick to a specific instance), simply store them in Blob Storage and redirect your users there.
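A sketch of the Blob Storage route, assuming the azure-storage-blob package; the account, container, and blob names are placeholders:

```python
def report_blob_url(account, container, blob_name):
    """Public URL the user can be redirected to after the upload."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob_name}"


def upload_report_html(connection_string, container, blob_name, html):
    """Upload a generated HTML report so any instance can serve it."""
    # Imported here so the URL helper above works without the SDK installed.
    from azure.storage.blob import BlobServiceClient, ContentSettings

    service = BlobServiceClient.from_connection_string(connection_string)
    blob = service.get_blob_client(container=container, blob=blob_name)
    blob.upload_blob(
        html,
        overwrite=True,
        content_settings=ContentSettings(content_type="text/html"),
    )
```

After the upload, the redirect target no longer depends on which instance generated the report, which is exactly what the load balancer scenario needs.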
Good stuff from @Sandrino. A few more ideas:
Store the resulting HTML in In-Role Cache (which can be co-located in your web role instances) and serve the HTML from cache (shared across all instances).
Take advantage of the CDN. You can map a "CDN" folder to the actual edge cache, so you generate the HTML in code once, and then it's cached until TTL expiry, when you must generate the content again.
I think Azure blob storage is the best place to store your HTML files, since it can be accessed by multiple instances. You can redirect users to the blob content, or write a custom page that renders content from the blob.

Where to put shared files on amazon AWS design for failure architecture?

I have a simple LAMP website on AWS. I want at least 2 instances, that will connect to RDS, and I'll have a load balancer.
The simple question, which I haven't found answered on the web, is: where do you put shared user-uploaded files, like images?
NFS could be one answer: create another instance that shares a folder through NFS, and have the other instances mount it. But that is a single point of failure.
Rsync folders between instances?!
The best answer I found is to use s3fs and mount an S3 bucket on all my instances. But I don't like using things that aren't supported by Amazon.
What is the right solution?
SOLUTION
I tried s3fs but experienced many issues (e.g. with eventual consistency).
So I finally implemented the AWS SDK and used its S3 library to upload directly to S3.
Then I pointed a domain directly at the S3 bucket.
Why not put the files in S3? You don't need something like s3fs; just use the regular S3 APIs to upload the files to S3.
Users can also upload files straight to S3, without going via your server, if you use the POST API.
As Frederick said, S3 is probably best, because the only other option is to put the files on the load balancer and serve them from there. But if you then run multiple load balancers with some kind of round-robin DNS, you have to copy the files everywhere, and it just won't work. And if you are using ELB, you can't do that anyway.
As you say, you are already looking at s3fs, and you are right: why add an extra layer of software?
Likely better is to generate time-limited, signed URLs; or, if the files are truly public, just make that part of the bucket public and serve the files directly with an S3 URL. You can also point a DNS CNAME at S3, so that http://my-bucket.s3.amazonaws.com/path/to/file.png becomes http://files.mycompany.com/path/to/file.png (note that for this to work, the bucket must be named after the domain).

Store Binary files on GAE/J + Google DataStore

I'm building an application on Google App Engine with Java (GAE/J), and all my data will be stored in the Google Datastore. What if I want to save binary files, say images (JPG, PNG, etc.), DOC, TXT, or video files: how do I deal with these? Or what if I want to stream video files (SWF)? Where and how should I store those files so that I don't lose any data when I redeploy my app?
It depends on whether you're talking about static files or dynamic ones. If they're static files created by you, you can upload them, subject to a 10 MB / 3,000-file maximum, but Google doesn't offer a CDN or anything similar.
If they're dynamic, uploaded by your users or created by your application, the datastore supports blob properties: you can put any kind of binary data in there as long as it's less than 1 MB per entity. If the files are larger, you can consider another service like S3 or Mosso's Cloud Files. That can be a better solution for serving files directly to users, because those providers can offer CDN service, but it's not cheap. On the other hand, latency back to GAE will be much higher than storing the data in Google's Datastore, and you'll have to pay for transit on both sides, so it's something to take into account if you're going to be processing the files on App Engine.
Google App Engine Virtual File System (GaeVFS)
