Data compression libraries: brotli vs zlib

Which compression format works best for JavaScript files downloaded from a website: brotli or zlib?

I'll assume that by "javascript files" you mean JavaScript source code.
brotli (content encoding "br") is better for non-dynamic content, where you expect it to be compressed once but transmitted and decompressed many times. That is normally the case for JavaScript. The average gain over gzip is about 20% for JavaScript code.
Not all clients accept brotli (so far Firefox, Chrome, and Edge do). If the client doesn't accept it, then the content-encoding negotiation will automatically fall back to gzip (which is what zlib produces).
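The gzip/zlib side of that fallback is easy to demonstrate with Python's standard library (brotli itself needs a third-party package, so this sketch only shows the zlib side; the JavaScript snippet is illustrative, not a real asset):

```python
import zlib

# A small JavaScript snippet standing in for a real source file (illustrative only);
# repeated so the compressor has some redundancy to exploit, as real code does.
js_source = b"""
function add(a, b) { return a + b; }
function mul(a, b) { return a * b; }
console.log(add(2, 3), mul(2, 3));
""" * 50

# Maximum effort mimics what a server does once, up front, for static assets.
compressed = zlib.compress(js_source, level=9)

print(f"original:   {len(js_source)} bytes")
print(f"compressed: {len(compressed)} bytes "
      f"({100 * len(compressed) / len(js_source):.1f}% of original)")

# Decompression must round-trip exactly, as a browser's decoder would.
assert zlib.decompress(compressed) == js_source
```

Since static assets are compressed once and served many times, spending extra CPU at the highest compression level is usually worthwhile.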

Related

Snowflake PUT command, AUTO_COMPRESS vs gzip compressed file performance

Can someone suggest which of the options below will be more performant with the PUT command:
Uploading the file with AUTO_COMPRESS=true, or
uploading an already gzip-compressed file with AUTO_COMPRESS=false?
There's no harm in leaving AUTO_COMPRESS=true, because if a file is already compressed, the PUT command won't try to double-compress it. There is an important caveat, though: if a file is already compressed, it must use a supported compression method. You can get a list of supported methods here: https://docs.snowflake.com/en/sql-reference/sql/put.html
Compressing, whether beforehand or via AUTO_COMPRESS, is advisable, since it reduces network transfer times and bandwidth consumption. It will use CPU and IO on the machine running the PUT. If that machine is maxed out (I've seen some cases of VMs on oversubscribed systems, for example), it's better to compress on another machine before sending the file to the one doing the PUT, because the PUT operation already uses a lot of CPU and IO to encrypt the files prior to upload.
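The "compress beforehand" half of that advice is plain gzip; a minimal Python sketch (the file name is hypothetical, and the actual PUT ... AUTO_COMPRESS=FALSE statement is Snowflake SQL, not shown here):

```python
import gzip
import shutil
from pathlib import Path

def gzip_file(src: Path) -> Path:
    """Gzip `src` to `<name>.gz` so it can be staged with AUTO_COMPRESS=FALSE."""
    dst = src.parent / (src.name + ".gz")
    with open(src, "rb") as fin, gzip.open(dst, "wb") as fout:
        shutil.copyfileobj(fin, fout)  # stream block by block; file never fully in RAM
    return dst

# Hypothetical staging file; in practice this is whatever you PUT to Snowflake.
src = Path("data.csv")
src.write_bytes(b"id,value\n" + b"\n".join(b"%d,%d" % (i, i * i) for i in range(1000)))
gz = gzip_file(src)
print(gz, gz.stat().st_size, "bytes vs", src.stat().st_size, "bytes")
```

Streaming with copyfileobj keeps memory flat, which matters for the large staged files this question is about.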

.bin files used for upgrading embedded devices

I am a bit confused about .bin files. In Linux we normally deal with ELF and .ko files when upgrading a box or copying files onto it. But when upgrading the NAND flash in a router or other products from big networking vendors, why is a .bin file always preferred? Is it some converged mix of all the OS-related files? Is it possible to see the contents of a .bin file, and how can one play with it? Is it something like the contents of a BootROM, and how is it prepared? How do we create and test one, and how does Linux support it? Are there any historical reasons behind this?
Speaking about routers, those files are usually just snapshots of a router's flash memory, probably compressed and with some headers added. Typical contents are a compressed squashfs image or simply a gzip'ed snapshot of memory.
There is no such thing as a .bin format; it's just a custom array of bytes, and every vendor interprets it in some vendor-specific way. Basically this extension means "it's not your business what's in the file, our device/software will handle it". You can try to identify (think: reverse-engineer) what's actually in those files by using the file utility, or by looking at them in a hex editor and trying to guess what's going on.
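The trick behind the file utility is just matching magic numbers at the start of the blob; a tiny Python sketch with a few signatures commonly seen in firmware images (the list is illustrative, not exhaustive):

```python
import gzip

# Magic numbers commonly seen at the start of router firmware blobs.
MAGICS = {
    b"\x1f\x8b": "gzip",
    b"hsqs": "squashfs (little-endian)",
    b"sqsh": "squashfs (big-endian)",
    b"\x7fELF": "ELF",
    b"\x27\x05\x19\x56": "U-Boot uImage",
}

def sniff(data: bytes) -> str:
    """Best-effort guess at what a blob starts with, like the `file` utility."""
    for magic, name in MAGICS.items():
        if data.startswith(magic):
            return name
    return "unknown (vendor-specific header?)"

# A gzip'ed payload is recognized by its two-byte magic.
print(sniff(gzip.compress(b"firmware image contents")))  # gzip
```

Real images often have a vendor header before any recognizable payload, so scanning for these signatures at every offset (as tools like binwalk do) finds more than checking only offset zero.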

Strange .XAP.PNG archives (bing maps)

When I was looking at the www.bing.com/maps html source file I came across some
strange "streetsidePrefetchFileUrls" URLs to .xap.png archives:
For example:
hxxp://c0.ecn.catalogservice.virtualearth.net/cs/dc/pf/Xaps/bcbc3954e568c46cf8c3cc00737da32c_Microsoft.Maps.Framework.xap.png
This file has a PNG Header and contains an IDAT chunk. The IDAT chunk doesn't contain pixel data but a (corrupt) PKZIP archive with Microsoft Maps DLLs inside.
Does anybody know why Microsoft uses PNG as a container for xap Archives?
I've tried to unpack that archive with unzip and 7z. They both detect the ZIP archive inside the PNG but abort unpacking with an error.
Yes, I noticed this; they changed to this kind of protection just before the map app API came out of beta. I presume it is their way of providing some protection for XAP code from third-party developers. We have tried lots of ways to decrypt these without any luck, but the code for it must be in the initial Silverlight code, so with a lot of decompiling you could probably find it.
Many of the Bing Maps apps available on http://bing.com/maps contain proprietary algorithms and code. The PNG encoding is there to help prevent people from decrypting the XAP files and decompiling them. That said, if there is something you are trying to figure out how to do, try asking that question instead.
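For the general case of a ZIP embedded inside another container, carving at the PKZIP local-file-header signature is a standard first attempt; this Python sketch builds a synthetic "zip behind a PNG signature" to demonstrate it (the real Bing files are additionally obfuscated, which is why the tools above fail on them):

```python
import io
import zipfile

def carve_zip(blob: bytes):
    """Look for a PKZIP local-file header inside an arbitrary blob and try to
    open the archive from that offset. Only works if the embedded zip is
    intact, i.e. not further encrypted or obfuscated."""
    offset = blob.find(b"PK\x03\x04")
    if offset == -1:
        return None
    return zipfile.ZipFile(io.BytesIO(blob[offset:]))

# Build a synthetic container: PNG signature + padding + a real zip archive.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("Hello.dll", b"not a real DLL")
blob = b"\x89PNG\r\n\x1a\n" + b"\x00" * 32 + buf.getvalue()

zf = carve_zip(blob)
print(zf.namelist())  # ['Hello.dll']
```

When this fails, as it does on the Bing archives, the payload has been transformed after zipping, and only the consumer (here, the Silverlight loader) knows how to undo it.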

Transfer large files using HTTP/POST

How can I upload (very) large file with HTTP protocol in C (or C++)?
I know it's not the right way to upload huge files, but that's not the point.
I've already seen sources about POST transfers of files in C++, but I noticed that, each time, the WHOLE binary file was included inside the POST body (between "--boundary" markers).
So, before going deeper and deeper into the WebKit/Gecko source code, does somebody know how they do it?
You may use the "Content-Range" header in an HTTP POST request to upload one slice of a huge file at a time. Note that the request syntax is "bytes first-last/total" (the "bytes=" form belongs to the Range request header, not Content-Range):
POST http://[server_ip]:80/upload?fn=xx.dat HTTP/1.1
Content-Range: bytes 0-4095/*
Content-Type: application/octet-stream
Content-Length: 4096
......
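Generating those headers per slice looks like this in Python; note that how the server reassembles the slices is entirely server-specific (HTTP itself doesn't define resumable-POST semantics), so this only sketches the client-side header arithmetic:

```python
def slice_headers(total_size: int, chunk_size: int = 4096):
    """Yield (start, end, headers) for each slice of a `total_size`-byte upload.
    The headers follow the Content-Range request syntax; the upload URL and the
    server's reassembly logic are assumptions outside this sketch."""
    start = 0
    while start < total_size:
        end = min(start + chunk_size, total_size) - 1  # byte ranges are inclusive
        yield start, end, {
            "Content-Range": f"bytes {start}-{end}/{total_size}",
            "Content-Length": str(end - start + 1),
            "Content-Type": "application/octet-stream",
        }
        start = end + 1

for start, end, hdrs in slice_headers(10000):
    print(hdrs["Content-Range"])
# bytes 0-4095/10000
# bytes 4096-8191/10000
# bytes 8192-9999/10000
```

Each slice would then be sent as the body of its own POST, seeking to `start` in the source file, so the whole file never sits in memory at once.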
You can use the chunked transfer coding to send the data in chunks; this lets you send/receive bodies of arbitrary size without knowing the total length up front. Refer to RFC 2616 (section 3.6.1) for how chunking really works.
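The wire format from that RFC section is simple enough to sketch directly: each chunk is its size in hex, CRLF, the data, CRLF, and a zero-size chunk terminates the body. A minimal Python encoder (trailers and chunk extensions omitted):

```python
def chunk_encode(parts):
    """Encode an iterable of byte strings using HTTP/1.1 chunked transfer
    coding (RFC 2616 section 3.6.1): hex size, CRLF, data, CRLF, and a
    final zero-length chunk to mark the end of the body."""
    out = bytearray()
    for part in parts:
        if part:  # a zero-length chunk would terminate the stream early
            out += b"%x\r\n" % len(part) + part + b"\r\n"
    out += b"0\r\n\r\n"
    return bytes(out)

print(chunk_encode([b"Wiki", b"pedia"]))
# b'4\r\nWiki\r\n5\r\npedia\r\n0\r\n\r\n'
```

Because the receiver learns each chunk's length from the size line and stops at the zero chunk, no Content-Length header is needed, which is exactly why this suits streaming a file of unknown or very large size.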

App Engine: Is it possible to disable Transfer-Encoding: Chunked for large static files?

As a follow-up to this question, is it possible to disable the "Transfer-Encoding: Chunked" method for large static files, therefore forcing a Content-Length to be returned instead?
My site serves a few Flash files. The small ones (500-700kb) report a Content-Length fine, but the large one (approx 3MB) doesn't, instead using chunked mode.
Although the file downloads fine, the Flash preloader doesn't work, because it can't tell how long the file is, and therefore what percentage is loaded.
Is my only option to write a dynamic handler to serve the static file?
Thanks.
Transfer-Encoding is in App Engine's list of disallowed HTTP response headers, so modifying it has no effect (see the App Engine documentation on request/response headers).
