Export Thunderbird adaptive junk mail filter data

Is there really no way to export/import the adaptive junk filter training between computers?
I just set up Thunderbird on a new computer, and the junk filtering feels sorely behind my other workstation, where the adaptive junk filter has had quite some time to train. I can't find anything about it in the options or while googling.
Thanks in advance
Some links:
How the junk filtering works (Bayesian filtering)
Junk Mail Controls feature description (old as mold)

You should be able to transfer this data by copying the file training.dat from your old profile to your new one.
Be sure to close both Thunderbird instances and back up your Profile folders before copying files from one profile to another.
See Mozilla Support for how to find your Profile folder, and MozillaZine for a full list of the files in it.
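If you'd rather script the copy than do it by hand, here is a minimal sketch in Python; the two profile paths are hypothetical placeholders (locations vary by OS and profile name), and both Thunderbird instances must be closed first, as noted above.

    # Minimal sketch: copy training.dat from the old profile to the new one.
    # The profile paths are assumptions -- find your real ones via
    # Help > Troubleshooting Information (menu name varies by version).
    import shutil
    from pathlib import Path

    old_profile = Path(r"C:\OldBackup\Profiles\abcd1234.default")  # assumed path
    new_profile = Path(r"C:\Users\me\AppData\Roaming\Thunderbird\Profiles\wxyz5678.default")  # assumed path

    src = old_profile / "training.dat"
    dst = new_profile / "training.dat"

    # Keep a backup of the new profile's existing training data, per the advice above.
    if dst.exists():
        shutil.copy2(dst, new_profile / "training.dat.bak")

    shutil.copy2(src, dst)
    print(f"Copied {src} -> {dst}")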

Related

Database with Image display

OK, this one is annoying me. The company network is locked down to the max: we can't run EXEs, only approved programs deployed via AD.
I can't use PowerApps, as the folders I need to access are internal. I can't use PHP/MySQL or ASP/SQL either, as the admins won't give me space or even consider setting up a dev site.
I looked at Excel, but I have a few GB of images, and that would almost kill any Excel file.
What options, if any, are out there that would let me create a database of all my images and build a searchable user interface that can be shared out to users?
Thanks in advance

Best way to transfer database to new website without access to previous one

I have an old website for which I've lost the dashboard login info, so I don't have access to its database.
Now I'm implementing a new one. What is the best way to extract the previous data (MP3 files, articles, MP4 files) so I can use it in the new site?
I hope there is a way to extract it other than manually.
Thanks
I'd suggest that in future all extraneous matter (images, MP3s, etc.) be held in a separate folder on your server and hyperlinked from where it appears on a page.
That folder can be downloaded for safe-keeping.
In this case, I can only suggest that if the website is still up (since you've simply lost the keys), you could download the whole website as an ordinary user [which wouldn't include the database] with something like wget or HTTrack, and from that extract what items you can to your computer [filtering by extension if need be; see the sketch below].
At the very least you should get an idea of which items were present, even if they don't download.
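For the filtering step, a minimal sketch in Python; the mirror folder name and extension list are assumptions, and the wget flags in the comment are just one common way to create the mirror.

    # Minimal sketch: walk a mirrored site and pull out files by extension.
    # A mirror could be made with e.g.: wget --mirror --convert-links https://example-oldsite.com
    # (the site URL and folder below are hypothetical).
    import shutil
    from pathlib import Path

    mirror = Path("example-oldsite.com")          # folder wget/HTTrack created
    wanted = {".mp3", ".mp4", ".jpg", ".png", ".pdf"}
    out = Path("recovered")
    out.mkdir(exist_ok=True)

    for f in mirror.rglob("*"):
        if f.is_file() and f.suffix.lower() in wanted:
            shutil.copy2(f, out / f.name)         # flattens paths; beware duplicate names
            print(f"kept {f}")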
As for the articles, either the live website or the downloaded copy should display them in a browser, and you could copy the text from there.
None of this applies if the website is no longer online, but I wish you well whatever the outcome.

GreatMaps (GMap.net) Offline Files...To Download?

I'm trying to create a map viewer for an existing C# WinForms application. I've installed and hooked up the GreatMaps (GMap.net) controls no problem and that's all working fine. The quirk in this is that it needs to work offline (as it's an application that gets used by users who aren't always in locations where there's a mobile signal).
GMap's offline cache mode works fine, but we've got to build the cache beforehand. That's doable, but since we need the whole of the UK (especially as we have to go down to street level), it will be a pain.
Does anybody know if there are any existing cache files that we can download and use? I've looked at downloading OSM files, but I haven't got the faintest idea how to use them or convert them into the .gmdb format that GMap seems to use.
Any ideas?
Check out VectorTileRenderer (https://github.com/AliFlux/VectorTileRenderer), which has a demo for GMap.net. All you need is an MBTiles file, which you can download for free from OpenMapTiles.org (https://openmaptiles.com/downloads/europe/british-isles/).
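If it helps to know what you're getting: an MBTiles file is just a SQLite database, so you can inspect a downloaded tileset before wiring it into the renderer. A minimal sketch in Python, assuming a file named british-isles.mbtiles (hypothetical name):

    # Minimal sketch: peek inside an MBTiles file (a plain SQLite database).
    import sqlite3

    conn = sqlite3.connect("british-isles.mbtiles")  # assumed filename

    # Per the MBTiles spec, map metadata is stored as name/value pairs.
    for name, value in conn.execute("SELECT name, value FROM metadata"):
        print(f"{name}: {value}")

    # Tiles live in a "tiles" table keyed by zoom/column/row.
    count, = conn.execute("SELECT COUNT(*) FROM tiles").fetchone()
    print(f"{count} tiles")

    # For OpenMapTiles vector tilesets, tile_data is gzipped protobuf (PBF),
    # which is what VectorTileRenderer rasterizes for display in GMap.net.
    row = conn.execute(
        "SELECT zoom_level, tile_column, tile_row, tile_data FROM tiles LIMIT 1"
    ).fetchone()
    if row:
        z, x, y, data = row
        print(f"tile z={z} x={x} y={y}, {len(data)} bytes")

    conn.close()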

How do I make uploaded content available from many servers

I have this kind of problem. I have created a web application and I am going to run it on a dedicated server. Users will be able to upload photos and other kinds of files. If the number of users increases, I will add another server.
So there will be two, like in the picture below. Since the files are uploaded to my application's root folder, I think the new server won't be able to read those files. How can I store files so that whichever server a user connects to, he will be able to retrieve them? And how are the cheaper computers (the small ones in the ring) connected so that they store files like one big drive with one giant folder, such that whenever I want to increase storage I just add another cheap computer to the ring? What should I search for on the web?
Please pardon my poor English. I asked a similar question before, but nobody answered, so I thought the picture might help. I am willing to learn anything new to solve this problem. My earlier question

Large File Advice

I have to deliver 2-ish GB media files to customers (zipped up) after purchase. Any advice on how to deliver such big files to the general population (translated: novice internet users who will not be savvy enough to use FTP or something).
We can build a download manager for Windows users, but I doubt we'll be able to get one for Mac/Linux. Is there a standard solution I don't know about?
Thanks!
For most users on a high speed internet connection, novice or not, a direct HTTP download link is likely sufficient. Just be sure that your HTTP responses for both HEAD and GET return the Content-Length header so that users get an accurate progress bar for their download.
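A quick way to sanity-check that, as a minimal sketch in Python using the third-party requests library; the download URL is a hypothetical placeholder.

    # Minimal sketch: confirm both HEAD and GET return Content-Length,
    # so browsers can show an accurate progress bar.
    import requests  # pip install requests

    url = "https://downloads.example.com/media-pack.zip"  # hypothetical URL

    head = requests.head(url, allow_redirects=True)
    print("HEAD Content-Length:", head.headers.get("Content-Length"))

    # stream=True reads the headers without pulling the ~2 GB body down.
    get = requests.get(url, stream=True)
    print("GET  Content-Length:", get.headers.get("Content-Length"))
    get.close()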
In my opinion, the only other reasonable option for novices is a download manager. You could of course build your own (possibly using a product like Real Basic to quickly code for all three platforms).
There are a number of companies out there with off-the-shelf "download assistants" as well. You may want to take a look at what companies like Adobe are using for their software downloads.
EDIT: Turns out Adobe uses a custom AIR application for their "download assistant" which is a cross platform option as well.
I'd say offer them as a .torrent file. That way people can continue where they left off and don't have to start over. You can divide the file into a bunch of RARs (.r01-.r20), and that will help with distribution. The bottom line is you don't want people to have to keep starting over; that gets frustrating. A .torrent is viable, especially if you don't want to use FTP.
Windows doesn't have a built-in .torrent handler, but I'm sure Linux does. I'm not sure about OS X either.
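The resume behaviour comes from how torrents are structured: the payload is split into fixed-size pieces, each with its own SHA-1 hash, so a client can verify and keep completed pieces after an interruption. A minimal sketch of the piece-hashing idea in Python; the piece size is an assumption (a typical default).

    # Minimal sketch: compute per-piece SHA-1 hashes the way a .torrent does.
    import hashlib

    PIECE_SIZE = 256 * 1024  # 256 KiB, a common piece size (assumption)

    def piece_hashes(path: str) -> list[bytes]:
        hashes = []
        with open(path, "rb") as f:
            while chunk := f.read(PIECE_SIZE):
                hashes.append(hashlib.sha1(chunk).digest())
        return hashes

    # A client that already holds pieces matching these hashes only fetches
    # the rest -- the "continue where they left off" behaviour described above.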
