How to remove the default 40MB upload limit in Joomla 3.0?

I am new to Joomla and installed it with XAMPP. Because the default upload file size limit in Joomla is 40MB, larger files cannot be uploaded. How can I solve this? Is there a way to remove the 40MB default upload limit in Joomla?

Joomla does not restrict file upload size for extension installation. You need to increase the upload_max_filesize and post_max_size PHP variables.
If you are using cPanel, you can do this via cPanel -> Select PHP Version -> Options. Otherwise you may be able to edit a php.ini file in the root folder of your website, e.g. /public_html/php.ini, as follows:
upload_max_filesize = 128M
post_max_size = 128M
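If you are running Joomla locally under XAMPP, as in the question, the loaded php.ini is usually the one in XAMPP's php folder rather than in your site root (this is an assumption based on a default Windows install); raise the same two directives there and restart Apache from the XAMPP control panel:
; hypothetical default XAMPP location: C:\xampp\php\php.ini
upload_max_filesize = 128M
post_max_size = 128M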

Related

How to upload folders to Google Colab?

I want to run a notebook that uses many header files defined in a directory. Basically, I want to upload the entire directory to Google Colab so that I can run the notebook, but I can only find options to upload individual files, not complete folders. Can someone tell me how to upload an entire directory to Google Colab?
I suggest not uploading them directly to Colab, since you will lose them whenever the runtime restarts (you would only need to re-upload them, but that can be an issue on slow connections).
I suggest using the google.colab package to manage files and folders in Colab. Just upload everything you need to your Google Drive, then mount it:
from google.colab import drive
drive.mount('/content/gdrive')
This way, you just need to log in to your Google account through the Google authentication flow, and you can use the files/folders as if they had been uploaded to Colab.
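For example, once the drive is mounted you can read its contents directly (a minimal sketch; the my_project folder and defs.h file are hypothetical placeholders for whatever lives on your Drive):
import os

# list what is inside a folder stored on your Drive
project_dir = '/content/gdrive/MyDrive/my_project'
print(os.listdir(project_dir))

# read a file from it like any local file
with open(os.path.join(project_dir, 'defs.h')) as f:
    print(f.read()[:200])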
EDIT May 2022:
As pointed out in the comments, using Google Drive as storage for a large number of files to train a model is painfully slow, as described here: Google Colab is very slow compared to my PC. The better solution in this case is to zip the files, upload them to colab and then unzip them using
!unzip file.zip
More unzip options here: https://linux.die.net/man/1/unzip
You can zip them, upload the archive, and then unzip it:
!unzip file.zip
The easiest way to do this, if the folder/file is on your local drive:
Compress the folder into a ZIP file.
Upload the zipped file to Colab using the upload button in the Files section. Yes, there is a Files section; see the left side of the Colab screen.
Use the snippet below to extract the file. Note: the file path is the one shown in Colab's Files section.
from zipfile import ZipFile
file_name = file_path  # the path copied from Colab's Files section, e.g. '/content/my_folder.zip'
with ZipFile(file_name, 'r') as zip:
    zip.extractall()
print('Done')
Click Refresh in Colab's Files section.
Access the files in your folder through their file paths.
Downside: the files will be deleted when the runtime ends.
You can reuse part of these steps if your file is on Google Drive; in that case, just upload the zipped file to Colab from Google Drive.
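A quick way to confirm the extraction worked (a minimal sketch; my_folder stands in for whatever folder your ZIP contained):
import os

extracted_dir = '/content/my_folder'   # hypothetical: the folder name inside your ZIP
print(os.path.isdir(extracted_dir))    # True once extractall() has finished
print(len(os.listdir(extracted_dir)), 'entries extracted')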
You can create a Git repository, push the files and folders to it,
and then clone the repository in Colab with the command
!git clone https://github.com/{username}/{projectname}.git
I feel this method is faster.
However, if a file is larger than 100 MB, you will have to zip it or use Git LFS (Large File Storage) to push it to GitHub.
For more information, refer to the link below:
https://help.github.com/en/github/managing-large-files/configuring-git-large-file-storage
The best way to approach this problem is simple, yet sometimes a bit tricky.
You first need to compress the folder into a ZIP file and upload it to your Google Drive.
While doing so, make sure the folder is in the root directory of the drive and not in any subfolder! If the compressed folder/data is in a subfolder, you can easily move it into the root directory.
A compressed folder sitting in another subfolder often messes up the unzipping process when you specify the file location.
Once you have done the aforementioned tasks, enter the following commands in Colab to mount your drive:
from google.colab import drive
drive.mount('/content/gdrive')
This will ask for an access token, which can be generated by clicking on the URL displayed in the output of the same cell.
!ls gdrive/MyDrive
Check the contents of the drive by executing the above command and ensure that your folder/data is displayed in the output.
!unzip gdrive/MyDrive/<File_name_without_space>.zip
e.g.:
!unzip gdrive/MyDrive/data_folder.zip
Executing this will unzip your folder into the Colab runtime's local storage.
Congrats! You have successfully got your folder/data into Colab.
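If you prefer to do the unzip step in Python rather than with the !unzip shell command, a sketch using the same paths as above would be:
from zipfile import ZipFile

# same archive as in the !unzip example, read from the mounted drive
with ZipFile('/content/gdrive/MyDrive/data_folder.zip', 'r') as zf:
    # extract into the runtime's local storage rather than back onto Drive
    zf.extractall('/content/data_folder')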
Zip your files with zip -r file.zip your_folder and then:
from google.colab import files
from zipfile import ZipFile

uploaded = files.upload()       # opens a file picker; the chosen files are saved to the working directory
for file_name in uploaded:      # upload() returns a dict keyed by filename
    with ZipFile(file_name, 'r') as zip:
        zip.extractall()
print('Done')
So here's what you can do:
- Upload the desired dataset folder to your Drive.
- In Colab, mount the drive; the snippet
from google.colab import drive
drive.mount('/content/gdrive')
shows up automatically and you just need to run it.
- Then check for your folder in the Files section on the left-hand side (if the folder is not visible, try refreshing; there should also be a drop-down arrow next to it where you can see all the files under the folder).
- Left-click the folder, which gives you a COPY PATH option.
- Paste the copied path wherever you need it in your Colab code.
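For instance, to use a copied path (a minimal sketch; the CSV path below is hypothetical and stands in for whatever path you copied, and pandas is assumed to be available, as it is by default in Colab):
import pandas as pd

# replace with the path copied from the Files pane
data_path = '/content/gdrive/MyDrive/dataset/train.csv'
df = pd.read_csv(data_path)
print(df.shape)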

Spacy on AppEngine standard

I'm trying to use Spacy on the new AppEngine Standard Python 3.7 runtime.
When I try to deploy I get:
ERROR: (gcloud.app.deploy) Cannot upload file
[/my/project/path/venv/lib/python3.7/site-packages/spacy/lang/tr/lemmatizer.py],
which has size [41523943] (greater than maximum allowed size of
[33554432]). Please delete the file or add to the skip_files entry in
your application .yaml file and try again.
A few oddities:
The docs seem to indicate that I don't need to upload the virtual environment and it will be created from requirements.txt
Looking at the log file, it seems to ignore .pyc files, but not the venv directory
The error message says to add to the skip_files entry in your application .yaml file and try again, but the docs say the python3.7 runtime doesn't use skip_files and to use a .gcloudignore file instead; however, adding venv/ or venv/* doesn't work (it appears to be ignored)
To fix this, I needed to update gcloud and reauthenticate:
gcloud components update
gcloud auth login
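For reference, once gcloud is up to date, a .gcloudignore that keeps the virtual environment out of the upload would look something like the sketch below (standard gitignore-style syntax; venv/ is an assumption about what your environment folder is called):
# .gcloudignore
.gcloudignore
.git
.gitignore
# don't upload the local virtual environment; dependencies are installed from requirements.txt
venv/
__pycache__/
*.pyc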

Openshift: view file system

I'm new to OpenShift.
Here is my problem:
I've deployed a WAR using Git and it works fine. I have a servlet that uploads images, and everything is okay.
What I don't understand is where I should go to see the file system structure.
For example, with Tomcat in Eclipse I can see the uploaded file in the file system.
Is there a way in OpenShift to see my file system and thus my uploaded files?
Also, if I deploy a WAR, can I modify it with some kind of console in OpenShift?
Thanks for answering.
The right way to do this is using the rhc client.
These are the steps to follow:
1) Install the Ruby installer.
2) Install Git.
3) Install rhc.
4) Run rhc setup.
You will then be asked to input your credentials. Once logged in, you will obtain an OAuth token, and a new public key is uploaded to OpenShift.
Then type:
rhc ssh -a app_name
and you will get a shell on your application. From there, typing ls will show you the entire file system.
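As a rough sketch (assuming OpenShift 2, where the rhc client applies), uploaded and other persistent files usually live under the gear's data directory:
# inside the SSH session opened by rhc ssh -a app_name
cd $OPENSHIFT_DATA_DIR   # the gear's persistent data directory (OpenShift 2 environment variable)
ls -la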

upload_max_filesize set to 4M in phpinfo(), can't change it!

I've got this in my php.ini
upload_max_filesize = 25M
post_max_size = 25M
and I've got this in my .htaccess
php_value upload_max_filesize 26214400
php_value post_max_size 26214400
as shorthand can only be used in the php.ini (http://www.php.net/manual/en/faq.using.php#faq.using.shorthandbytes)
but no matter what I do, when I call phpinfo() I get
Directive Local Value Master Value
upload_max_filesize 4M 25M
I've looked at all the other php.ini files, my .htaccess, and ini_set(); everything I could think of, and nothing will change it from 4M. Any help would be great!
EDIT: restarting Apache didn't work. I've checked my httpd.conf; it seems like it's set in an external file or something. Any other places to check?
That doesn't look like the proper syntax for a php.ini file. Try:
upload_max_filesize = 25M
post_max_size = 25M
Then, make sure to restart Apache.
If you still have troubles, refer to:
http://www.cyberciti.biz/faq/linux-unix-apache-increase-php-upload-limit/
I had the same issue, but restarting php-fpm helped.
Restarting just apache2 didn't help, but this command:
sudo systemctl restart php7.3-fpm
apparently cleared some cache and picked up the newly modified php.ini.
If you're using OSX, make sure your php.ini file is not the default one in /etc (php.ini.default); check with the command php --ini.
Sample output:
Configuration File (php.ini) Path: /etc
Loaded Configuration File: /etc/php.ini
Scan for additional .ini files in: /Library/Server/Web/Config/php
Additional .ini files parsed: (none)
If the Loaded Configuration File line is empty, make a copy of the default php.ini and the changes will take effect.
Also, you'll have to check the memory_limit directive.
If it's lower than upload_max_filesize or post_max_size, then their values will be limited to the memory_limit value.
In any case, post_max_size should be slightly larger than upload_max_filesize (any posted values other than the file itself need some memory as well).
On modern systems, memory_limit should be at least 64M; 128M or more is recommended.
If you are running PHP as FastCGI, then you need to restart the process. I believe that restarting Apache should do the trick.
The problem was that the value was actually being set in another .ini file that, at first glance, didn't appear to be related to anything!
Note to self: always use "grep" to search within files when things are fishy!
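For example, a quick way to find every loaded ini file that touches the directive (a sketch; adjust the /etc/php* path to wherever your distribution keeps its PHP configuration):
# search all PHP configuration, including conf.d snippets, for the directive
grep -Rn --include='*.ini' 'upload_max_filesize' /etc/php*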
If you are running a local server such as WAMP or XAMPP, make sure it's using the php.ini you think it is. These servers usually default to a php.ini that's not in your html docs folder.
If you are using WAMP, then make the changes in the proper php.ini that Apache uses; it is normally found in C:\wamp\bin\apache\<apache version>\bin.
Change the two parameters post_max_size and upload_max_filesize (usually found around lines 734 and 886) to 50M (to upload files up to 50 MB).
Don't forget to restart Apache.
I could solve this problem via a workaround: I created a .user.ini file in the document root of my web app and overrode the values there.
I was facing the same issue.
Check the Loaded Configuration File path in phpinfo().
Then open that file, change the variables you want, and restart your Apache server.
Make sure that you're editing the php.ini in the conf/<server version> folder.
Make sure that your computer is not hiding a .txt extension on the end of the php.ini file.
Use the terminal, or right-click > Get Info (or Properties), to check.

xap file download location

When I connect to a website that uses Silverlight, my understanding is that the XAP file is downloaded to the C:\Users\<UserName>\AppData\Local\Temp folder (under Windows Vista).
There are a few sites I know of that use XAP files, but I don't see a XAP file in this folder. Any ideas?
I think I found the answer: it just gets downloaded to the logged-in user's Temporary Internet Files folder.
