By default, when a file is uploaded to PHP (even if the script isn't expecting it), PHP generates a temporary file in /tmp with a name such as php[a-zA-Z0-9]{6}, although I have seen some Docker images where the generated names don't contain digits.
In a local file inclusion, if you manage to include that uploaded file, you will get RCE.
Note that by default PHP only allows uploading 20 files in a single request (set in /etc/php/<version>/apache2/php.ini):

```ini
; Maximum number of files that can be uploaded via a single request
max_file_uploads = 20
```
Also, the number of potential filenames is 62*62*62*62*62*62 = 56800235584
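As a quick sanity check of that keyspace figure:

```python
# Size of the temp-file name keyspace: php[a-zA-Z0-9]{6}
alphabet = 26 + 26 + 10    # a-z, A-Z, 0-9 -> 62 characters
keyspace = alphabet ** 6   # 6 random characters after the "php" prefix
print(keyspace)            # 56800235584
```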
Other techniques rely on attacking PHP protocols (not possible if you only control the last part of the path), disclosing the path of the file, abusing expected files, or making PHP suffer a segmentation fault so the uploaded temporary files aren't deleted.
This technique is very similar to the last one, but without needing to find a zero-day.
In this technique we only need to control a relative path. If we manage to upload files and make the LFI never end, we will have "enough time" to brute-force the uploaded filenames and find any one of them.
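The brute-force half of the idea can be sketched in a few lines. This is a minimal illustration, not a full exploit: the helper name and the example URL/parameter in the comment are hypothetical, and real candidates would be sent through the vulnerable LFI parameter.

```python
import random
import string

# Candidate names follow PHP's default pattern: php[a-zA-Z0-9]{6}
# (some Docker images reportedly omit the digits, per the note above).
ALPHABET = string.ascii_letters + string.digits

def random_tmp_name() -> str:
    """Return one candidate /tmp filename to try through the LFI."""
    return "/tmp/php" + "".join(random.choices(ALPHABET, k=6))

# In the real attack each candidate would be requested through the
# vulnerable parameter, e.g. (hypothetical endpoint):
#   GET /vuln.php?page=../../../../tmp/phpAb3dE9
name = random_tmp_name()
print(name)  # e.g. /tmp/phpa91XkQ
```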
Pros of this technique:
The main problems of this technique are:
So, how can you make a PHP include never end? Just by including the file /sys/kernel/security/apparmor/revision
(not available in Docker containers unfortunately...).
Try it just calling:
```bash
php -a # open php cli
include("/sys/kernel/security/apparmor/revision");
```
By default, Apache supports 150 concurrent connections; following https://ubiq.co/tech-blog/increase-max-connections-apache/ it's possible to raise this number up to 8000. Follow this to use PHP with that module: https://www.digitalocean.com/community/tutorials/how-to-configure-apache-http-with-mpm-event-and-php-fpm-on-ubuntu-18-04.
By default (as far as I can tell from my tests), a PHP process can last forever.
Let's do some maths:
⚠️
Note that in the previous example we are completely DoSing other clients!
If the Apache server is improved and we could abuse 4000 connections (half way to the max number), we could create 3999*20 = 79980
files, and the time would be reduced to around 19.7h or 6.9h (10h, 3.5h, 50% chance).
If instead of the regular PHP module for Apache the web page is using PHP-FPM to run PHP scripts (this improves the efficiency of the web page, so it's common to find it), there is something else that can be done to improve the technique.
PHP-FPM allows configuring the parameter request_terminate_timeout
in /etc/php/<php-version>/fpm/pool.d/www.conf.
This parameter indicates the maximum number of seconds after which a request to PHP must terminate (infinite by default, but 30s if the parameter is uncommented). When a request has been processed by PHP for the indicated number of seconds, it's killed. This means that if the request was uploading temporary files, those files aren't deleted, because the PHP processing was stopped. Therefore, if you can make a request last that long, you can generate thousands of temporary files that won't be deleted, which speeds up the process of finding them and reduces the probability of a DoS against the platform by consuming all connections.
So, to avoid DoS, let's suppose that an attacker will use only 100 connections at the same time and the max PHP processing time enforced by php-fpm (request_terminate_timeout) is 30s. Therefore, the number of temp files that can be generated per second is 100*20/30 = 66.67.
Then, to generate 10000 files an attacker would need: 10000/66.67 = 150s
(to generate 100000 files the time would be 25min).
Then, the attacker could use those 100 connections to perform a brute-force search. Supposing a speed of 300 req/s, the time needed to exploit this is the following:
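The arithmetic above can be sketched as follows (a rough model; the connection count, timeout, and request rate are the assumptions stated in the text, and "expected guesses" is the average number of tries before hitting any one of the planted files):

```python
# Assumptions taken from the text: request_terminate_timeout = 30s,
# 100 parallel connections, 20 files per request, 300 req/s search speed.
connections = 100
files_per_request = 20           # PHP's default max_file_uploads
terminate_timeout = 30           # seconds (php-fpm request_terminate_timeout)

gen_rate = connections * files_per_request / terminate_timeout
print(round(gen_rate, 2))                 # 66.67 temp files per second

print(round(10_000 / gen_rate))           # 150 s to plant 10000 files
print(round(100_000 / gen_rate / 60))     # 25 min to plant 100000 files

# Expected search effort once 100000 files sit in /tmp:
keyspace = 62 ** 6                        # 56800235584 possible names
expected_guesses = keyspace / 100_000     # average tries until a hit
search_rate = 300                         # req/s (assumed above)
print(round(expected_guesses / search_rate / 60, 1))  # ~31.6 minutes
```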
Yes, it's possible to generate 100000 temporary files on a medium-size EC2 instance:
⚠️
Note that in order to trigger the timeout it's enough to include the vulnerable LFI page, so it enters an eternal include loop.
It looks like by default Nginx supports 512 parallel connections at the same time (and this number can be raised).