Full Path Disclosure

Severity: Medium
Test name: Full Path Disclosure

Every attack begins with thorough reconnaissance of the target server and its software. Attackers use various tactics to induce crashes or erratic behavior in the victim's application, and the resulting error messages can inadvertently reveal the full file path where the error occurred. Attackers can also use search engines, combining specific keywords with the victim's domain name, to discover indexed error messages and gather further intelligence on potential vulnerabilities.

This vulnerability allows an attacker to obtain the following information:

  • Operating system type
  • Web root folder
  • Folder structure
  • Third-party libraries in use and their versions
  • Server software and its version

Example 1: Changing an expected scalar parameter to an array

  • Expected request: https://your_web-site.com?page=about
  • Actual request: https://your_web-site.com?page[]=about

    An attacker can also tamper with cookie values for your_web-site.com in the same way
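As a sketch of why Example 1 leaks a path: many frameworks parse `page[]=about` into an array/list instead of a string, and the resulting unhandled type error produces a traceback that contains the absolute path of the script. The handler below is hypothetical and written in plain Python rather than any specific web framework:

```python
# Minimal sketch (hypothetical handler) of a scalar-to-array type confusion.
import traceback

def render_page(page):
    # The handler assumes "page" is a string, as in ?page=about.
    return "templates/" + page + ".html"

# A request like ?page[]=about delivers a list instead of a string.
try:
    render_page(["about"])
except TypeError:
    tb = traceback.format_exc()
    # If errors are echoed to the client, the traceback exposes the
    # absolute path of the script, e.g. File "/var/www/app/handler.py".
    print(tb)
```

When such a traceback is rendered to the client (debug mode, default PHP `display_errors`, etc.), the attacker learns the web root and folder structure without any further effort.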

Example 2: "Googling" errors on your_web-site.com

Search requests:

  • "mysql_connect" site:your_web-site.com
  • "failed to open stream" site:your_web-site.com
  • "headers already sent" site:your_web-site.com

With detailed information about errors/warnings, an attacker can find vulnerabilities in any component of the victim’s application. In combination with other vulnerabilities (for example, SQL Injection, File Inclusion), it's possible to steal configuration files or even replace them (for example, to use another database).

  • The issue can be found in the source code on the server side.
  • The issue can be found in the server configuration.

Remedy suggestions
  • Disable showing errors to users. Errors should be logged to files or a dedicated logging system; instead of detailed error information, a generic HTML page can be shown. For example, in an ASP.NET "web.config":
<customErrors defaultRedirect="GenericError.htm" mode="On">
  <error statusCode="500" redirect="InternalError.htm"/>
</customErrors>
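The same remedy can be sketched with Python's standard `logging` module (the file name, error message, and path below are illustrative assumptions, not part of the original report): the full traceback goes to a server-side log, while the client receives only a generic page.

```python
# Sketch: log full error details server-side, show the user a generic page.
import logging

# Hypothetical log destination; in production, use your logging system.
logging.basicConfig(filename="app-errors.log", level=logging.ERROR, force=True)

def handle_request():
    # Hypothetical failure that embeds a sensitive absolute path.
    raise FileNotFoundError("/var/www/html/includes/config.php not found")

def safe_handle_request():
    try:
        return handle_request()
    except Exception:
        # The full traceback (including file paths) goes to the log only.
        logging.exception("Unhandled error while serving request")
        # The client sees a generic page with no path information.
        return "<html><body>Something went wrong.</body></html>"

body = safe_handle_request()
```

The key property is that nothing path-related ever reaches the response body; the traceback exists only in the server-side log.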
  • Avoid showing users debug information or sensitive logs. Consider using dedicated log management software to standardize log output.
  • Deny search engines access to URLs that should not be indexed (for example, API endpoints). Note that "robots.txt" is advisory only and does not prevent direct access. Here is an example "robots.txt" file that blocks all bots for all pages:
User-agent: *
Disallow: /
  • Minimize the use of third-party libraries, and keep the libraries and system components you do use up to date.
Classification
  • CWE-200
  • CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:L/I:N/A:N