
Full Path Disclosure

Severity: Medium
Test name: Full Path Disclosure

Most attacks begin with reconnaissance: analyzing the victim's server and the software it runs. Attackers use various techniques to make the application crash or behave incorrectly, and the resulting error message may contain the full path to the file where the error occurred. In addition, search engines can be used to find such errors by combining special keywords with the victim's domain name.


This vulnerability allows an attacker to get the following information:

  • Operating system type
  • Web root folder
  • Folder structure
  • Third-party libraries in use and their versions
  • Server software and its version
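For example, a single leaked warning can expose several of these details at once (the path and line number here are hypothetical):

```
Warning: mysql_connect(): Access denied for user 'app'@'localhost'
in /var/www/html/includes/db.php on line 14
```

Here the path alone reveals a Linux server with a typical web root (`/var/www/html`), part of the folder structure, and the use of the legacy `mysql_connect` API.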

Example 1: Changing the expected scalar type to an array

  • Expected request:
  • Actual request:[]=about

    An attacker can also change a cookie value in the same way.
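The mechanics can be sketched in Python (a hypothetical handler; the real targets are often PHP applications): code that assumes a string parameter crashes when the attacker turns it into an array, and the resulting traceback discloses the absolute path of the source file.

```python
import traceback

def render_page(page):
    # Hypothetical handler that assumes `page` is a string.
    return "Rendering " + page.lower()

try:
    # Simulates the attacker sending ?page[]=about: the framework
    # now passes a list instead of a string.
    render_page(["about"])
except AttributeError:
    tb = traceback.format_exc()
    # If this traceback is echoed back to the client, it leaks the
    # absolute path of the file where the error occurred.
    print(tb)
```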

Example 2:
"Googling" errors on

Search requests:

  • "mysql_connect"
  • "failed to open stream"
  • "headers already sent"

With detailed information about errors and warnings, an attacker can look for vulnerabilities in any component of the victim's application. In combination with other vulnerabilities (for example, SQL Injection or File Inclusion), it is possible to steal configuration files or even replace them (for example, to point the application at another database).

  • The issue can be found in the source code on the server side.
  • The issue can be found in the server configuration.
Remedy suggestions
  • Disable showing errors to users. The errors should be logged to files or a special logging system. Instead of general error information, a special HTML page can be shown.
<customErrors defaultRedirect="GenericError.htm" mode="On">
  <error statusCode="500" redirect="InternalError.htm"/>
</customErrors>
  • Avoid showing debug information or sensitive logs to users. Consider using log management software to standardize log information.
  • Deny search engines access to URLs that should not be indexed (for example, API endpoints). Here is an example of a "robots.txt" file that blocks all bots for all pages:
User-agent: *
Disallow: /
  • Minimize the use of third-party libraries, and keep the libraries and system components you do use updated to their latest versions.
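The first remedy above can be sketched in Python (a minimal illustration, not tied to any particular framework; `render_page` and the log file name are hypothetical): the full traceback goes to a server-side log, while the client only ever receives a generic message.

```python
import logging
import traceback

# Full error details go to a server-side log, never into the response.
logging.basicConfig(filename="app-errors.log", level=logging.ERROR)

def render_page(page):
    # Hypothetical handler that assumes `page` is a string.
    return "Rendering " + page.lower()

def handle_request(page):
    try:
        return render_page(page)
    except Exception:
        # Log the traceback (including full file paths) server-side only.
        logging.error(traceback.format_exc())
        # The client gets a generic message with no paths or stack details.
        return "An internal error occurred."

print(handle_request(["about"]))  # attacker input: no path is disclosed
```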
Classification
  • CWE-200
  • CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:L/I:N/A:N