Web Sources and Support Files

Additional data may be present in web documents (JS, CSS, HTML).

  • Left behind by developers to aid testing, debugging, and development.

  • These leftovers can reveal too much about system internals.

  • Sometimes developers try to “hide” these areas by listing them in /robots.txt.

    • robots.txt only steers well-behaved search engine crawlers; to an attacker it is a directory of sensitive areas (see the sketch below).
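A minimal sketch of how an attacker turns robots.txt into a target list: fetch the file and print every Disallow path as a candidate URL. The target host is a hypothetical placeholder; only probe systems you are authorized to assess.

```python
import urllib.request

TARGET = "https://example.com"  # hypothetical in-scope host

# Fetch robots.txt and decode it leniently.
with urllib.request.urlopen(f"{TARGET}/robots.txt", timeout=5) as resp:
    body = resp.read().decode("utf-8", errors="replace")

# Each Disallow entry is a path the site owner wants crawlers to skip,
# which makes it a natural first stop for manual inspection.
for line in body.splitlines():
    line = line.strip()
    if line.lower().startswith("disallow:"):
        path = line.split(":", 1)[1].strip()
        if path:
            print(f"{TARGET}{path}")
```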

Impact:

  • Allows fingerprinting of the remote stack (a sketch follows this list).

  • Discloses sensitive information.
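One way such files enable fingerprinting: version-revealing project files (README, changelogs, package manifests) are sometimes deployed alongside the application. A minimal sketch, assuming a hypothetical in-scope host and common file-name conventions:

```python
import urllib.error
import urllib.request

TARGET = "https://example.com"  # hypothetical in-scope host

# File names are common conventions, assumed here, not confirmed targets.
for name in ("README.md", "CHANGELOG.md", "package.json", "composer.json"):
    try:
        with urllib.request.urlopen(f"{TARGET}/{name}", timeout=5) as resp:
            lines = resp.read(200).decode("utf-8", errors="replace").splitlines()
            if lines:
                # The first line often names the project, framework, or version.
                print(f"{name}: {lines[0]}")
    except (urllib.error.HTTPError, urllib.error.URLError):
        pass  # file not exposed; move on
```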

Typical examples (a probing sketch follows the list):

  • Backup files (.bak, .tar.gz, .zip).

  • robots.txt.

  • README and LICENSE files.

  • Log files left publicly accessible.

  • Additional folders with guessable names.
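A minimal sketch of probing for these leftovers with HEAD requests. The host and candidate paths are illustrative guesses; an authorized assessment would use a vetted wordlist instead.

```python
import urllib.error
import urllib.request

TARGET = "https://example.com"       # hypothetical in-scope host
CANDIDATES = [                       # illustrative guesses, not a vetted wordlist
    "backup.tar.gz", "site.zip", "index.php.bak",
    "README", "LICENSE", "error.log", "debug.log",
]

for path in CANDIDATES:
    req = urllib.request.Request(f"{TARGET}/{path}", method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            print(f"[{resp.status}] /{path}")
    except urllib.error.HTTPError as err:
        if err.code != 404:
            print(f"[{err.code}] /{path}")  # a 403 still hints the file exists
    except urllib.error.URLError:
        pass  # host unreachable; skip
```

HEAD requests keep the probe light: the status code alone says whether the file is reachable, without downloading it.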
