Unprotected Apache server status pages put popular websites at risk

The pages expose potentially sensitive information that hackers can use to better plan their attacks against websites, researchers say

By Lucian Constantin, IDG News Service

This happens because requests from users are received by the Squid proxy first and then passed to the Web server, causing the server to see the requests as coming from the proxy's local IP address.
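
The access rule this defeats is typically the localhost-only restriction many administrators apply to the mod_status page. A rough sketch of such a block is shown below, using Apache 2.2-style directives (Apache 2.4 expresses the same rule with Require instead of Order/Deny/Allow):

    <Location /server-status>
        SetHandler server-status
        Order deny,allow
        Deny from all
        # intended to limit the page to requests made from the server itself
        Allow from 127.0.0.1
    </Location>

With Squid running on the same machine, every visitor's request reaches Apache from 127.0.0.1, so the Allow rule ends up matching anyone who asks the proxy for /server-status.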

When running in this configuration, Squid stores static versions of pages generated by the Web server for a limited period of time and serves them to users, which prevents the server from overloading when dealing with a lot of traffic. Without caching services like Squid, the Apache server would have to regenerate PHP or other dynamic pages for every visitor, which can quickly consume the machine's available resources during a traffic spike.
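
As a rough illustration of that setup, the sketch below shows how Squid is often configured as an accelerator in front of an Apache instance moved to port 8080 on the same machine; the hostname, port and ACL name are placeholders, and exact directives vary between Squid versions:

    # squid.conf: accept traffic on port 80 and forward cache misses to local Apache
    http_port 80 accel defaultsite=www.example.com
    cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=apache
    acl our_site dstdomain www.example.com
    http_access allow our_site
    cache_peer_access apache allow our_site

Because the cache_peer line points at 127.0.0.1, every request Apache receives in this arrangement originates from the loopback address rather than from the visitor's own IP.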

Running Squid and Apache on the same machine is particularly popular with owners of smaller websites, who can't afford to run these services on separate machines, Povey said. "Many, myself included, just have a rented box in a data center somewhere."

Larger companies might run dedicated Web caching servers. However, even in those cases, if the Apache access control rules for /server-status allow access from the whole IP range of the internal network, which includes the caching servers, the same problem would occur.
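
One way to avoid that, sketched below with made-up addresses, is to allow only the specific hosts that actually need the page rather than the entire internal range (again in Apache 2.2-style syntax):

    <Location /server-status>
        SetHandler server-status
        Order deny,allow
        Deny from all
        # allow a single administrative workstation, not the whole 10.0.0.0/8 network
        Allow from 10.0.0.25
    </Location>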

"You'd have to take extra steps to ensure you didn't expose this," Povey said. "Server-status et al [other server info pages like the one generated by mod_info] is something that is easy to overlook."
