When probing a website, finding the admin page is usually one of the most important first steps. I will show you a simple trick to find admin pages using a loophole in robots.txt.
Do you know about the robots.txt file?
It is a file that tells search-engine spiders what to crawl and what not to crawl. The Robots Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention that lets a site ask cooperating web crawlers and other web robots not to access all or part of the website.
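For reference, a typical robots.txt looks something like this (the paths here are made up purely for illustration):

```
User-agent: *
Disallow: /search
Disallow: /private/
Disallow: /admin/
```

Each Disallow line names a path the site owner does not want crawled. Note that robots.txt is purely advisory: it does not block access, it only asks well-behaved crawlers to stay away.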
Now let's start:
First select a website, e.g. www.google.com
Then append /robots.txt to the website address
Your URL will now be www.google.com/robots.txt
You will get a text file listing all the pages the admin has disallowed to search engines
If the site admin is careful, he will not allow the admin page to be crawled by search engines. Ironically, that means the admin path is written right there in robots.txt for anyone to read
Now search the file for "admin" and you will often find the admin page path
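The search step can be automated with a few lines of Python. This is just a minimal sketch: `find_admin_paths` is a hypothetical helper name (not from any library), and the sample robots.txt body is made up for illustration. In practice you would fetch the real file first, e.g. with `urllib.request.urlopen("https://example.com/robots.txt")`.

```python
def find_admin_paths(robots_txt: str, keyword: str = "admin"):
    """Return Disallow paths in a robots.txt body that contain keyword."""
    hits = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()      # drop comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()  # text after "Disallow:"
            if keyword in path.lower():
                hits.append(path)
    return hits

# Made-up robots.txt body for demonstration:
sample = """User-agent: *
Disallow: /search
Disallow: /admin/
Disallow: /wp-admin/login.php
"""

print(find_admin_paths(sample))  # ['/admin/', '/wp-admin/login.php']
```

The parser only handles the simple `Disallow:` form shown here; a full robots.txt parser (Python ships one in `urllib.robotparser`) handles user-agent groups and wildcards too.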
[Screenshot: a robots.txt file with the admin page entry marked by an arrow]