Hey, I am Sheeraz.
I am the CEO at Hacking Laymen.
I am a network penetration tester.
I am a web developer and designer.
BCA II year student at BSSS.
“Secrets have a cost, they are not for free.”
― The Amazing Spider-Man
What's Spidering?
Web spiders are among the most powerful and useful tools on the internet, developed for both good and bad intentions. A spider serves one major function: it crawls a website one page at a time, gathering and storing relevant information such as email addresses, meta tags, hidden form data, URL information, and links. This is how search engine crawlers like Google's work. The spider then follows all the links on that page, collecting relevant information from each subsequent page, and so on. Before you know it, the spider has crawled thousands of links and pages, gathering bits of information and storing them in a database. This web of paths is where the term 'spider' is derived from.
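To make that concrete, here is a minimal sketch of such a crawler in Python. It assumes the requests and beautifulsoup4 packages are available; the function and variable names are my own, and a real spider would also respect robots.txt, limit depth, and handle errors more carefully.

import urllib.parse
from collections import deque

import requests
from bs4 import BeautifulSoup

def spider(start_url, max_pages=100):
    """Crawl one site breadth-first, recording every URL and the links found on it."""
    visited = set()
    queue = deque([start_url])
    site_map = {}  # url -> list of links discovered on that page
    host = urllib.parse.urlparse(start_url).netloc

    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)

        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip unreachable pages

        # Gather every link on the page, resolving relative URLs.
        soup = BeautifulSoup(response.text, "html.parser")
        links = [urllib.parse.urljoin(url, a["href"])
                 for a in soup.find_all("a", href=True)]
        site_map[url] = links

        # Follow only links on the same host, one page at a time.
        for link in links:
            if urllib.parse.urlparse(link).netloc == host:
                queue.append(link)

    return site_map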
How Spidering Works
Tools For Spidering
Where Spidering Tools Fail! :(
What a typical spider looks like
User-Directed Spidering
This is a more sophisticated and controlled technique that is usually preferable to automated spidering. Here, the user walks through the application in the normal way using a standard browser, attempting to navigate through all of the application's functionality. As they do so, the resulting traffic is passed through a tool that combines an intercepting proxy and a spider, which monitors all requests and responses. The tool builds a map of the application, incorporating all the URLs visited by the browser. This approach has several advantages (a minimal proxy sketch follows this list):
Where the application uses unusual or complex mechanisms for navigation, the user can follow these using a browser in the normal way. Any functions and content accessed by the user are processed by the proxy/spider tool.

The user controls all data submitted to the application and can ensure that data validation requirements are met.

The user can log in to the application in the usual way and ensure that the authenticated session remains active throughout the mapping process. If any action performed results in session termination, the user can log in again and continue browsing.

Any dangerous functionality, such as deleteUser.jsp, is fully enumerated and incorporated into the proxy's site map, because links to it will be parsed out of the application's responses. But the user can use discretion in deciding which functions to actually request or carry out.
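As a rough illustration of the proxy/spider side, here is a minimal passive site-mapping script for mitmproxy (run with mitmdump -s sitemap.py while browsing through the proxy). It only records what the user's browser actually requests; the output file name is an arbitrary choice.

# sitemap.py - passive site map builder for mitmproxy
from mitmproxy import http

seen = set()

def request(flow: http.HTTPFlow) -> None:
    # Record every unique URL the browser requests while the user
    # walks through the application's functionality.
    url = flow.request.pretty_url
    if url not in seen:
        seen.add(url)
        with open("sitemap.txt", "a") as f:
            f.write(f"{flow.request.method} {url}\n")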
Discovering Hidden Content
Even careful spidering can miss content that is not linked from within the application. Hidden content worth hunting for includes the following (a backup-file probe sketch follows this list):

Backup copies of live files. In the case of dynamic pages, their file extension may have changed to one that is not mapped as executable, enabling you to review the page source for vulnerabilities that can then be exploited on the live page.

Backup archives that contain a full snapshot of files within (or indeed outside) the web root, possibly enabling you to easily identify all content and functionality within the application.

New functionality that has been deployed to the server for testing but not yet linked from the main application.

Default application functionality in an off-the-shelf application that has been superficially hidden from the user but is still present on the server.

Old versions of files that have not been removed from the server. In the case of dynamic pages, these may contain vulnerabilities that have been fixed in the current version but can still be exploited in the old version.

Configuration and include files containing sensitive data such as database credentials.

Source files from which the live application's functionality has been compiled.

Comments in source code that in extreme cases may contain information such as usernames and passwords, but that more likely provide information about the state of the application. Key phrases such as "test this function" or something similar are strong indicators of where to start hunting for vulnerabilities.

Log files that may contain sensitive information such as valid usernames, session tokens, URLs visited, and actions performed.
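Here is a hedged sketch of probing for backup or old copies of a known live page, assuming the requests package; the suffix list is illustrative, not exhaustive.

import requests

# Illustrative suffixes under which backup or old copies often survive.
SUFFIXES = [".bak", ".old", ".orig", ".txt", "~", ".save", ".zip"]

def probe_backups(live_url):
    """Given a known live page, request likely backup-copy names."""
    for suffix in SUFFIXES:
        candidate = live_url + suffix
        try:
            r = requests.get(candidate, timeout=10)
        except requests.RequestException:
            continue
        if r.status_code == 200:
            print("possible backup:", candidate)

# Example (hypothetical page): probe_backups("http://eis/shop/checkout.jsp")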
Naming conventions are another clue. Once a few resources in a directory have been identified, the names of others can often be guessed from the pattern. For example, seeing ForgotPassword and ResetPassword under /auth/ suggests probing for the rest of the family:

http://eis/auth/AddPassword
http://eis/auth/ForgotPassword
http://eis/auth/GetPassword
http://eis/auth/ResetPassword
http://eis/auth/RetrievePassword
http://eis/auth/UpdatePassword
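A minimal sketch of that guessing step in Python, using the eis host from the example above; the verb list is an assumption, and in practice you would use a larger wordlist and compare response sizes as well as status codes.

import requests

# Hypothetical verbs inferred from names already seen in the site map.
VERBS = ["Add", "Forgot", "Get", "Reset", "Retrieve", "Update",
         "Change", "Delete", "New", "Send", "Set", "Validate"]

def guess_resources(base="http://eis/auth", noun="Password"):
    """Probe naming-pattern variations and report which ones exist."""
    found = []
    for verb in VERBS:
        url = f"{base}/{verb}{noun}"
        try:
            r = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        # Anything other than 404 is worth a closer look.
        if r.status_code != 404:
            found.append((r.status_code, url))
    return found

for status, url in guess_resources():
    print(status, url)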
Vulnerabilities may exist at the web server layer that enable you to discover content and functionality that are not linked within the web application itself. For example, bugs in web server software can allow an attacker to list the contents of directories or obtain the raw source of dynamic server-executable pages.
Furthermore, many web applications incorporate common third-party components for standard functionality, such as shopping carts, discussion forums, or content management system (CMS) functions. These are often installed to a fixed location relative to the web root or to the application’s starting directory.
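Because such components sit at predictable locations, it is cheap to check for them directly. A minimal sketch, where the path list is purely illustrative (a real test would use a curated wordlist for the platform in question):

import requests

# Illustrative paths where common third-party components are often installed.
COMMON_PATHS = ["/phpmyadmin/", "/wp-login.php", "/admin/",
                "/forum/", "/cart/", "/cms/"]

def probe_components(base_url):
    """Request each well-known component path and note non-404 responses."""
    for path in COMMON_PATHS:
        try:
            r = requests.get(base_url.rstrip("/") + path, timeout=10,
                             allow_redirects=False)
        except requests.RequestException:
            continue
        if r.status_code != 404:
            print(r.status_code, path)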
Also look for hidden parameters that the application processes but that no link or form ever sets. A classic example is a debug flag appended to a request:

DEBUG=TRUE
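A minimal sketch of hunting for such flags, assuming the requests package; the candidate names and values are guesses, and a real test would diff the response bodies rather than just compare status codes and lengths.

import requests

NAMES = ["debug", "test", "hide", "source"]
VALUES = ["true", "yes", "on", "1"]

def probe_hidden_params(url):
    """Resend a request with candidate hidden parameters and flag any
    response that differs from the baseline."""
    baseline = requests.get(url, timeout=10)
    for name in NAMES:
        for value in VALUES:
            r = requests.get(url, params={name: value}, timeout=10)
            if (r.status_code != baseline.status_code
                    or len(r.text) != len(baseline.text)):
                print(f"{name}={value} changed the response "
                      f"({baseline.status_code} -> {r.status_code})")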
The majority of ways in which the application captures user input for server-side processing should be obvious when reviewing the HTTP requests that are generated as you walk through the application's functionality. The key entry points to look for are the URL and query string, POST body parameters, cookies, and any other HTTP headers the application may process.
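As a small illustration, this sketch pulls those entry points out of one captured request using only the standard library; the sample request itself is made up.

from urllib.parse import urlparse, parse_qs

# A made-up captured request, represented by its parts.
url = "http://eis/shop/search?category=electronics&sort=price"
headers = {
    "Cookie": "SessionId=abc123; Lang=en",
    "User-Agent": "Mozilla/5.0",
    "Referer": "http://eis/shop/",
}
body = "query=iphone&inStock=true"

# Each of these is a distinct place where user input reaches the server.
print("query params:", parse_qs(urlparse(url).query))
print("body params: ", parse_qs(body))
print("cookies:     ", dict(c.strip().split("=", 1)
                            for c in headers["Cookie"].split(";")))
print("headers:     ", {k: v for k, v in headers.items() if k != "Cookie"})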
Keep in mind that input does not always arrive as a named parameter. In REST-style URLs, parts of the URL path itself act as parameters, as in:

http://eis/shop/browse/electronics/iPhone3G/

Here, 'electronics' and 'iPhone3G' are effectively parameter values and should be treated as entry points for testing.
“No matter how buried it gets, or lost you feel, you must promise me that you will hold on to hope and keep it alive. We have to be greater than what we suffer. My wish for you is to become hope. People need that.”
― Gwen Stacy, The Amazing Spider-Man 2
Thank You ― Sheeraz Ali