Ethical Hacking Week 3: Utilizing Search Engines

Search engines seem like innocent, everyday tools that emerged as the web took the world by storm. However, did you know that search engines such as Google, Bing, DuckDuckGo, and many more can be used to reveal sensitive information? First, let's look at how these search engines work. A SERP, otherwise known as a Search Engine Results Page, presents you with the most relevant organic results for your search query, not including paid advertising. These result pages are determined by Crawling, Indexing, and Ranking; however, we will only cover Crawling.

Crawling – the discovery process in which the search engine sends out a team of robots, known as crawlers, to find new and updated content. Content can vary: a web page, an image, a video, a PDF, etc. Whatever the format, content is discovered through links. The bots start out by fetching a few web pages and then follow the links on those pages to find new URLs. By link hopping, the crawler is able to find new content and add it to Google's index, called Caffeine. A robots.txt file sits in the root directory of a website and suggests which parts of the site search engines should and shouldn't crawl. Google also lets us narrow what we pull out of its index with advanced operators; the use of these operators with the Google search engine is referred to as Google Dorking.
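
To see what this crawler guidance looks like, you can fetch a site's robots.txt yourself. A minimal sketch, with example.com standing in for a real domain and made-up paths:

$ curl -s https://example.com/robots.txt
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /public/
Sitemap: https://example.com/sitemap.xml

Keep in mind the Disallow lines are only suggestions to well-behaved crawlers; nothing actually stops a person (or a less polite bot) from visiting those paths directly, which is part of what makes robots.txt interesting to a hacker.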

Google Dorking – a hacker technique that uses Google Search and other Google applications to find security holes in the configuration and code that websites are running. Google hacking involves using advanced operators in the Google search engine to locate specific strings of text within search results. Here are a couple of examples of Google Dorking our professor had us do as an exercise (rough versions of the queries follow the list).

-A dork that gives sites with admin login pages
-A dork that gives the WordPress uploads of a website
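
I won't pretend these are the exact queries from class, but typical dorks for those two exercises look something like this (targetsite.com is just a placeholder):

A dork for admin login pages:
intitle:"admin login" inurl:admin

A dork for a site's WordPress uploads:
site:targetsite.com inurl:wp-content/uploads

Here intitle: matches text in the page title, inurl: matches text in the URL, and site: restricts results to a single domain.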

Summary – as you can see, Google Dorking and utilizing search engines can be very rewarding for finding information on the target and even finding points of interest that suggest which routes to attack. We can go a step further in utilizing search engines with Shodan.

Shodan – essentially a search engine that crawls the Internet itself, whereas Google and Bing crawl the World Wide Web. In essence, Shodan gives you information about devices that are connected to the Internet, and these devices can vary tremendously, from small desktops to entire computer labs and more. Shodan collects its information from banners; that is, it banner grabs the metadata about the software running on a device. This can be server software information, supported service options, etc. Using Shodan can reveal servers, open ports, locations, services, and even vulnerabilities.
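
To get a feel for what a banner actually is, you can grab one by hand. A rough sketch with netcat against a placeholder host (the request is real HTTP, but the response shown here is made up for illustration):

$ nc example.com 80
HEAD / HTTP/1.0

HTTP/1.1 200 OK
Server: Apache/2.4.41 (Ubuntu)

That Server line is exactly the kind of metadata Shodan indexes at Internet scale. Inside Shodan you can then combine free text with filters, for example a query along the lines of:

apache port:443 country:"US"

which lists Apache-flavored hosts answering on port 443 in the US.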

Ethical Hacking Week 2: Target Scoping

Target Scoping – a process for gathering the target's assessment requirements and characterizing each of its parameters in order to generate a test plan, limitations, business objectives, and a time schedule. To give an example of what the end result of Target Scoping looks like, here are the parameters:

-Company Name
-Address
-Website
-E-mails and Phone Numbers
-Penetration Testing Objectives and Penetration Testing Type
-Devices to be Tested: Servers, Workstations, Network Devices, etc.
-Operating Systems Supported

Target Scoping can be done with enumeration tools that come pre-installed on our Kali Linux machines. Such tools are listed below, with a quick sketch of how each might be run after the list.

-whois command: provides when the domain was created, its expiration date, its status, its name servers, the potential location of the company, and the phone number and email of the sponsoring registrar.

-nslookup command: provides the potential IP address(es) of the target and a hint at how many web servers are accepting requests.

-dig command: provides roughly the same information as nslookup, but it doesn't hurt to try both.

-whatweb command: provides the country, HTTP server, IP address, web servers, technologies used, and potential operating system.

-theHarvester command: this tool queries search engines with your target in mind and provides emails and subdomains.
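
As promised, here is a rough sketch of how each of these might be run, with example.com standing in for the real target (exact flags can differ between tool versions, and on some Kali builds the harvester binary is lowercase theharvester):

$ whois example.com          # registration dates, status, name servers, registrar contact
$ nslookup example.com       # resolve the domain to its IP address(es)
$ dig example.com            # DNS records, much like nslookup
$ whatweb example.com        # HTTP server, technologies in use, potential OS
$ theHarvester -d example.com -b google   # emails and subdomains pulled from Google

The output of these maps nicely onto the Target Scoping parameters listed earlier.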

All these tools, together with the strategy of Target Scoping, can give us insight into how the target's network topology is arranged. But what is network topology?


Network Topology – the arrangement of the links, nodes, and other elements of a communication network. The network topology is the structure of a network and may be depicted physically or logically. The physical topology is the placement of the various components of the network, while the logical topology illustrates how data flows within it. There is also something called the OSI model, which divides a network's communication functions into layers.

OSI Model – a conceptual model that characterizes and standardizes the communication functions of a telecommunication or computing system without regard to its underlying structure and technology. The goal of the OSI model is interoperability. It partitions the flow of data in a communication system into seven abstraction layers, from lowest to highest: Physical, Data Link, Network, Transport, Session, Presentation, and Application. The model spans everything from the physical transmission of bits across a communications medium up to the highest-level representation of a distributed application. Each intermediate layer serves a class of functionality to the layer above it and is served by the layer below it.