# CEH Notes
<h1>Footprinting</h1>
<h2>
Objectives</h2>
To build a hacking strategy, attackers need to gather information about the target organization's network. They then use such information to locate the easiest way to break through the organization's security perimeter. Footprinting methodology makes it easy to gather information about the target organization; this plays a vital role in the hacking process.
<u>It helps to</u>:
- <b>Know Security Posture</b>: Performing footprinting on the target organization gives the complete profile of the organization's security posture. Hackers can then analyze the report to identify loopholes in the security posture of the target organization and build a hacking plan accordingly.
- <b>Reduce Focus Area</b>: By using a combination of tools and techniques, attackers can take an unknown entity (for example, XYZ organization) and reduce it to a specific range of domain names, network blocks, and individual IP addresses of systems directly connected to the internet, as well as many other details pertaining to its security posture.
- <b>Identify vulnerabilities</b>: A detailed footprint provides maximum information about the target organization. It allows the attacker to identify vulnerabilities in the target systems to select appropriate exploits. Attackers can build their own information database about security weaknesses of the target organization. Such a database can then help in identifying the weakest link in the organization's security perimeter.
- <b>Draw Network Map</b>: Combining footprinting techniques with tools such as Tracert allows the attacker to create diagrammatic representations of the target organization's network presence. Specifically, it allows attackers to draw a map or outline of the target organization's network infrastructure to know about the actual environment that they are going to break into. A network map depicts the attacker's understanding of the target's internet footprint. These network diagrams can guide the attacker in performing an attack.
<h3>
Footprinting Threats</h3>
<b><i>Social Engineering</i></b>: Without using any intrusion methods, hackers directly and indirectly collect information through persuasion and other means. Hackers gather crucial information from willing employees who are unaware of the hackers' intent.
<b><i>System and Network Attacks</i></b>: Footprinting enables an attacker to perform system and network attacks. Thus, attackers can gather information related to the target organization's system configuration, the operating system running on the machine, and so on. Using this information, attackers can find vulnerabilities. They can then take control of a target system or the entire network.
<b><i>Information Leakage</i></b>: Information leakage poses a threat to any organization. If sensitive information of an entity falls into the hands of attackers, they can mount an attack based on the information or alternatively use it for monetary benefit.
<b><i>Privacy Loss</i></b>: Through footprinting, hackers can access the systems and networks of the organization and even escalate privileges up to admin levels, resulting in the loss of privacy for the organization as a whole and for its individual personnel.
<b><i>Corporate Espionage</i></b>: Corporate espionage is a central threat to organizations, as competitors often attempt to secure sensitive data through footprinting. Through this approach, competitors can launch similar products in the market, alter prices, and generally undermine the market position of a target organization.
<b><i>Business Loss</i></b>: Footprinting can have a major effect on organizations such as online businesses and other e-commerce websites, as well as banking and finance-related businesses. Billions of dollars are lost every year due to malicious attacks by hackers.
<h3>
Passive Footprinting</h3>
Involves gathering information using external resources.
- <i>Finding information through search engines.</i>
- <i>Finding Top-Level Domains (TLDs) and subdomains of a target through web services.</i>
- <i>Collecting location information on the target through web services.</i>
- <i>Performing people search using social networking sites and people search services.</i>
- <i>Gathering Financial information about the target through financial services.</i>
- <i>Gathering infrastructure details of the target organization through job sites.</i>
- <i>Collecting information through deep and dark web footprinting.</i>
- <i>Determining the operating systems in use by the target organization.</i>
- <i>Performing competitive intelligence.</i>
- <i>Monitoring the target using alert services.</i>
- <i>Gathering information using groups, forums, blogs, and NNTP Usenet newsgroups.</i>
- <i>Collecting information through social engineering in social networking sites.</i>
- <i>Extracting information about the target using Internet archives.</i>
- <i>Gathering information using business profile sites.</i>
- <i>Monitoring website traffic of the target.</i>
- <i>Tracking the online reputation of the target.</i>
<h3>
Active Footprinting</h3>
- <i>Querying published name servers of the target</i>
- <i>Searching for digital files</i>
- <i>Extracting website links and gathering wordlists from the target website</i>
- <i>Extracting metadata of published documents and files</i>
- <i>Gathering website information using web spidering and mirroring tools</i>
- <i>Gathering information through email tracking</i>
- <i>Harvesting email lists</i>
- <i>Performing Whois lookup</i>
- <i>Extracting DNS information</i>
- <i>Performing traceroute analysis</i>
- <i>Performing social engineering</i>
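Several of the techniques above produce raw tool output that must be parsed afterwards. As a minimal offline sketch of traceroute analysis, the snippet below extracts the per-hop IP addresses from a saved traceroute capture; the sample output and hop addresses are hypothetical, not taken from a real trace:

```shell
# Parse a saved (hypothetical) traceroute capture and list the hop IP addresses.
sample=' 1  192.168.1.1  1.204 ms
 2  10.10.0.1  4.771 ms
 3  203.0.113.7  11.932 ms'
# Field 2 of each line is the responding router's IP address.
hops=$(printf '%s\n' "$sample" | awk '{print $2}')
echo "$hops"
```

In a real engagement the capture would come from `tracert` (Windows) or `traceroute` (Linux) run against the target, and the resulting hop list feeds directly into the network map described earlier.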
<h3>
Information obtained in Footprinting</h3>
The major objectives of footprinting include collecting the network information, system information, and organizational information of the target, such as network blocks, specific IP addresses, employee details, and so on.
- <u>Organization</u>: Such information about an organization is available from its website. In addition, you can query the target's domain name against the Whois database and obtain valuable information.
- Employee details (employee names, contact addresses, designations, and work experience)
- Addresses and mobile/telephone numbers
- Branch and location details
- Partners of the organization
- Web links to other company-related sites
- Background of the organization
- Web technologies
- News articles, press releases, and related documents
- Legal documents related to the organization
- Patents and trademarks related to the organization
Attackers can access organizational information and use such information to identify key personnel and launch social engineering attacks to extract sensitive data about the entity.
- <u>Network information</u>: You can gather network information by performing whois database analysis, trace routing, ...
- Domain and subdomains
- Network blocks
- Network topology, trusted routers and firewalls
- IP addresses of the reachable systems
- Whois records
- DNS records and related information
- <u>System information</u>: You can gather system information by performing network footprinting, DNS footprinting, website footprinting, email footprinting, ...
- Web Server OS
- Location of the web servers
- Publicly available email addresses
- Usernames, passwords, ...
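Whois records such as those listed above come back as plain text and are easy to post-process. A minimal sketch, using a hard-coded sample response in place of a live Whois query (the registrar name is a placeholder):

```shell
# Extract the registrar and name servers from a saved (sample) Whois response.
whois_out='Domain Name: EXAMPLE.COM
Registrar: Example Registrar, Inc.
Name Server: A.IANA-SERVERS.NET
Name Server: B.IANA-SERVERS.NET'
# Keep only the lines that identify the registrar and the authoritative name servers.
summary=$(printf '%s\n' "$whois_out" | grep -E '^(Registrar|Name Server):')
echo "$summary"
```

A real run would pipe the output of `whois example.com` into the same filter instead of the hard-coded sample.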
<h3>
Footprinting Methodology
</h3>
<u>Footprinting Techniques</u>:
- <i>Footprinting through search engines.</i>
- <i>Footprinting through web services.</i>
- <i>Footprinting through social networking sites.</i>
- <i>Website footprinting.</i>
- <i>Email footprinting.</i>
- <i>Whois footprinting.</i>
- <i>DNS footprinting.</i>
- <i>Network footprinting.</i>
- <i>Footprinting through social engineering.</i>
<h3>
Footprinting through search engines
</h3>

Search engines are the main sources of key information about a target organization. They play a major role in extracting critical details about a target from the internet. Search engines use automated software, i.e., crawlers, to continuously scan active websites and add the retrieved results to the search engine index, which is then stored in a massive database. When a user queries the search engine index, it returns a list of Search Engine Results Pages (SERPs). These results include web pages, videos, images, and many different file types ranked and displayed according to their relevance. Many search engines can extract target organization information such as technology platforms, employee details, login pages, intranet portals, contact information, and so on. The information helps the attacker in performing social engineering and other types of advanced system attacks.
A Google search could reveal submissions to forums by security personnel, disclosing the brands of firewalls or antivirus software used by the target. This information helps the attacker in identifying vulnerabilities in such security controls.
<h3>
Footprinting Using Advanced Google Hacking Techniques</h3>
Google hacking refers to the use of advanced Google search operators for creating complex search queries to extract sensitive or hidden information. The accessed information is then used by attackers to find vulnerable targets. Footprinting using advanced Google hacking techniques involves locating specific strings of text within search results using advanced operators in the Google search engine.
Advanced Google hacking refers to the art of creating complex search engine queries. Queries can retrieve valuable data about a target company from Google search results. Through Google hacking, an attacker tries to find websites that are vulnerable to exploitation. Attackers can use the Google Hacking Database (GHDB), a database of queries, to identify sensitive data. Google operators help in finding the required text and avoiding irrelevant data. Using advanced Google operators, attackers can locate specific strings of text, such as specific versions of vulnerable web applications. When a query without advanced search operators is specified, Google traces the search terms in any part of the webpage, including the title, text, URL, digital files, and so on. To confine a search, Google offers advanced search operators. These search operators help to narrow down the search query and obtain the most relevant and accurate output.
<h3>
What can a hacker do with Google hacking?</h3>
An attacker can create complex search engine queries to filter large amounts of search results to obtain information related to computer security. The attacker uses Google operators that help locate specific strings of text within the search results. Thus, the attacker can not only detect websites and web servers that are vulnerable to exploitation but also locate private, sensitive information about others, such as credit card numbers, social security numbers, passwords, and so on. Once a vulnerable site is identified, attackers try to launch various possible attacks, such as buffer overflow and SQL injection, which compromise information security.
Examples of sensitive information on public servers that an attacker can extract with the help of Google Hacking Database (GHDB) queries include:
- <i>Error messages that contain sensitive information.</i>
- <i>Files containing passwords.</i>
- <i>Sensitive directories.</i>
- <i>Pages containing login portals.</i>
- <i>Pages containing network or vulnerability data, such as IDS, firewall logs, and configurations.</i>
- <i>Advisories and server vulnerabilities.</i>
- <i>Software version information.</i>
- <i>Web application source code.</i>
- <i>Connected IoT devices and their control panels, if unprotected </i>
- <i>Hidden web pages such as intranet and VPN services.</i>
Some popular Google advanced search operators include:
<b>site</b>: This operator restricts search results to the specified site or domain.
<b>allinurl</b>: This operator restricts results to only pages containing all the query terms specified in the URL.
Example: [allinurl: google career]
<b>inurl</b>: This operator restricts the results to only the pages containing the specified word in the URL.
Example: [inurl: copy site:www.google.com]
<b>allintitle</b>: This operator restricts results to only the pages containing all the query terms specified in the title.
Example: [allintitle: detect malware]
<b>intitle</b>: This operator restricts results to only the pages containing the specified term in the title.
Example: [malware detection intitle:help]
<b>inanchor</b>: This operator restricts results to only the pages containing the query terms specified in the anchor text on links to the page.
Example: [Anti-virus inanchor:Norton]
<b>allinanchor</b>: This operator restricts results to only the pages containing all query terms specified in the anchor text on links to the pages.
Example: [allinanchor: best cloud service provider]
<b>cache</b>: This operator displays Google's cached version of a web page instead of the current version of the webpage.
Example: [cache:www.eff.org]
<b>link</b>: This operator searches websites or pages that contain links to the specified website or page.
Example: [link:www.googleguide.com]
<b>related</b>: This operator displays websites that are similar or related to the URL specified.
Example: [related:www.microsoft.com]
<b>info</b>: This operator finds information for the specified web page.
Example: [info:gothotel.com]
<b>location</b>: This operator finds information for a specific location.
Example: [location: 4 seasons restaurant]
<b>filetype</b>: This operator allows you to search for the results based on a file extension.
Example: [jasmine filetype:jpg]
Example: Google advanced operator syntax -> [intitle:intranet inurl:intranet intext:"human resources"]
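Operators can be chained into a single query string. The sketch below assembles a dork from individual operators; the target domain example.com and the chosen operators are placeholders for illustration, not from a real engagement:

```shell
# Assemble an advanced Google search query ("dork") from individual operators.
site_op='site:example.com'        # restrict results to the target domain
title_op='intitle:"index of"'     # titles that suggest an open directory listing
url_op='inurl:admin'              # URLs containing the word "admin"
dork="$site_op $title_op $url_op"
echo "$dork"
```

The resulting string is pasted into the Google search box as-is; combining `site:` with content operators this way is what narrows a broad search down to a single organization.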
<h3>
Google hacking database
</h3>
<i>Source</i>: https://www.exploit-db.com
The GHDB is an authoritative source for querying the ever-widening scope of the Google search engine.
Using GHDB dorks, attackers can rapidly identify all the publicly available exploits and vulnerabilities of the target organization's IT infrastructure.
Google hacking categories:
- Footholds
- Files containing usernames
- Sensitive Directories
- Web server detection
- Vulnerable Files
- Vulnerable Servers
- Error Messages
- Files Containing Juicy info
- Files containing passwords
- Sensitive online shopping information
- Network or vulnerability data
- Pages containing login portals
- Various online devices
- Advisories and Vulnerabilities
<h4>
VoIP and VPN Footprinting through GHDB</h4>
<h4>
Google search queries for VOIP footprinting</h4>
<table>
<tr>
<th>Google Dork</th>
<th>Description</th>
</tr>
<tr>
<td>intitle:"Login Page" intext:"Phone Adapter Configuration Utility"</td>
<td>Pages containing login portals</td>
</tr>
<tr>
<td>inurl:/voice/advanced/ intitle:Linksys SPA configuration</td>
<td>Finds the Linksys VOIP router configuration page</td>
</tr>
<tr>
<td>intitle:"D-Link VoIP Router" "Welcome"</td>
<td>Pages containing D-Link portals</td>
</tr>
<tr>
<td>intitle:asterisk.management.portal web-access</td>
<td>Looks for the Asterisk management portal</td>
</tr>
<tr>
<td>intitle:"SPA504G Configuration"</td>
<td>Finds Cisco SPA504G Configuration Utility for IP phones</td>
</tr>
<tr>
<td>intitle:"Sipura.spa.Configuration" -.pdf</td>
<td>Finds configuration pages for online VOIP devices</td>
</tr>
<tr>
<td>intitle:asterisk.management.portal web-access</td>
<td>Finds the Asterisk web management portal</td>
</tr>
<tr>
<td>inurl:8080 intitle:"login" intext:"UserLogin" "English"</td>
<td>Finds VoIP login portals</td>
</tr>
</table>
<h4>
Google search queries for VPN footprinting</h4>
<table>
<tr>
<th>Google Dork</th>
<th>Description</th>
</tr>
<tr>
<td>filetype:pcf "cisco" "GroupPwd"</td>
<td>Cisco VPN files with Group Passwords for remote access</td>
</tr>
<tr>
<td>"[main]" "enc_GroupPwd=" ext:txt</td>
<td>Finds Cisco VPN client passwords (encrypted but easily cracked)</td>
</tr>
<tr>
<td>"Config" intile:"Index of" intext:vpn</td>
<td>Directory with keys of VPN servers</td>
</tr>
<tr>
<td>inurl:/remote/login?lang=en</td>
<td>Finds FortiGate Firewall's SSL-VPN login portal</td>
</tr>
<tr>
<td>!HOST=*.* intext:enc_UserPassword=* ext:pcf</td>
<td>Looks for profile configuration files (.pcf), which contain user VPN profiles</td>
</tr>
<tr>
<td>filetype:rcf inurl:vpn</td>
<td>Finds Sonicwall Global VPN client files containing sensitive information and login</td>
</tr>
<tr>
<td>filetype:pcf vpn OR Group</td>
<td>Finds publicly accessible .pcf files used by VPN clients</td>
</tr>
<tr>
<td>vpnssl</td>
<td>Retrieves login portals providing companies' SSL VPN access</td>
</tr>
<tr>
<td>intitle:"SSL VPN Service" + intext:"Your system administrator provided the following information to help understand and remedy the security conditions"</td>
<td>Finds Cisco ASA login web pages</td>
</tr>
</table>


<h4>
Google search queries for FTP footprinting</h4>
FTP search engines are used to search for files located on FTP servers, which may contain valuable information about the target organization. Many industries, institutions, companies, and universities use FTP servers to store large file archives and other software that are shared among their employees. A dedicated client such as FileZilla can be used to access FTP accounts; it also supports functionalities such as uploading, downloading, and renaming files. Although FTP servers are usually protected with passwords, many servers are left unsecured and can be accessed directly through web browsers.
<table>
<tr>
<th>Google Dork</th>
<th>Description</th>
</tr>
<tr>
<td>inurl:github.com intext:.ftpconfig -issues</td>
<td>Returns SFTP/FTP server credentials on GitHub</td>
</tr>
<tr>
<td>site:mil inurl:ftp ext:pdf | ps</td>
<td>Returns sensitive directories on FTP</td>
</tr>
<tr>
<td>intext:pure-ftpd.conf intitle:index of</td>
<td>Returns servers exposing pure-ftpd configuration files</td>
</tr>
<tr>
<td>intitle:"Ined Of" intext:sftp-config.json</td>
<td>Extracts list of FTP/SFTP passwords from sublime text</td>
</tr>
<tr>
<td>inurl:"ftp://www." "Index of /"</td>
<td>Displays various online FTP servers</td>
</tr>
<tr>
<td>inurl~/ftp://193 filetype:(php | txt | html | asp | xml | cnf | sh) ~'/html'</td>
<td>Returns a list of FTP servers by IP address, mostly Windows NT servers with guest login capabilities</td>
</tr>
</table>
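Dorks like the `"Index of /"` query in the table above can be generated per target rather than typed by hand. A minimal sketch that builds one such FTP dork for each of a set of domains (the domains are placeholders):

```shell
# Generate a per-domain FTP "Index of" dork for each candidate target domain.
dorks=$(for d in example.com example.org; do
  printf 'inurl:"ftp://%s" "Index of /"\n' "$d"
done)
echo "$dorks"
```

Each emitted line is a ready-to-use search query; feeding a list of acquired domains through such a loop scales the FTP footprinting step across an organization's whole domain portfolio.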
<h4>
Gathering Information from IoT Search Engines</h4>
Internet of Things (IoT) search engines crawl the internet for IoT devices that are publicly accessible. Through a basic search on these search engines, an attacker can gain control of Supervisory Control and Data Acquisition (SCADA) systems, traffic control systems, internet-connected household appliances, industrial appliances, CCTV cameras, etc.
Source: <i>https://www.shodan.io/</i>

<h3>
Footprinting through web services</h3>
<h4>
Finding a Company's top-level domains (TLDs) and Sub-domains</h4>
A company's top-level domains (TLDs) and sub-domains can provide a large amount of useful information to an attacker. A public website is designed to show the presence of an organization on the internet and is available for free public access. It is designed to attract customers and partners. It may contain information such as organizational history, services and products, and contact information. The target organization's external URL can be located with the help of search engines such as Google and Bing.

The sub-domain is available to only a few people. These persons may be employees of an organization or members of a department. In many organizations, website administrators create sub-domains to test new technologies before deploying them on the main website. Generally, these sub-domains are in the testing stage and are insecure; hence, they are more vulnerable to attacks. Identifying such sub-domains may reveal critical information regarding the target, such as the source code of the website and documents on the webserver. Access restrictions can be applied based on the IP address, domain or subnet, username, and password. The subdomains help to access the private functions of an organization. Most organizations use common formats for subdomains. Therefore, a hacker who knows the external URL of a company can often discover the subdomains through trial and error, or by using a service such as Netcraft.
You can also use the advanced Google Search Operator shown below to identify all the subdomains of the target:
site:microsoft.com -inurl:www
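For the trial-and-error route, candidate subdomain names can be generated from a wordlist of common prefixes and then checked one by one. A minimal offline sketch; the domain and the wordlist are placeholders, and a real run would follow up each candidate with a DNS lookup:

```shell
# Generate candidate subdomains by combining a target domain with common prefixes.
domain='example.com'
subs='www mail vpn intranet dev test'
candidates=$(for s in $subs; do echo "${s}.${domain}"; done)
echo "$candidates"
```

Each candidate would then be resolved (e.g., with `nslookup`) to see which subdomains actually exist; tools such as Sublist3r automate this combination of wordlists and online sources.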

<h4>
Tools to search company's sub-domains</h4>
- Netcraft (<i>Source</i>: https://www.netcraft.com)
Netcraft provides internet security services, including anti-fraud and anti-phishing services, application testing, and PCI scanning. It also analyzes the market share of web servers, operating systems, hosting providers, and SSL certificate authorities, among other parameters of the internet.
Attackers can use Netcraft to obtain all the subdomains related to the target domain.
- Sublist3r (<i>https://github.com/aboul3la/Sublist3r</i>)
Sublist3r is a Python script designed to enumerate the subdomains of websites using OSINT. It enables you to enumerate subdomains across multiple sources at once. Further, it helps penetration testers and bug hunters in collecting and gathering subdomains for the domain they are targeting. It enumerates subdomains using many search engines such as Google, Yahoo, Bing, Baidu, and Ask. It also enumerates subdomains using Netcraft, VirusTotal, ThreatCrowd, DNSdumpster, and ReverseDNS.
- Pentest-Tools Find Subdomains (<i>Source: </i>https://pentest-tools.com)
Pentest-Tools Find Subdomains is an online tool used for discovering subdomains and their IP addresses, including network information and their HTTP servers.
<h4>
Finding the Geographical Location of the Target</h4>
Information such as the physical location of an organization plays a vital role in the hacking process. Attackers can obtain this information using footprinting. In addition to the physical location, a hacker can also acquire information such as surrounding public Wi-Fi hotspots that may offer a way to break into the target organization's network.
<h4>
Tools for Finding the Geographical Location</h4>
- Google Earth (<i>Source</i>: https://earth.google.com)
<h4>
People search on social networking sites</h4>
Searching for a particular person on a social networking website is fairly easy. Social networking services are online services, platforms, or sites that focus on facilitating the building of social networks or social relations among people. These websites contain information that users provide in their profiles. They help to directly or indirectly relate people to each other through various fields such as common interests, work location, and education.

Social networking sites allow people to share information quickly, as they can update their personal details in real time. Such sites allow users to update facts about upcoming or current events, recent announcements and invitations, and so on. Social networking sites allow visitors to search for people without registering on the site; this makes people searching on social networking sites an easy and anonymous task. A user can search for a person using the name, email, or address. Some sites allow users to check whether an account is active, which then provides information on the status of the person being searched.
Social networking sites such as Facebook, Twitter, LinkedIn, and Instagram allow you to find people by name, keyword, company, school, ... Searching for people on these sites returns personal information such as name, position, organization name, current location, and educational qualifications. In addition, you can also find professional information such as company or business, current location, phone number, email ID, photos, videos, and so on. Social networking sites such as Twitter are used to share advice, news, concerns, opinions, rumors, and facts. Through people searching on social networking services, an attacker can gather critical information that helps them in performing social engineering or other kinds of attacks.

<h4>
People search on people search services</h4>
You can use public records websites to find information about email addresses, phone numbers, house addresses, ... Many individuals use online people search services to find information about other people. Generally, online people search services, such as <b>Pipl, Intelius, BeenVerified, Whitepages, PeekYou</b>, provide people's names, addresses, contacts, ...
Further, online people search services may often reveal the profession of an individual, businesses owned by a person, ...
This information proves to be highly beneficial for attackers to launch attacks. There are many available online people search services that help in obtaining information regarding people.
- People search services - Intelius (<i>Source:</i> https://www.intelius.com/)
<h4>
Gathering information from LinkedIn</h4>

LinkedIn is a social networking website for professionals. It connects the world's human resources to aid productivity and success. The site contains personal information such as name, position, organization name, ...
Attackers can use the theHarvester tool to gather information from LinkedIn based on the target organization name:
- theHarvester (<i>Source:</i> http://www.edge-security.com)
````bash
theHarvester -d microsoft -l 200 -b linkedin
````
<h4>
Harvesting email lists</h4>
Gathering email addresses related to the target organization acts as an important attack vector during the later phases of hacking. Attackers can use automated tools such as theHarvester and email spiders to collect publicly available email addresses of the employees of the target organization.
- theHarvester (<i>Source</i>: http://www.edge-security.com)
````bash
theHarvester -d microsoft.com -l 200 -b baidu
````
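Besides dedicated tools, addresses can be harvested from any saved page content with a simple pattern match. A minimal sketch using grep on a hard-coded page snippet; the addresses are hypothetical, and a real run would first fetch pages from the target's site:

```shell
# Harvest unique email addresses from a (sample) page snippet with a regex match.
page='Contact alice@example.com or bob@example.com; press: press@example.com'
emails=$(printf '%s\n' "$page" \
  | grep -oE '[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}' \
  | sort -u)
echo "$emails"
```

The `-o` flag prints only the matched addresses, and `sort -u` deduplicates them, which matters when the same contact address appears on every page of a site.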
<h4>
Gathering Information from Financial Services</h4>
Attackers who seek access to personal or financial information often target financial data such as stock quotes and charts, financial news, and portfolios. Financial services such as Google Finance, MSN Money, Yahoo Finance, ... can provide a large amount of useful information such as the market value of a company's shares, company profile, ... The information provided varies from one service to another. Financial firms rely on web services to perform transactions and grant users access to their accounts. Attackers can obtain sensitive and private information regarding these firms by using malware, exploiting software design flaws, breaking authentication mechanisms, flooding services, and performing brute-force and phishing attacks.
- Google Finance (<i>Source</i>: https://www.google.com/finance)

<h4>
Footprinting through job sites</h4>
Attackers can gather valuable information about the operating system, software versions, and so on from job sites. Many organizations' websites provide recruiting information on a job posting page that reveals the infrastructure and technologies used by the company. In addition, the website may have a key employee list with email addresses. Such information may prove to be beneficial for an attacker. For example, if an organization advertises a Network Administrator job, it posts the requirements related to that position, which often reveal the networking technologies the organization uses.


<h4>
Deep and dark web footprinting</h4>

The surface web is the outer layer of the online cyberspace that allows the user to find web pages and content using regular web browsers. Search engines use crawlers, which are programmed bots, to access and download web pages.
The deep web is the layer of the online cyberspace that consists of web pages and content that are hidden and unindexed. Such content cannot be located using traditional web browsers and search engines. The size of the deep web does not allow the crawling process of basic search engines. It consists of official government or federal databases and other information linked to various organizations. The deep web can be accessed using tools such as the Tor Browser and the WWW Virtual Library. It can be used for both legal and illegal activities.
The dark web or darknet is a deeper layer of the online cyberspace, and it is the subset of the deep web that enables anyone to navigate anonymously without being traced. The dark web can be accessed only through specialized tools or darknet browsers. Attackers primarily use the dark web to perform footprinting on the target organization and launch attacks. The dark web can be accessed using tools such as the Tor Browser and ExoneraTor.
- Tor Browser (<i>Source</i>: https://www.torproject.org)
<h4>
Determining the operating system</h4>

Attackers use various online tools such as Netcraft, Shodan, and Censys to detect the operating system used at the target organization. These tools search the internet for connected devices such as routers, servers, and IoT devices belonging to the target organization. Using these tools, attackers obtain information such as the city, country, latitude/longitude, hostname, operating system, and IP addresses of the organization. Such information further helps attackers in identifying potential vulnerabilities and finding effective exploits to perform various attacks on the target.
- Netcraft (<i>Source</i>: https://www.netcraft.com)

- Shodan (<i>Source</i>: https://www.shodan.io)

- Censys (<i>Source</i>: https://censys.io)
<h4>
VoIP and VPN Footprinting through Shodan</h4>

<h2>
Competitive intelligence
</h2>
<h3>
Competitive Intelligence Gathering
</h3>


Competitive intelligence gathering is the process of identifying, gathering, analyzing, verifying, and using information about your competitors from resources such as the internet. Competitive intelligence means understanding and learning about other businesses to become as competitive as possible. It is non-interfering and subtle in nature compared to direct intellectual property theft carried out via hacking or industrial espionage. It focuses on the external business environment. In this method, professionals gather information ethically and legally instead of gathering it secretly.
Competitive intelligence helps in determining:
- What are the competitors doing?
- How are competitors positioning their products and services?
- What are customers saying about competitors' strengths and weaknesses?
Companies carry out competitive intelligence either by employing people to search for information or by utilizing a commercial database service, which involves lower costs. The information that is gathered can help the managers and executives of a company make strategic decisions.
<h3>
Sources of Competitive Intelligence
</h3>
Competitive intelligence gathering can be performed using a direct or an indirect approach.
- <b>Direct approach:</b> The direct approach serves as the primary source for competitive intelligence gathering. Direct approach techniques include gathering information from trade shows, social engineering of employees and customers, and so on.
- <b>Indirect approach:</b> Through the indirect approach, information about competitors is gathered using online resources. Indirect approach techniques include:
- <i>Company websites and employment ads</i>
- <i>Support threads and reviews</i>
- <i>Search engines, the Internet, and online databases</i>
- <i>Social media postings</i>
- <i>Press releases and annual reports</i>
- <i>Trade journals, conferences, and newspapers</i>
- <i>Patents and trademarks</i>
- <i>Analyst and regulatory reports</i>
- <i>Customer and vendor interviews</i>
- <i>Agents, distributors, and suppliers</i>
- <i>Industry-specific blogs and publications</i>
- <i>Legal databases, e.g., LexisNexis</i>
- <i>Business information databases, e.g., Hoover's</i>
- <i>Online job postings</i>
<h3>
Competitive intelligence - When did this company begin? How did it develop?
</h3>
Gathering competitor documents and records helps to improve productivity and profitability, which in turn stimulates the growth of the company. It helps in determining answers to the following:
- <i>When did it begin?</i>: Through competitive intelligence, companies can collect the history of a particular company, such as its establishment date. Sometimes, they gather crucial information that is not often available to others.
- <i>How did it develop?</i>: What are the various strategies that the company uses? Development intelligence can include advertisement strategies, customer relationship management, and so on.
- <i>Who leads it?</i>: This information helps a company learn about the competitor's decision makers.
- <i>Where is it located?</i>: Competitive intelligence also includes the location of the company and information related to various branches and their operations.
Attackers can use the information gathered through competitive intelligence to build a hacking strategy.
<h3>
Information Resources Site
</h3>
Information resources sites that help to gain competitive intelligence include:
- EDGAR Database (<i>Source:</i> https://www.sec.gov/edgar.shtml)
- D&B Hoovers (<i>Source:</i> http://www.hoovers.com)
- LexisNexis (<i>Source:</i> https://www.lexisnexis.com)
- Business Wire (<i>Source:</i> https://www.businesswire.com)
- Factiva (<i>Source:</i> https://www.dowjones.com)
<h3>
Competitive Intelligence - What are the company's plans?
</h3>
Information resouce sites that help attackers gain a company's business plans include: