Web scraping is the process of extracting data from websites in a way that resembles ordinary browser traffic. The code below uses Python's `random` module to randomize the user agent and the IP address used for web scraping [2].

Python code:

```python
import random

# List of user agents [1]
user_agents_list = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36",
    "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:54.0) Gecko/20100101 Firefox/54.0",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.79 Safari/537.36 Edge/14.14393",
    "Mozilla/5.0 (Windows NT 10.0; WOW64; Trident/7.0; rv:11.0) like Gecko",
    "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.96 Safari/537.36 OPR/45.0.2552.888",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.76 Safari/537.36",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/57.0.2987.98 Safari/537.36",
]

# Randomly select a user agent
user_agent = random.choice(user_agents_list)

# Display the selected user agent [3][4]
print("User Agent:", user_agent)

# List of IP addresses [5]. Note: the 192.168.0.x addresses are private
# placeholders; in practice these would be the addresses of proxy servers
# that the scraper routes its requests through.
ip_address_list = [
    "192.168.0.1",
    "192.168.0.2",
    "192.168.0.3",
    "192.168.0.4",
    "192.168.0.5",
    "192.168.0.6",
    "192.168.0.7",
    "192.168.0.8",
    "192.168.0.9",
    "192.168.0.10",
]

# Use random.choice to randomly select an IP address
ip_address = random.choice(ip_address_list)

# Display the selected IP address
print("IP Address:", ip_address)
```

Description:

* Create a list of user agents and store it in the variable `user_agents_list`. User agents are strings that identify the client software accessing a website; selecting one at random lets the scraper appear as different browsers or devices from one request to the next.
* Use the `random.choice()` function to pick a user agent from the list, and display the selection with `print()`.
* Similarly, prepare a list of IP addresses, store it in a variable, and use `random.choice()` to pick one at random, simulating requests from different locations. Note that selecting an IP string by itself does not change where a request comes from; the request must actually be routed through a proxy at that address, as the sketch below illustrates.

Randomness is important for web scraping because it makes the traffic look more authentic and reduces the chance of detection [6].
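To make the example concrete, here is a minimal sketch of how the randomly chosen values would actually be applied to a request using the `requests` library. The target URL, the proxy port 8080, and the assumption that working proxy servers listen at the chosen addresses are all illustrative, not part of the original code:

```python
import random
import requests

# Reuse the pools from above (shortened here); the proxy hosts and port
# are placeholders and would need to be replaced with real proxy servers.
user_agents = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36",
    "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:54.0) Gecko/20100101 Firefox/54.0",
]
proxy_hosts = ["192.168.0.1", "192.168.0.2"]

headers = {"User-Agent": random.choice(user_agents)}
proxy = f"http://{random.choice(proxy_hosts)}:8080"  # port is an assumption
proxies = {"http": proxy, "https": proxy}

# Send the request with the randomized user agent, routed through the proxy.
response = requests.get("https://example.com", headers=headers,
                        proxies=proxies, timeout=10)
print(response.status_code)
```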
Alternative approach: introduce randomness by combining the `requests` library with `fake_useragent` for user-agent rotation, and rotate the IP address by routing each request through a proxy chosen at random from a pool. (`requests` has no built-in proxy rotation, so the rotation here is implemented manually with `random.choice`.)

1. Install the required libraries:

```
pip install requests fake_useragent
```

2. Import the essential modules:

```python
import random

import requests
from fake_useragent import UserAgent
```

3. Create a function that generates random user agents:

```python
def get_random_user_agent():
    # fake_useragent returns a randomly chosen real-world user agent string
    return UserAgent().random
```

4. Create a function that makes requests with a random user agent and a randomly chosen proxy (the proxy URLs below are placeholders):

```python
# Placeholder proxy pool; replace with working proxy endpoints.
proxy_pool = [
    "http://proxy1.example.com:8080",
    "http://proxy2.example.com:8080",
]

def make_request(url):
    headers = {"User-Agent": get_random_user_agent()}
    proxy = random.choice(proxy_pool)  # rotate the proxy on every request
    proxies = {"http": proxy, "https": proxy}
    try:
        response = requests.get(url, headers=headers, proxies=proxies,
                                timeout=10)
        response.raise_for_status()
        return response.text
    except requests.exceptions.RequestException as e:
        print(f"Request failed: {e}")
        return None
```

5. Use the `make_request` function for scraping:

```python
if __name__ == "__main__":
    target_url = "https://example1.com"
    for _ in range(6):  # Make 6 requests
        html = make_request(target_url)
        if html:
            print(html[:101])  # Print the first 101 characters
```
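Because randomness reduces the chance of detection [6], the timing of requests is worth randomizing as well. Below is a minimal sketch of that idea, reusing `make_request` from step 4; the 1-5 second delay bounds are arbitrary assumptions, not values from the original text:

```python
import random
import time

for _ in range(6):
    html = make_request("https://example1.com")
    if html:
        print(html[:101])
    # Sleep for a random interval so the request timing does not look
    # machine-generated (bounds are arbitrary).
    time.sleep(random.uniform(1, 5))
```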