How to Write Scraped Data into a CSV File in Python?
Description:
Writing scraped data to a CSV file in Python involves extracting information from a website with
web scraping tools such as requests and BeautifulSoup, then saving the structured data to a CSV
file using Python's built-in csv module.
Step-by-Step Process
Send HTTP Request:
Use the requests library to send an HTTP GET request to the target website and retrieve its HTML content.
Parse the HTML:
Use BeautifulSoup to parse the HTML content and find the data you want to extract.
Open CSV File:
Open a CSV file in write mode with open(), passing newline='' so the csv module handles line endings correctly.
Write Data to CSV:
Use a csv.writer() object to write the extracted data (e.g., quotes and authors) row by row into the CSV file.
Close File:
When the file is opened with a with statement, it is closed automatically at the end of the block, ensuring all data is flushed and saved.
Sample Code
import requests
from bs4 import BeautifulSoup
import csv

# Fetch the page's HTML
url = 'https://quotes.toscrape.com/'
response = requests.get(url)

# Parse the HTML and select the quote and author elements
soup = BeautifulSoup(response.text, 'html.parser')
quotes = soup.find_all('span', class_='text')
authors = soup.find_all('small', class_='author')

# Write the paired results to a CSV file; the with statement closes the file automatically
with open('scraped_quotes.csv', 'w', newline='', encoding='utf-8') as file:
    writer = csv.writer(file)
    writer.writerow(['Quote', 'Author'])  # header row
    for quote, author in zip(quotes, authors):
        writer.writerow([quote.text, author.text])

print("Data has been written to 'scraped_quotes.csv'")
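For a slightly more defensive version, you may want to check the HTTP status and keep each quote paired with its author by iterating over the per-quote containers instead of two parallel lists. The sketch below is one way to do that; it assumes the same quotes.toscrape.com markup (each quote sits in a div with class 'quote'), and the output file name scraped_quotes_dict.csv is just an illustrative choice.

import csv

import requests
from bs4 import BeautifulSoup

url = 'https://quotes.toscrape.com/'
response = requests.get(url, timeout=10)
response.raise_for_status()  # stop early if the request failed

soup = BeautifulSoup(response.text, 'html.parser')

# Each quote on this site sits in a <div class="quote">, so iterating over
# those blocks keeps every quote paired with its own author.
rows = []
for block in soup.find_all('div', class_='quote'):
    rows.append({
        'Quote': block.find('span', class_='text').get_text(strip=True),
        'Author': block.find('small', class_='author').get_text(strip=True),
    })

# csv.DictWriter writes the header from the field names and each dict as a row.
with open('scraped_quotes_dict.csv', 'w', newline='', encoding='utf-8') as file:
    writer = csv.DictWriter(file, fieldnames=['Quote', 'Author'])
    writer.writeheader()
    writer.writerows(rows)

print(f"Wrote {len(rows)} rows to 'scraped_quotes_dict.csv'")

Compared with zipping two separate find_all() results, iterating per quote block avoids silently mismatching or dropping rows if one of the lists ever comes back shorter than the other.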