Step 1: Project Overview
We will build a **Weather Automation Project**:
- Fetch real-time weather data from OpenWeatherMap API
- Store it locally in SQLite for backup
- Upload to Cloud SQL on GCP
- Analyze and generate daily reports (CSV & plots)
- Optional: Send automated email with report
Step 2: Project Folder Structure
Create a folder like:
C:/PythonTutorial/Day10_Project/
│
├── data/
│   └── weather_backup.db
├── reports/
│   └── daily_report.csv
├── main.py
├── config.py
└── requirements.txt
Explanation: data/ stores the local SQLite backup, reports/ stores the CSV and plot output, main.py is the main script, and config.py stores the API key and DB credentials.
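You can create this layout by hand, or with a short script; a minimal sketch using pathlib (the folder names match the tree above):
from pathlib import Path

# Create the project subfolders if they don't exist yet
base = Path("C:/PythonTutorial/Day10_Project")
for sub in ("data", "reports"):
    (base / sub).mkdir(parents=True, exist_ok=True)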
Step 3: Install Dependencies
pip install requests pandas sqlalchemy pymysql matplotlib seaborn
Explanation: requests for API calls, pandas for data handling, sqlalchemy+pymysql for cloud DB, matplotlib & seaborn for plots.
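The same packages can also go into the requirements.txt shown in the folder structure, so the project installs with pip install -r requirements.txt. A minimal, unpinned version:
requests
pandas
sqlalchemy
pymysql
matplotlib
seaborn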
Step 4: Configuration File (config.py)
API_KEY = "YOUR_OPENWEATHER_API_KEY" # API key
CITY = "Dallas"
SQL_USER = "your_user"
SQL_PASSWORD = "your_password"
SQL_HOST = "YOUR_PUBLIC_IP"
SQL_DB = "weatherdb"
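Hardcoding credentials is fine for a tutorial, but avoid committing them to version control. A safer variant of config.py reads them from environment variables; a sketch (the variable names here are my own choice, not a standard):
import os

# Fall back to placeholders so the tutorial still runs without env vars set
API_KEY = os.environ.get("OPENWEATHER_API_KEY", "YOUR_OPENWEATHER_API_KEY")
SQL_PASSWORD = os.environ.get("SQL_PASSWORD", "your_password")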
Step 5: Main Automation Script (main.py)
import requests
import sqlite3
import pandas as pd
from sqlalchemy import create_engine
import matplotlib.pyplot as plt
import seaborn as sns
from config import API_KEY, CITY, SQL_USER, SQL_PASSWORD, SQL_HOST, SQL_DB
# 1️⃣ Fetch weather data from API
url = f"http://api.openweathermap.org/data/2.5/weather?q={CITY}&appid={API_KEY}&units=metric"
response = requests.get(url)
data = response.json()
temperature = data['main']['temp']
weather_desc = data['weather'][0]['description']
print(f"Weather in {CITY}: {temperature}°C, {weather_desc}")
# 2️⃣ Store locally in SQLite
conn = sqlite3.connect("data/weather_backup.db")
c = conn.cursor()
c.execute('''CREATE TABLE IF NOT EXISTS Weather
(City TEXT, Temperature REAL, Description TEXT)''')
c.execute("INSERT INTO Weather VALUES (?,?,?)", (CITY, temperature, weather_desc))
conn.commit()
conn.close()
# 3️⃣ Upload to Cloud SQL (MySQL)
engine = create_engine(f'mysql+pymysql://{SQL_USER}:{SQL_PASSWORD}@{SQL_HOST}/{SQL_DB}')
df = pd.DataFrame({'City':[CITY], 'Temperature':[temperature], 'Description':[weather_desc]})
df.to_sql('Weather', con=engine, if_exists='append', index=False)
# 4️⃣ Read cloud data and generate daily report
df_cloud = pd.read_sql("SELECT * FROM Weather", con=engine)
df_cloud.to_csv("reports/daily_report.csv", index=False)
# 5️⃣ Plot temperature trend
sns.set(style="whitegrid")
plt.figure(figsize=(8,5))
sns.lineplot(data=df_cloud, x=df_cloud.index, y="Temperature", marker="o")
plt.title("Temperature Trend")
plt.xlabel("Record Number")
plt.ylabel("Temperature (°C)")
plt.savefig("reports/temperature_trend.png") # Save plot
plt.show()
Explanation: The script performs API fetch, local backup, cloud upload, report generation, and plotting in one automated workflow.
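For reference, the lookups data['main']['temp'] and data['weather'][0]['description'] match the shape of the OpenWeatherMap current-weather response. Trimmed to just the fields used here (values are illustrative, not real data):
# Trimmed sketch of the JSON returned by /data/2.5/weather
data = {
    "weather": [{"description": "clear sky"}],
    "main": {"temp": 25.3},
}
print(data["main"]["temp"])               # 25.3
print(data["weather"][0]["description"])  # clear sky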
Step 6: Run Project
Run the script from the command line or an IDE:
python main.py
Expected output:
- Weather data printed to the console
- A record stored in data/weather_backup.db
- A row appended to the cloud table Weather
- reports/daily_report.csv and reports/temperature_trend.png created
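To make this a truly daily automation, schedule the script instead of running it by hand. On Windows (matching the C:/ path above), one option is Task Scheduler; a hedged sketch of the command, assuming python is on PATH:
schtasks /Create /SC DAILY /ST 08:00 /TN "DailyWeatherReport" /TR "python C:\PythonTutorial\Day10_Project\main.py"
On Linux/macOS, a cron entry such as 0 8 * * * python /path/to/main.py achieves the same.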
Step 7: Optional – Email Automation
Send the daily report automatically using smtplib:
import smtplib
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
sender = "you@gmail.com"
receiver = "recipient@gmail.com"
subject = "Daily Weather Report"
body = "Please find attached the daily weather report."
msg = MIMEMultipart()
msg['From'] = sender
msg['To'] = receiver
msg['Subject'] = subject
msg.attach(MIMEText(body, 'plain'))
# Attach CSV
with open("reports/daily_report.csv", "r") as f:
    attachment = MIMEText(f.read())
attachment.add_header('Content-Disposition', 'attachment', filename="daily_report.csv")
msg.attach(attachment)
# Send email
server = smtplib.SMTP('smtp.gmail.com', 587)
server.starttls()
server.login(sender, "YOUR_APP_PASSWORD")
server.send_message(msg)
server.quit()
Explanation: Uses Gmail's SMTP server to send the report automatically. For Gmail, generate an App Password instead of using your account password.
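The script also saves reports/temperature_trend.png. To attach that binary file too, one common approach is MIMEBase with base64 encoding; a sketch to place before server.send_message(msg):
from email.mime.base import MIMEBase
from email import encoders

# Attach the PNG plot as a binary attachment
with open("reports/temperature_trend.png", "rb") as f:
    part = MIMEBase("application", "octet-stream")
    part.set_payload(f.read())
encoders.encode_base64(part)
part.add_header("Content-Disposition", "attachment", filename="temperature_trend.png")
msg.attach(part)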
Step 8: Project Cheat Sheet
- API fetch: requests.get(url), response.json()
- SQLite: sqlite3.connect(), cursor.execute(), commit()
- Cloud SQL: SQLAlchemy engine, df.to_sql(), pd.read_sql()
- Pandas: df.to_csv(), df.head(), df.describe()
- Visualization: seaborn.lineplot(), plt.savefig(), plt.show()
- Email: smtplib.SMTP(), MIMEMultipart(), MIMEText(), server.send_message()
✔ End of Day 10 – Project complete! You now have an automated Python workflow that fetches API data, stores it locally and in the cloud, analyzes it, generates plots, and optionally emails daily reports.