
Day 9: APIs, Databases, and Cloud Connectivity


Step 1: Python Environment Setup

Verify Python installation:

python --version

Expected Output:

Python 3.12.4

Install required packages:

pip install requests pandas sqlalchemy pymysql
  • requests - to call APIs
  • pandas - data manipulation
  • sqlalchemy - database connection layer
  • pymysql - MySQL connector for Cloud SQL
  • sqlite3 - local lightweight database (ships with Python's standard library, so it is not installed via pip)
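
To confirm the installs succeeded, a quick sanity check like the one below should print a version number for each package (sqlite3 reports the version of its bundled SQLite engine):

# Sanity check: confirm each package imports and print its version
import sqlite3

import pandas
import pymysql
import requests
import sqlalchemy

for pkg in (requests, pandas, sqlalchemy, pymysql):
    print(pkg.__name__, pkg.__version__)
print("sqlite3 bundled SQLite version:", sqlite3.sqlite_version)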

Step 2: Local Database Setup (SQLite)

SQLite is built into Python. No extra installation needed. Steps:

  1. Open Python or an IDE (PyCharm/VS Code).
  2. Create a new file local_db.py.
  3. Use this code to create and verify the DB:

    import sqlite3 # Import the sqlite3 module

    # Connect (creates database if it doesn't exist)
    conn = sqlite3.connect("weather_data.db")
    c = conn.cursor()

    # Create table if it doesn't exist
    c.execute('''CREATE TABLE IF NOT EXISTS Weather (City TEXT, Temperature REAL, Description TEXT)''')

    # Verify tables in database
    c.execute("SELECT name FROM sqlite_master WHERE type='table';")
    print("Tables:", c.fetchall())
    conn.close()

    Expected Output:

    Tables: [('Weather',)]
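
    To double-check the table's structure, SQLite's PRAGMA table_info can list its columns; a small optional check:

    import sqlite3

    conn = sqlite3.connect("weather_data.db")
    for col in conn.execute("PRAGMA table_info(Weather)"):
        print(col) # Each row: (cid, name, type, notnull, dflt_value, pk)
    conn.close()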

Step 3: GCP Account & Cloud SQL Setup

Steps to set up the cloud database:

  1. Go to GCP Console and sign in with your Gmail account.
  2. Create a new project: open the project picker at the top of the console → New Project (or IAM & Admin → Manage Resources → Create Project). Give it a name and note the Project ID.
  3. Enable Cloud SQL API: Navigation Menu → APIs & Services → Library → Search Cloud SQL API → Enable.
  4. Create a Cloud SQL instance: Navigation Menu → SQL → Create Instance → Choose MySQL.
  5. Set root password and note instance connection name (needed for Python connection).
  6. Create the Database `weatherdb`:
    1. Go to your Cloud SQL instance → click Databases tab.
    2. Click Create Database.
    3. Enter database name: weatherdb. Keep default charset utf8mb4 and collation utf8mb4_general_ci.
    4. Click Create. Verify that weatherdb appears in the database list.
    5. Optional: Use SQL command to verify:
      mysql -h <PUBLIC_IP> -u <USERNAME> -p
      SHOW DATABASES;
    6. Optional: Grant privileges to user:
      GRANT ALL PRIVILEGES ON weatherdb.* TO 'your_user'@'%';
      FLUSH PRIVILEGES;
  7. Authorize your local IP address under Connections → Add Network (connecting over the instance's public IP is fine for testing, but not for production).
  8. Optional: Install the Cloud SQL Auth Proxy for secure local connections.
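
Once the instance, database, and network access are in place, a quick connection test from your laptop confirms the credentials and networking work. A minimal sketch, reusing the placeholder IP, user, and password from the steps above:

import pymysql # MySQL driver installed in Step 1

# Placeholders: substitute the values from your own instance
conn = pymysql.connect(host="YOUR_PUBLIC_IP", user="your_user",
                       password="your_password", database="weatherdb")
with conn.cursor() as cur:
    cur.execute("SELECT VERSION();")
    print("Connected. MySQL version:", cur.fetchone()[0])
conn.close()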

Step 4: API Usage

Steps to get an API key for OpenWeatherMap:

  1. Go to OpenWeatherMap and create a free account.
  2. Navigate to the API keys section and generate a new key.
  3. Copy the key and keep it safe (it will be used in the Python code below).

Python code to fetch weather data (with comments):

import requests # For calling APIs

api_key = "YOUR_API_KEY" # Replace with your API key
city = "Dallas"
url = f"http://api.openweathermap.org/data/2.5/weather?q={city}&appid={api_key}&units=metric"

# Call the API
response = requests.get(url)
print("Status code:", response.status_code) # 200 = Success
data = response.json() # Convert JSON response to Python dict
temperature = data['main']['temp'] # Extract temperature
weather_desc = data['weather'][0]['description'] # Extract weather description
print(f"Temperature: {temperature}°C, Condition: {weather_desc}")

Step 5: Store API Data Locally

import sqlite3 # Built into Python, no install needed

conn = sqlite3.connect("weather_data.db") # Connect to the local DB
c = conn.cursor()

# Insert API data into table
c.execute("INSERT INTO Weather VALUES (?,?,?)", (city, temperature, weather_desc))
conn.commit() # Save changes

# Verify stored data
c.execute("SELECT * FROM Weather")
print(c.fetchall())
conn.close()
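
Since pandas is already installed, the same table can also be read straight into a DataFrame, which previews the analysis step later on:

import sqlite3

import pandas as pd

conn = sqlite3.connect("weather_data.db")
df_local = pd.read_sql("SELECT * FROM Weather", conn) # Query directly into a DataFrame
print(df_local)
conn.close()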

Step 6: Connect to Cloud SQL using Python

Python code to connect and insert/fetch cloud data (with comments):

from sqlalchemy import create_engine # For DB connection
import pandas as pd

user = 'your_user'
password = 'your_password'
host = 'YOUR_PUBLIC_IP'
db = 'weatherdb'

# Create connection engine
engine = create_engine(f'mysql+pymysql://{user}:{password}@{host}/{db}')

# Create dataframe to send to cloud
df = pd.DataFrame({'City':[city], 'Temperature':[temperature], 'Description':[weather_desc]})
df.to_sql('Weather', con=engine, if_exists='append', index=False) # Insert data

# Fetch data from cloud
df_cloud = pd.read_sql("SELECT * FROM Weather", con=engine)
print(df_cloud)
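
Hard-coding credentials is acceptable for a quick test, but even in training it is better to read them from environment variables. A sketch, where the variable names DB_USER, DB_PASSWORD, and DB_HOST are just a suggested convention:

import os

from sqlalchemy import create_engine

# Set these in your shell first, e.g.:
#   export DB_USER=your_user DB_PASSWORD=your_password DB_HOST=YOUR_PUBLIC_IP
user = os.environ["DB_USER"]
password = os.environ["DB_PASSWORD"]
host = os.environ["DB_HOST"]

engine = create_engine(f"mysql+pymysql://{user}:{password}@{host}/weatherdb")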

Step 7: Practical Example: API → Local DB → Cloud DB → Analysis

Action breakdown:

  • Local Laptop: Python calls the API, stores the result in SQLite, and prepares a dataframe.
  • Cloud: the dataframe is uploaded to Cloud SQL, then queried to fetch results.
  • Local Analysis: pandas reads the cloud data, calculates statistics, and prints the results.

Python code with comments:

# Fetch latest data from Cloud
df = pd.read_sql("SELECT * FROM Weather", con=engine)
print("Data from Cloud:")
print(df)

# Local analysis
avg_temp = df['Temperature'].mean() # Calculate average temperature
print(f"Average Temperature: {avg_temp}°C")

Expected Output:

Data from Cloud:
     City  Temperature Description
0  Dallas         34.0   clear sky
Average Temperature: 34.0°C
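
Once the table holds readings for more than one city, the same DataFrame supports richer statistics. For example, a per-city summary (assuming additional rows have been inserted):

# Per-city temperature summary across all stored readings
stats = df.groupby('City')['Temperature'].agg(['mean', 'min', 'max'])
print(stats)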

Step 8: Best Practices

  • Store API keys securely (environment variables or GCP Secret Manager; see the sketch after this list)
  • Always check response.status_code for API success
  • Use parameterized queries to prevent SQL injection
  • Use pandas to clean and transform data before inserting it into the database
  • For training, personal Gmail accounts are fine; production requires proper IAM setup
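
For the Secret Manager option, Google's Python client (installed with pip install google-cloud-secret-manager) can fetch a stored key at runtime. A minimal sketch; the project ID and secret name below are placeholders you would replace with your own:

from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()
name = "projects/YOUR_PROJECT_ID/secrets/openweather-api-key/versions/latest"
response = client.access_secret_version(request={"name": name})
api_key = response.payload.data.decode("UTF-8") # The stored API key
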
✔ End of Day 9 – You now know how to fetch data from APIs, store it locally and in cloud databases, connect to GCP, and analyze data with Python. The full setup and every step are covered with comments and explanations.