import streamlit as st
import numpy as np
import pandas as pd
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import SGDRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score
# Optional: used only by the commented-out model-architecture plot near the end
from tensorflow.keras.utils import plot_model
from PIL import Image
import os
# Main function to run the Streamlit app
def main():
# Title of the app
st.subheader("Understanding Deep Learning")
# Expander for Deep Learning Introduction
with st.expander("➡️ What is Deep Learning?"):
st.write("""
**Definition (in simple terms):**
Deep learning is a type of machine learning that uses artificial neural networks to learn from large amounts of data. Its design is loosely inspired by how the human brain processes information, and it lets machines learn decision rules directly from that data.
**How it is different from Classical Machine Learning:**
- **Classical ML:** Relies on manual `feature engineering` - selecting and transforming input columns by hand. It uses algorithms like decision trees, SVM, or linear regression.
- **Deep Learning:** Learns features from the data automatically, with no explicit feature engineering. It uses layers of interconnected nodes (neurons) to learn complex patterns.
**Examples of Success Stories:**
- **Image Recognition:** Convolutional Neural Networks (CNNs) powering face recognition on social media and medical imaging diagnostics.
- **Natural Language Processing:** Transformers like `GPT`, `BERT`, and `T5`, which are used in chatbots, translation apps, and content generation.
- **Autonomous Vehicles:** Deep learning models help self-driving cars make decisions like braking, steering, and avoiding obstacles.
**Is `Generative` AI the same as Deep Learning?**
- **Generative AI** is a subset of deep learning that focuses on generating new content, like images, text, or music. It uses models like GANs (Generative Adversarial Networks) and VAEs (Variational Autoencoders). While all generative AI is deep learning, not all deep learning is generative AI.
""")
with st.expander("➑️ How Features Are Learned: ML vs DL (Image Data Example)"):
st.markdown("###### 🎯 Task: Classify handwritten digits (0–9)")
st.markdown("###### πŸ” Classical ML Approach")
st.markdown("""
In classical ML, we **extract features manually** from each image.
**Example features:**
- `pixel_mean`: Average pixel brightness
- `num_white_pixels`: Count of bright pixels
- `aspect_ratio`: Width-to-height of the digit
- `num_edge_pixels`: From an edge detection filter
These features are then used in algorithms like SVM or decision trees.
""")
st.markdown("###### 🧠 Deep Learning Approach")
st.markdown("""
In Deep Learning, we **skip manual features**. Instead, the model receives the raw pixel matrix (like 28x28 for MNIST).
- The model **learns features** automatically: curves, loops, corners, even digit shapes.
- Layers in a CNN extract **increasingly abstract patterns**.
> Think of it as: *We provide raw material, and the model forges patterns on its own.*
""")
st.markdown("###### 🧩 Analogy")
st.markdown("""
| Task | Classical ML | Deep Learning |
|------|---------------|----------------|
| Image of a digit | Engineer features manually from pixels | Feed raw pixels; the model learns patterns |
| Your role | Handcrafting each part of a watch | Supplying raw metal; the model forges its own design |
""")
# Expander for How Linear Regression Expands to Neural Networks
with st.expander("➑️ How is Simple Linear Regression Expanded to Neural Network?"):
st.write("""
**Conceptually Expanding Linear Regression to Neural Network:**
- **Linear Regression:** Models a relationship between input (X) and output (Y) using a straight line (Y = mX + b). It's the simplest case of predicting continuous values from one or more features.
- **Neural Networks:** Expand this idea by stacking layers of neurons. Each neuron computes its own weighted sum (its own small "linear regression"), passes the result through an activation function (like ReLU or sigmoid), and sends it on to the next layer.
**Illustration:**
- For **Linear Regression**, think of a single equation `y = mX + b`.
- For a **Neural Network**, imagine adding layers of neurons. Each neuron applies a weighted sum, then an activation function. The model learns through backpropagation, adjusting weights to minimize the error over time.
""")
nn_image_url = "https://raw.githubusercontent.com/gridflowai/gridflowAI-datasets-icons/ab018053d82052aa9b7ce1b0e9a76410663af734/AI-icons-images/NN.png"
neuron_image_url = "https://raw.githubusercontent.com/gridflowai/gridflowAI-datasets-icons/ab018053d82052aa9b7ce1b0e9a76410663af734/AI-icons-images/neuron.png"
st.image(neuron_image_url, caption="Linear regression - single neuron unit", use_column_width='auto')
st.image(nn_image_url, caption="Neural Network Illustration", use_column_width='auto')
# Expander for Perceptron Regression on California Housing Dataset
with st.expander("➑️ Linear Regression on California Housing Dataset"):
st.write("""
A linear regression model (`SGDRegressor`, effectively a single neuron with no activation) is applied to the California Housing dataset. We will train it and evaluate it using several regression metrics.
""")
# Load the California housing dataset
data = fetch_california_housing()
df = pd.DataFrame(data.data, columns=data.feature_names)
df['target'] = data.target
# Split the data into features and target
X = df.drop('target', axis=1)
y = df['target']
# Split the dataset into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# Standardize the features using StandardScaler
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
# Train a linear regression model with stochastic gradient descent
# (a single linear neuron - the simplest possible "network")
model = SGDRegressor(max_iter=1000, tol=1e-3, random_state=42)  # fixed seed for reproducible metrics
model.fit(X_train, y_train)
# Make predictions
y_pred = model.predict(X_test)
# Display the regression metrics
mse = mean_squared_error(y_test, y_pred)
mae = mean_absolute_error(y_test, y_pred)
r2 = r2_score(y_test, y_pred)
st.write(f"Mean Squared Error (MSE): {mse:.2f}")
st.write(f"Mean Absolute Error (MAE): {mae:.2f}")
st.write(f"R-squared (R2): {r2:.2f}")
# Expander for Improving the Model with Deep Learning
with st.expander("➑️ Improve the Model Using Deep Learning (Keras)"):
st.write("""
We can improve on the linear model by using a deeper neural network with multiple neurons and layers. Here we use a Keras Sequential model for a better regression result.
""")
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
# Build the Keras model: two hidden ReLU layers, then one linear output neuron
deep_model = Sequential([
    Dense(64, input_dim=X_train.shape[1], activation='relu'),  # hidden layer 1
    Dense(32, activation='relu'),                              # hidden layer 2
    Dense(1)                                                   # linear output for regression
])
deep_model.compile(optimizer='adam', loss='mse')
# Train the deep learning model
deep_model.fit(X_train, y_train, epochs=50, batch_size=200, verbose=1)
# Make predictions using the deep model (flatten the (n, 1) output to a 1-D vector)
y_pred_deep = deep_model.predict(X_test).flatten()
# Evaluate the deep learning model
mse_deep = mean_squared_error(y_test, y_pred_deep)
mae_deep = mean_absolute_error(y_test, y_pred_deep)
r2_deep = r2_score(y_test, y_pred_deep)
st.write(f"Mean Squared Error (Deep Model MSE): {mse_deep:.2f}")
st.write(f"Mean Absolute Error (Deep Model MAE): {mae_deep:.2f}")
st.write(f"R-squared (Deep Model R2): {r2_deep:.2f}")
# Show the model summary
# st.markdown("**Deep Learning Model Summary**")
# plot_path = "deep_model_plot.png"
# plot_model(deep_model, to_file=plot_path, show_shapes=True, show_layer_names=True)
# # Display the plot in Streamlit
# if os.path.exists(plot_path):
# image = Image.open(plot_path)
# st.image(image, caption="Keras Sequential Model Architecture", use_column_width='auto')
# else:
# st.warning("Model plot could not be generated. Ensure pydot and graphviz are installed.")
with st.expander("πŸ“Š Model Performance Comparison and Key Improvements"):
st.markdown("###### πŸ”‘ Key Notes on Model Improvement")
st.markdown("""
###### πŸ“‰ 1. **Error Reduction**
- **MSE dropped** from `0.78 β†’ 0.29`
β†’ Deep model makes **fewer big mistakes** (squared errors).
- **MAE dropped** from `0.58 β†’ 0.38`
β†’ Predictions are **closer to true values on average**.
###### πŸ“ˆ 2. **Higher Accuracy (RΒ² Score)**
- **RΒ² improved** from `0.41 β†’ 0.78`
β†’ Deep model explains **78% of the variation** in house prices
β†’ vs. only 41% by the Linear regression.
###### 🧠 3. **Why the Improvement?**
- **Perceptron** learns only **linear patterns**.
- **Deep Learning model** has multiple layers + nonlinear activations (`relu`), so it:
- Learns **complex, non-linear relationships**
- Detects **feature interactions** better
###### πŸ–ΌοΈ 4. **Visual Analogy**
- Think of fitting a **straight line** (Linear regression) vs. a **flexible curve** (Deep Net).
- The curve adapts better to the **true shape of the data**.
""")
st.success("Deep learning brings a significant performance boost by capturing more complex relationships in the data.")
if __name__ == "__main__":
main()