# 📡 **LWM: Large Wireless Model**

**[🚀 Try the Interactive Demo on Hugging Face!](https://huggingface.co/spaces/sadjadalikhani/LWM-Interactive-Demo)**

Welcome to the **LWM** (Large Wireless Model) repository! This project hosts a pre-trained model designed to process and extract features from wireless communication datasets, specifically the **DeepMIMO** dataset. Follow the instructions below to set up your environment, install the required packages, clone the repository, load the data, and perform inference with LWM.

---

## 🛠 **How to Use**

### 1. **Install Conda or Mamba (via Miniforge)**

First, install a package manager such as **Conda** or **Mamba** (a faster alternative) to manage your Python environments and packages.

#### **Option A: Install Conda**

If you prefer **Conda**, download and install **Anaconda** or **Miniconda**:

- **Anaconda** includes a full scientific package suite but is larger in size. Download it [here](https://www.anaconda.com/products/distribution).
- **Miniconda** is a lightweight version that includes only Conda and Python. Download it [here](https://docs.conda.io/en/latest/miniconda.html).

#### **Option B: Install Mamba (via Miniforge)**

**Mamba** is a much faster alternative to Conda. You can install it via **Miniforge**:

- **Miniforge** is a minimal, community-driven Conda installer that ships with **Mamba**. Download it [here](https://github.com/conda-forge/miniforge/releases/latest).

After installation, you can use either `conda` or `mamba` for environment management; the commands are identical except for replacing `conda` with `mamba`.

---

### 2. **Create a New Environment**

Once Conda or Mamba is installed, follow these steps to create a new environment and install the necessary packages.

#### **Step 1: Create a new environment**

Create a new environment called `lwm_env` (or any other name) with Python 3.12 or your required version:

```bash
# If you're using Conda:
conda create -n lwm_env python=3.12

# If you're using Mamba:
mamba create -n lwm_env python=3.12
```
#### **Step 2: Activate the environment**

Activate the environment you just created:

```bash
# For both Conda and Mamba:
conda activate lwm_env
```

---

#### **Step 3: Install Required Packages**

Install the necessary packages inside your new environment:

```bash
# If you're using Conda:
conda install pytorch torchvision torchaudio -c pytorch
pip install -r requirements.txt

# If you're using Mamba:
mamba install pytorch torchvision torchaudio -c pytorch
pip install -r requirements.txt
```

> **Note:** The package requirements for the project are as follows:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import numpy as np
import pandas as pd
import DeepMIMOv3
import warnings
from tqdm import tqdm
from datetime import datetime
from torch.utils.data import Dataset, DataLoader
import matplotlib.pyplot as plt
import time
```
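
As an optional sanity check (not part of the original guide), you can confirm the core dependencies and CUDA visibility before moving on:

```python
# Optional environment check: prints core package versions and CUDA status.
import torch
import numpy as np
import pandas as pd

print(f"PyTorch: {torch.__version__}")
print(f"NumPy: {np.__version__}")
print(f"pandas: {pd.__version__}")
print(f"CUDA available: {torch.cuda.is_available()}")
```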

---

### 3. **Required Functions to Clone Datasets**

The following helper functions clone specific dataset scenario folders using Git sparse checkout, so only the scenarios you request are downloaded:

```python
import subprocess
import os

# Function to clone a specific dataset scenario folder
def clone_dataset_scenario(scenario_name, repo_url, model_repo_dir="./LWM", scenarios_dir="scenarios"):
    # Create the scenarios directory if it doesn't exist
    scenarios_path = os.path.join(model_repo_dir, scenarios_dir)
    if not os.path.exists(scenarios_path):
        os.makedirs(scenarios_path)

    scenario_path = os.path.join(scenarios_path, scenario_name)

    # Initialize sparse checkout for the dataset repository
    if not os.path.exists(os.path.join(scenarios_path, ".git")):
        print(f"Initializing sparse checkout in {scenarios_path}...")
        subprocess.run(["git", "clone", "--sparse", repo_url, "."], cwd=scenarios_path, check=True)
        subprocess.run(["git", "sparse-checkout", "init", "--cone"], cwd=scenarios_path, check=True)
        subprocess.run(["git", "lfs", "install"], cwd=scenarios_path, check=True)  # Install Git LFS if needed

    # Add the requested scenario folder to sparse checkout
    print(f"Adding {scenario_name} to sparse checkout...")
    subprocess.run(["git", "sparse-checkout", "add", scenario_name], cwd=scenarios_path, check=True)

    # Pull large files if needed (using Git LFS)
    subprocess.run(["git", "lfs", "pull"], cwd=scenarios_path, check=True)

    print(f"Successfully cloned {scenario_name} into {scenarios_path}.")

# Function to clone multiple dataset scenarios
def clone_dataset_scenarios(selected_scenario_names, dataset_repo_url, model_repo_dir):
    for scenario_name in selected_scenario_names:
        clone_dataset_scenario(scenario_name, dataset_repo_url, model_repo_dir)
```
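
For example, to fetch a single scenario with the helper above (a hypothetical one-off call; the repository URL is the dataset repo used in step 5):

```python
# Hypothetical standalone usage of clone_dataset_scenario defined above.
clone_dataset_scenario(
    scenario_name="city_18_denver",
    repo_url="https://huggingface.co/datasets/sadjadalikhani/lwm",
    model_repo_dir="./LWM",
)
```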

---

### 4. **Clone the Model**

Next, clone the **LWM** model from its Hugging Face Git repository. This downloads all the necessary model files to your local system.

```python
# Step 1: Clone the model repository (if not already cloned)
model_repo_url = "https://huggingface.co/sadjadalikhani/lwm"
model_repo_dir = "./LWM"

if not os.path.exists(model_repo_dir):
    print(f"Cloning model repository from {model_repo_url}...")
    subprocess.run(["git", "clone", model_repo_url, model_repo_dir], check=True)
```
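
Alternatively (an option not covered in the original guide), if the `huggingface_hub` package is installed, the same repository can be fetched without Git:

```python
# Assumes the huggingface_hub package is available (pip install huggingface_hub).
from huggingface_hub import snapshot_download

snapshot_download(repo_id="sadjadalikhani/lwm", local_dir="./LWM")
```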
### 5. **Clone the Desired Datasets**

Before proceeding with tokenization and data processing, you must first load the **DeepMIMO** dataset, or any dataset generated using the operational settings outlined below. The table below lists the available datasets, with links for further details:

📊 **Dataset Overview**

| Dataset | City | Number of Users | DeepMIMO Page |
|----------------|----------------------|------------------------|------------------------------------------------------------------------------------------------------------|
| Dataset 0 | 🌆 Denver | 1354 | [DeepMIMO City Scenario 18](https://www.deepmimo.net/scenarios/deepmimo-city-scenario18/) |
| Dataset 1 | 🏙️ Indianapolis | 3248 | [DeepMIMO City Scenario 15](https://www.deepmimo.net/scenarios/deepmimo-city-scenario15/) |
| Dataset 2 | 🌇 Oklahoma | 3455 | [DeepMIMO City Scenario 19](https://www.deepmimo.net/scenarios/deepmimo-city-scenario19/) |
| Dataset 3 | 🌆 Fort Worth | 1902 | [DeepMIMO City Scenario 12](https://www.deepmimo.net/scenarios/deepmimo-city-scenario12/) |
| Dataset 4 | 🌉 Santa Clara | 2689 | [DeepMIMO City Scenario 11](https://www.deepmimo.net/scenarios/deepmimo-city-scenario11/) |
| Dataset 5 | 🌅 San Diego | 2192 | [DeepMIMO City Scenario 7](https://www.deepmimo.net/scenarios/deepmimo-city-scenario7/) |

Note that none of these six datasets were used during LWM pre-training; the high-quality embeddings they yield demonstrate LWM's robust generalization capabilities rather than overfitting.

#### **Operational Settings**

- **Antennas at BS**: 32
- **Antennas at UEs**: 1
- **Subcarriers**: 32
- **Paths**: 20
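
To make these settings concrete, here is a minimal sketch of the channel dimensions they imply (an illustration only, with an assumed array layout; the actual preprocessing is handled by `input_preprocess.tokenizer`):

```python
import numpy as np

# Assumed layout for illustration: one complex channel matrix per user,
# 32 BS antennas x 32 subcarriers, with single-antenna UEs.
n_users = 1354          # e.g., the Denver scenario from the table above
n_bs_antennas = 32
n_subcarriers = 32

channels = np.zeros((n_users, n_bs_antennas, n_subcarriers), dtype=np.complex64)
print(channels.shape)   # (1354, 32, 32)
```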

```python
# Step 2: Clone specific dataset scenario folder(s) inside the "scenarios" folder
dataset_repo_url = "https://huggingface.co/datasets/sadjadalikhani/lwm"  # Base URL for dataset repo
scenario_names = np.array(["city_18_denver",
                           "city_15_indianapolis",
                           "city_19_oklahoma",
                           "city_12_fortworth",
                           "city_11_santaclara",
                           "city_7_sandiego"])

scenario_idxs = np.array([3])
selected_scenario_names = scenario_names[scenario_idxs]

# Clone the requested scenario folders (this will clone every time)
clone_dataset_scenarios(selected_scenario_names, dataset_repo_url, model_repo_dir)
```

---

### 6. **Change the Working Directory to the LWM Folder**

```python
if os.path.exists(model_repo_dir):
    os.chdir(model_repo_dir)
    print(f"Changed working directory to {os.getcwd()}")
else:
    print(f"Directory {model_repo_dir} does not exist. Please check if the repository is cloned properly.")
```

---

### 7. **Tokenize and Load the Model**

```python
from input_preprocess import tokenizer
from lwm_model import lwm
import torch

preprocessed_chs = tokenizer(selected_scenario_names=selected_scenario_names,
                             manual_data=None,
                             gen_raw=True)

device = 'cuda' if torch.cuda.is_available() else 'cpu'
print(f"Loading the LWM model on {device}...")
model = lwm.from_pretrained(device=device)
```
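
As a quick check that the model loaded (a generic PyTorch inspection, assuming `lwm.from_pretrained` returns a standard `torch.nn.Module`), you can count its parameters:

```python
# Generic PyTorch check; assumes `model` is a torch.nn.Module.
n_params = sum(p.numel() for p in model.parameters())
print(f"LWM loaded with {n_params:,} parameters on {device}.")
```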
---

### 8. **Perform Inference**

After tokenizing the data and loading the model, you're ready to perform inference with LWM:

```python
from inference import lwm_inference, create_raw_dataset

input_types = ['cls_emb', 'channel_emb', 'raw']
selected_input_type = input_types[0]

if selected_input_type in ['cls_emb', 'channel_emb']:
    dataset = lwm_inference(preprocessed_chs, selected_input_type, model, device)
else:
    dataset = create_raw_dataset(preprocessed_chs, device)
```
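
The returned `dataset` can then feed any downstream pipeline. As a minimal sketch (assuming `dataset` is a tensor or another map-style object that `torch.utils.data.DataLoader` accepts), you could batch the embeddings like this:

```python
from torch.utils.data import DataLoader

# Assumes `dataset` supports len() and indexing (e.g., a tensor of embeddings).
loader = DataLoader(dataset, batch_size=64, shuffle=False)
for batch in loader:
    print(batch.shape)  # Inspect the dimensions of the first batch
    break
```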
---

### 9. **Explore the Interactive Demo**

If you'd like to explore **LWM** interactively, check out the demo hosted on Hugging Face Spaces:

[**Try the Interactive Demo!**](https://huggingface.co/spaces/sadjadalikhani/LWM-Interactive-Demo)

---

Now you're ready to dive into the world of the **Large Wireless Model (LWM)**, process wireless communication datasets, and extract high-quality embeddings to fuel your research or application!