---
title: Improvisation Lab
emoji: 🎵
python_version: 3.11
colorFrom: blue
colorTo: purple
sdk: gradio
sdk_version: 5.7.1
app_file: main.py
pinned: false
license: mit
---
# Improvisation Lab

A Python package for practicing musical improvisation. It generates natural-sounding melodic phrases that follow chord progressions while respecting music-theory rules, and provides real-time pitch detection for practice feedback.

Improvisation Lab Demo

https://github.com/user-attachments/assets/a4207f7e-166c-4f50-9c19-5bf5269fd04e


## Features

- Generate melodic phrases based on scales and chord progressions
- Support for multiple scale types:
  - Major
  - Natural minor
  - Harmonic minor
  - Diminished
- Support for various chord types:
  - Major 7th (maj7)
  - Minor 7th (min7)
  - Dominant 7th (dom7)
  - Half-diminished (min7b5)
  - Diminished 7th (dim7)
- Intelligent note selection based on:
  - Chord tones vs non-chord tones
  - Scale degrees
  - Previous note context
- Real-time pitch detection with FCPE (Fast Context-aware Pitch Estimation)
- Web-based and direct microphone input support

## Prerequisites

- Python 3.11 or higher
- A working microphone
- [Poetry](https://python-poetry.org/) for dependency management

## Installation
```bash
make install
```

## Quick Start
1. Create your configuration file:

```bash
cp config.yml.example config.yml
```

2. (Optional) Edit `config.yml` to customize settings such as audio parameters and song selection

3. Run the app to start melody generation and playback (the web interface is the default):

```bash
make run
```

- To run the console interface, use:

```bash
poetry run python main.py --app_type console
```

4. Follow the displayed melody phrases and sing along with real-time feedback

### Configuration

The application can be customized through `config.yml` with the following options:

#### Audio Settings
- `sample_rate`: Audio sampling rate (default: 44100 Hz)
- `buffer_duration`: Duration of the audio processing buffer (default: 0.2 seconds)
- `note_duration`: How long each note is displayed during practice (default: 3 seconds)
- `pitch_detector`: Settings for the pitch detection algorithm
  - `hop_length`: Hop length (default: 512)
  - `threshold`: Detection threshold (default: 0.006)
  - `f0_min`: Minimum detectable frequency (default: 80 Hz)
  - `f0_max`: Maximum detectable frequency (default: 880 Hz)
  - `device`: Device to run detection on (default: "cpu")

#### Song Selection
- `selected_song`: Name of the song to practice
- `chord_progressions`: Dictionary of songs and their progressions
  - Format: `[scale_root, scale_type, chord_root, chord_type, duration]` (see the sketch after the example below)
  - Example:
    ```yaml
    fly_me_to_the_moon:
      - ["A", "natural_minor", "A", "min7", 4]
      - ["A", "natural_minor", "D", "min7", 4]
      - ["C", "major", "G", "dom7", 4]
    ```
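
Each entry is a five-field record in the order given above. As a purely illustrative sketch of how such entries could be unpacked (the `ProgressionBar` dataclass and `parse_progression` helper are hypothetical, not part of the package):

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ProgressionBar:
    """One progression entry; field names follow the documented order."""

    scale_root: str  # e.g. "A"
    scale_type: str  # e.g. "natural_minor"
    chord_root: str  # e.g. "D"
    chord_type: str  # e.g. "min7"
    duration: int    # e.g. 4


def parse_progression(entries: List[list]) -> List[ProgressionBar]:
    """Unpack raw config entries into typed records."""
    return [ProgressionBar(*entry) for entry in entries]


bars = parse_progression([
    ["A", "natural_minor", "A", "min7", 4],
    ["A", "natural_minor", "D", "min7", 4],
    ["C", "major", "G", "dom7", 4],
])
print(bars[1].chord_root, bars[1].chord_type)  # -> D min7
```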


## How It Works

### Melody Generation
The melody generation follows these principles (a simplified sketch follows the list):
1. Notes are selected based on their relationship to the current chord and scale
2. Chord tones have more freedom in movement
3. Non-chord tones are restricted to moving to adjacent scale notes
4. Phrases are connected naturally by considering the previous note
5. All generated notes stay within the specified scale
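
The following is a minimal sketch of how such a rule set could be applied, assuming `scale_notes` is an ordered list of note names in the current scale and `chord_tones` is the subset belonging to the current chord. It is a simplification for illustration, not the package's actual algorithm:

```python
import random
from typing import List


def next_note(current: str, scale_notes: List[str], chord_tones: List[str]) -> str:
    """Pick a next note under the simplified rules above.

    Chord tones may leap to any other chord tone or step to a scale
    neighbour; non-chord tones may only step to an adjacent scale note.
    Every candidate stays inside `scale_notes`.
    """
    i = scale_notes.index(current)
    neighbours = [scale_notes[j] for j in (i - 1, i + 1) if 0 <= j < len(scale_notes)]
    if current in chord_tones:
        candidates = set(neighbours) | set(chord_tones)
        candidates.discard(current)
    else:
        candidates = set(neighbours)
    return random.choice(sorted(candidates))


# Toy example: A natural minor scale over an A min7 chord.
scale = ["A", "B", "C", "D", "E", "F", "G"]
chord = ["A", "C", "E", "G"]
phrase = ["A"]
for _ in range(7):
    phrase.append(next_note(phrase[-1], scale, chord))
print(phrase)
```

Chord tones get the wider candidate set, while non-chord tones resolve stepwise, which keeps phrases smooth and inside the scale.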

### Real-time Feedback
Pitch Detection Demo:

https://github.com/user-attachments/assets/fd9e6e3f-85f1-42be-a6c8-b757da478854

The application provides real-time feedback by:
1. Capturing audio from your microphone
2. Detecting the pitch using FCPE (Fast Context-aware Pitch Estimation)
3. Converting the detected frequency to the nearest musical note (sketched below)
4. Displaying both the target note and your sung note in real-time
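
Step 3 amounts to mapping a detected frequency onto the nearest semitone of the equal-tempered scale (A4 = 440 Hz, MIDI note 69). Here is a minimal, self-contained sketch of that conversion, independent of the package's own implementation:

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]


def frequency_to_note(freq_hz: float, a4_hz: float = 440.0) -> str:
    """Map a frequency to the nearest equal-tempered note name, e.g. 'A4'."""
    if freq_hz <= 0:
        raise ValueError("frequency must be positive")
    # MIDI note number: A4 (440 Hz) is 69; 12 semitones per octave.
    midi = round(69 + 12 * math.log2(freq_hz / a4_hz))
    name = NOTE_NAMES[midi % 12]
    octave = midi // 12 - 1
    return f"{name}{octave}"


print(frequency_to_note(261.63))  # C4
print(frequency_to_note(446.0))   # A4 (close enough to 440 Hz)
```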

## Development
### Running Lint
```bash
make lint
```

### Running Format
```bash
make format
```

### Running Tests
```bash
make test
```

## License

MIT License

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.