Dataset Viewer

| Column | Type | Range |
|---|---|---|
| repo | string | length 8–43 |
| pull_number | int64 | 2–2.24k |
| instance_id | string | length 13–47 |
| issue_numbers | string | length 5–14 |
| base_commit | string | length 40 |
| patch | string | length 395–88.1M |
| test_patch | string | length 362–211k |
| problem_statement | string | length 12–9.74k |
| hints_text | string | length 0–4.33k |
| created_at | timestamp[us] | 2013-06-20 21:02:34 – 2025-04-08 12:39:20 |
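Each row below follows this schema. For reference, a minimal sketch of loading such a dataset with the Hugging Face `datasets` library; the dataset path is a hypothetical placeholder, since the viewer does not name it:

```python
# Minimal loading sketch; "org/dataset-name" is a hypothetical placeholder,
# not the actual dataset path behind this viewer.
from datasets import load_dataset

ds = load_dataset("org/dataset-name", split="train")
row = ds[0]
print(row["repo"], row["pull_number"], row["instance_id"])
# "patch" and "test_patch" hold unified diffs as plain strings:
print(row["patch"][:200])
```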
repo: olirice/flupy | pull_number: 31 | instance_id: olirice__flupy-31 | issue_numbers: ['30'] | base_commit: 326e5635ed02012ac5a9f8909b0716154588aa40

patch:
diff --git a/README.md b/README.md
index 90bc4d8..3577baf 100644
--- a/README.md
+++ b/README.md
@@ -2,7 +2,6 @@
<p>
-<a href="https://github.com/olirice/flupy/actions"><img src="https://github.com/olirice/flupy/workflows/Tests/badge.svg" alt="Tests" height="18"></a>
<a href="https://flupy.readthedocs.io/en/latest/?badge=latest"><img src="https://readthedocs.org/projects/flupy/badge/?version=latest" alt="Tests" height="18"></a>
<a href="https://codecov.io/gh/olirice/flupy"><img src="https://codecov.io/gh/olirice/flupy/branch/master/graph/badge.svg" height="18"></a>
<a href="https://github.com/psf/black">
diff --git a/src/flupy/fluent.py b/src/flupy/fluent.py
index 96257eb..24b32a1 100644
--- a/src/flupy/fluent.py
+++ b/src/flupy/fluent.py
@@ -6,12 +6,9 @@
from itertools import dropwhile, groupby, islice, product, takewhile, tee, zip_longest
from random import sample
from typing import (
- TYPE_CHECKING,
Any,
Callable,
Collection,
- Container,
- ContextManager,
Deque,
Generator,
Generic,
@@ -20,7 +17,6 @@
Iterator,
List,
Optional,
- Sequence,
Set,
Tuple,
Type,
@@ -546,7 +542,7 @@ def zip(
"Fluent[Tuple[T, _T1, _T2, _T3]]",
]:
"""Yields tuples containing the i-th element from the i-th
- argument in the chainable, and the iterable
+ argument in the instance, and the iterable
>>> flu(range(5)).zip(range(3, 0, -1)).to_list()
[(0, 3), (1, 2), (2, 1)]
@@ -558,7 +554,7 @@ def zip(
def zip_longest(self, *iterable: Iterable[_T1], fill_value: Any = None) -> "Fluent[Tuple[T, ...]]":
"""Yields tuples containing the i-th element from the i-th
- argument in the chainable, and the iterable
+ argument in the instance, and the iterable
Iteration continues until the longest iterable is exhaused.
If iterables are uneven in length, missing values are filled in with fill value
@@ -572,11 +568,11 @@ def zip_longest(self, *iterable: Iterable[_T1], fill_value: Any = None) -> "Flue
return Fluent(zip_longest(self, *iterable, fillvalue=fill_value))
def enumerate(self, start: int = 0) -> "Fluent[Tuple[int, T]]":
- """Yields tuples from the chainable where the first element
+ """Yields tuples from the instance where the first element
is a count from initial value *start*.
- >>> flu(range(5)).zip_longest(range(3, 0, -1)).to_list()
- [(0, 3), (1, 2), (2, 1), (3, None), (4, None)]
+ >>> flu([3,4,5]).enumerate().to_list()
+ [(0, 3), (1, 4), (2, 5)]
"""
return Fluent(enumerate(self, start=start))

test_patch:
diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml
index a3fb32b..ca6858e 100644
--- a/.github/workflows/test.yml
+++ b/.github/workflows/test.yml
@@ -8,7 +8,7 @@ jobs:
strategy:
matrix:
- python-version: ['3.6', '3.7', '3.8', '3.9']
+ python-version: ['3.7', '3.8', '3.9', '3.10', '3.11']
steps:

problem_statement: Doc string for enumerate
Doc string for enumerate shows "zip_longest".

created_at: 2023-04-25T16:21:26
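The fix above swaps the copy-pasted `zip_longest` example for a correct `enumerate` one. A runnable check of both behaviors, assuming `flupy` is installed; the expected outputs are taken directly from the patch:

```python
from flupy import flu

# Corrected enumerate docstring example: tuples of (count, element).
assert flu([3, 4, 5]).enumerate().to_list() == [(0, 3), (1, 4), (2, 5)]

# The example it replaced actually documented zip_longest, which pads
# the shorter iterable with the fill value (None by default):
assert flu(range(5)).zip_longest(range(3, 0, -1)).to_list() == [
    (0, 3), (1, 2), (2, 1), (3, None), (4, None),
]
```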
---

repo: Colin-b/httpx_auth | pull_number: 103 | instance_id: Colin-b__httpx_auth-103 | issue_numbers: ['102'] | base_commit: 24a1cf2d1978b8d813be0cda05ea856e801bb4f2

patch:
diff --git a/.github/workflows/release.yml b/.github/workflows/release.yml
index 94b6004..5338b25 100644
--- a/.github/workflows/release.yml
+++ b/.github/workflows/release.yml
@@ -15,7 +15,7 @@ jobs:
- name: Set up Python
uses: actions/setup-python@v5
with:
- python-version: '3.12'
+ python-version: '3.13'
- name: Create packages
run: |
python -m pip install build
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 67a100a..3541ca5 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -5,6 +5,11 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [Unreleased]
+### Changed
+- Requires [`httpx`](https://www.python-httpx.org)==0.28.\*
+
+### Added
+- Explicit support for python `3.13`.
## [0.22.0] - 2024-03-02
### Changed
diff --git a/README.md b/README.md
index da0bd5a..8eac8e2 100644
--- a/README.md
+++ b/README.md
@@ -377,7 +377,7 @@ Note:
| `early_expiry` | Number of seconds before actual token expiry where token will be considered as expired. Used to ensure token will not expire between the time of retrieval and the time the request reaches the actual server. Set it to 0 to deactivate this feature and use the same token until actual expiry. | Optional | 30.0 |
| `client` | `httpx.Client` instance that will be used to request the token. Use it to provide a custom proxying rule for instance. | Optional | |
-Any other parameter will be put as body parameters in the token URL.
+Any other parameter will be put as body parameters in the token URL.
### Client Credentials flow
diff --git a/pyproject.toml b/pyproject.toml
index 5c7a507..9f68ced 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -27,10 +27,11 @@ classifiers=[
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
+ "Programming Language :: Python :: 3.13",
"Topic :: Software Development :: Build Tools",
]
dependencies = [
- "httpx==0.27.*",
+ "httpx==0.28.*",
]
dynamic = ["version"]
@@ -45,13 +46,13 @@ testing = [
# Used to generate test tokens
"pyjwt==2.*",
# Used to mock httpx
- "pytest_httpx==0.32.*",
+ "pytest_httpx==0.35.*",
# Used to mock date and time
"time-machine==2.*",
# Used to check coverage
- "pytest-cov==5.*",
+ "pytest-cov==6.*",
# Used to run async tests
- "pytest-asyncio==0.24.*",
+ "pytest-asyncio==0.25.*",
]
[tool.setuptools.dynamic]

test_patch:
diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml
index 6eedec4..098c92a 100644
--- a/.github/workflows/test.yml
+++ b/.github/workflows/test.yml
@@ -8,7 +8,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
- python-version: ['3.9', '3.10', '3.11', '3.12', '3.13.0-rc.2']
+ python-version: ['3.9', '3.10', '3.11', '3.12', '3.13']
steps:
- uses: actions/checkout@v4
diff --git a/tests/aws_signature_v4/test_aws4auth_async.py b/tests/aws_signature_v4/test_aws4auth_async.py
index 2868d7b..99eb9bb 100644
--- a/tests/aws_signature_v4/test_aws4auth_async.py
+++ b/tests/aws_signature_v4/test_aws4auth_async.py
@@ -47,8 +47,8 @@ async def test_aws_auth_with_content_in_request(httpx_mock: HTTPXMock):
method="POST",
match_json=[{"key": "value"}],
match_headers={
- "x-amz-content-sha256": "fb65c1441d6743274738fe3b3042a73167ba1fb2d34679d8dd16433473758f97",
- "Authorization": "AWS4-HMAC-SHA256 Credential=access_id/20181011/us-east-1/iam/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date, Signature=5f4f832a19fc834d4f34047289ad67d96da25bd414a70f02ce6b85aef9ab8068",
+ "x-amz-content-sha256": "1e1d3e3fb0bcfb7b2b61f687369d0227e6aefd6739e1182312382ab03e83b75f",
+ "Authorization": "AWS4-HMAC-SHA256 Credential=access_id/20181011/us-east-1/iam/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date, Signature=680fe73ca28e1639a3b2337a68d83324e03742679e612a52d3d29c9b6fc4b512",
"x-amz-date": "20181011T150505Z",
},
)
@@ -470,8 +470,8 @@ async def test_aws_auth_with_security_token_and_content_in_request(
method="POST",
match_json=[{"key": "value"}],
match_headers={
- "x-amz-content-sha256": "fb65c1441d6743274738fe3b3042a73167ba1fb2d34679d8dd16433473758f97",
- "Authorization": "AWS4-HMAC-SHA256 Credential=access_id/20181011/us-east-1/iam/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-security-token, Signature=e02c4733589cf6e80361f6905564da6d0c23a0829bb3c3899b328e43b2f7b581",
+ "x-amz-content-sha256": "1e1d3e3fb0bcfb7b2b61f687369d0227e6aefd6739e1182312382ab03e83b75f",
+ "Authorization": "AWS4-HMAC-SHA256 Credential=access_id/20181011/us-east-1/iam/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-security-token, Signature=838d461dd62852877565b9f91558a9da26d7af50d8fadf3c48cc1a9f6d3561f4",
"x-amz-date": "20181011T150505Z",
"x-amz-security-token": "security_token",
},
diff --git a/tests/aws_signature_v4/test_aws4auth_sync.py b/tests/aws_signature_v4/test_aws4auth_sync.py
index e59930c..e3e5e7c 100644
--- a/tests/aws_signature_v4/test_aws4auth_sync.py
+++ b/tests/aws_signature_v4/test_aws4auth_sync.py
@@ -45,8 +45,8 @@ def test_aws_auth_with_content_in_request(httpx_mock: HTTPXMock):
method="POST",
match_json=[{"key": "value"}],
match_headers={
- "x-amz-content-sha256": "fb65c1441d6743274738fe3b3042a73167ba1fb2d34679d8dd16433473758f97",
- "Authorization": "AWS4-HMAC-SHA256 Credential=access_id/20181011/us-east-1/iam/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date, Signature=5f4f832a19fc834d4f34047289ad67d96da25bd414a70f02ce6b85aef9ab8068",
+ "x-amz-content-sha256": "1e1d3e3fb0bcfb7b2b61f687369d0227e6aefd6739e1182312382ab03e83b75f",
+ "Authorization": "AWS4-HMAC-SHA256 Credential=access_id/20181011/us-east-1/iam/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date, Signature=680fe73ca28e1639a3b2337a68d83324e03742679e612a52d3d29c9b6fc4b512",
"x-amz-date": "20181011T150505Z",
},
)
@@ -454,8 +454,8 @@ def test_aws_auth_with_security_token_and_content_in_request(httpx_mock: HTTPXMo
method="POST",
match_json=[{"key": "value"}],
match_headers={
- "x-amz-content-sha256": "fb65c1441d6743274738fe3b3042a73167ba1fb2d34679d8dd16433473758f97",
- "Authorization": "AWS4-HMAC-SHA256 Credential=access_id/20181011/us-east-1/iam/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-security-token, Signature=e02c4733589cf6e80361f6905564da6d0c23a0829bb3c3899b328e43b2f7b581",
+ "x-amz-content-sha256": "1e1d3e3fb0bcfb7b2b61f687369d0227e6aefd6739e1182312382ab03e83b75f",
+ "Authorization": "AWS4-HMAC-SHA256 Credential=access_id/20181011/us-east-1/iam/aws4_request, SignedHeaders=content-type;host;x-amz-content-sha256;x-amz-date;x-amz-security-token, Signature=838d461dd62852877565b9f91558a9da26d7af50d8fadf3c48cc1a9f6d3561f4",
"x-amz-date": "20181011T150505Z",
"x-amz-security-token": "security_token",
},

problem_statement: Update to allow using httpx 0.28
The current version requires httpx 0.27.*; httpx 0.28 comes with built-in socks5h support I'd like to use.

created_at: 2025-01-07T09:25:12
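The AWS SigV4 test updates swap in new `x-amz-content-sha256` digests and signatures for the same `match_json` payload. This is consistent with httpx 0.28 encoding `json=` request bodies compactly (no space after separators), which changes the body bytes and therefore every hash derived from them; a sketch of that mechanism, noting the interpretation is an inference from the changed digests rather than something stated in the diff:

```python
import hashlib
import json

payload = [{"key": "value"}]
# Default json.dumps inserts a space after ':' -> '[{"key": "value"}]'
old_body = json.dumps(payload).encode()
# Compact separators -> '[{"key":"value"}]'
new_body = json.dumps(payload, separators=(",", ":")).encode()

# Different bytes, different SHA-256, hence new expected signatures:
print(hashlib.sha256(old_body).hexdigest())
print(hashlib.sha256(new_body).hexdigest())
```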
---

repo: Jakeler/ble-serial | pull_number: 60 | instance_id: Jakeler__ble-serial-60 | issue_numbers: ['96'] | base_commit: 583866bd62a9b1d35e7a0392d6ccb16dcd19807b

patch:
diff --git a/README.md b/README.md
index 6795fc3..7f6a341 100644
--- a/README.md
+++ b/README.md
@@ -14,7 +14,7 @@ $ pip install ble-serial
Now you should have 2 new scripts: `ble-scan` and the main `ble-serial`.
-On Linux/Mac you are ready now and can directly jump to the usage section!
+On Linux/Mac you are ready now and can directly jump to the [usage](#usage) section!
For Windows follow the [additional steps below](#additional-steps-for-windows).
### From source (for developers)
@@ -184,7 +184,8 @@ As you can see, here the read/notify UUID is `00000006-af0e-4c28-95a4-4509fd91e0
The `ble-serial` tool itself has a few more options:
```console
$ ble_serial -h
-usage: __main__.py [-h] [-v] [-t SEC] [-i ADAPTER] [-m MTU] [-d DEVICE] [-a {public,random}] [-s SERVICE_UUID] [-w WRITE_UUID] [-r READ_UUID] [--permit {ro,rw,wo}] [-l FILENAME] [-b] [-p PORT] [--expose-tcp-host TCP_HOST] [--expose-tcp-port TCP_PORT]
+usage: ble-serial [-h] [-v] [-t SEC] [-i ADAPTER] [-m MTU] [-g {server,client}] [-n GAP_NAME] [-d DEVICE] [-a {public,random}] [-s SERVICE_UUID] [-r READ_UUID] [-w WRITE_UUID] [--permit {ro,rw,wo}] [--write-with-response]
+ [-l FILENAME] [-b] [-p PORT] [--expose-tcp-host TCP_HOST] [--expose-tcp-port TCP_PORT]
Create virtual serial ports from BLE devices.
@@ -200,19 +201,23 @@ connection parameters:
-m MTU, --mtu MTU Max. bluetooth packet data size in bytes used for sending (default: 20)
device parameters:
+ -g {server,client}, --role {server,client}
+ Operate as BLE role: client (BLE central), server (BLE peripheral) (default: client)
+ -n GAP_NAME, --name GAP_NAME
+ Custom display name in BLE server mode, uses "BLE Serial Server {PID}" otherwise. Prefix for logs lines in all modes. (default: None)
-d DEVICE, --dev DEVICE
BLE device address to connect (hex format, can be separated by colons) (default: None)
-a {public,random}, --address-type {public,random}
BLE address type, only relevant on Windows, ignored otherwise (default: public)
-s SERVICE_UUID, --service-uuid SERVICE_UUID
- The service used for scanning of potential devices (default: None)
- -w WRITE_UUID, --write-uuid WRITE_UUID
- The GATT characteristic to write the serial data, you might use "ble-scan -d" to find it out (default: None)
+ In "client" mode - service UUID used for scanning of potential devices. In "server" mode - service UUID used to provide read/write GATT characteristics. (default: None)
-r READ_UUID, --read-uuid READ_UUID
- The GATT characteristic to subscribe to notifications to read the serial data (default: None)
+ The GATT characteristic to subscribe to notifications to read the serial data. If omitted, will be auto generated based on service UUID (default: None)
+ -w WRITE_UUID, --write-uuid WRITE_UUID
+ The GATT characteristic to write the serial data, you might use "ble-scan -d" to find it out. If omitted, will be auto generated based on service UUID (default: None)
--permit {ro,rw,wo} Restrict transfer direction on bluetooth: read only (ro), read+write (rw), write only (wo) (default: rw)
--write-with-response
- Wait for a response from the remote device before sending more. Better data integrity, higher latency and less througput (default: False)
+ Wait for a response from the remote device before sending more. Better data integrity, higher latency and less throughput (default: False)
```
In any case it needs to know which device to connect, the simple and most reliable way to specify this is by device address/id:
@@ -291,6 +296,51 @@ $ cat demo.txt
Per default it is transformed to hex bytes, use `-b`/`--binary` to log raw data, useful if your input is already ASCII etc.
## Advanced Usage
+### Bluetooth server
+Per default ble-serial operates in client (ble central) role and can connect to typical modules (ble peripheral) which define services and advertise itself.
+Since version 3.0 it's possible to swap these roles with `-g {server,client}`/`--role {server,client}`.
+
+#### Prerequisites
+Install extra dependencies with:
+```console
+$ pip install ble-serial[server]
+```
+and on Windows additionally (one not on pypi):
+```console
+$ pip install https://github.com/gwangyi/pysetupdi/archive/refs/heads/master.zip
+```
+
+#### Config and startup
+No external device argument is required in this mode, but you have to define the service and characteristics.
+```console
+$ ble-serial -g server -s 6e400001-b5a3-f393-e0a9-e50e24dcca9e
+17:02:23.860 | INFO | linux_pty.py: Port endpoint created on /tmp/ttyBLE -> /dev/pts/6
+17:02:23.860 | INFO | ble_server.py: Name/ID: BLE Serial Server 11296
+17:02:23.860 | INFO | ble_server.py: Listener set up
+17:02:23.860 | WARNING | uuid_helpers.py: No write uuid specified, derived from service 6e400001-b5a3-f393-e0a9-e50e24dcca9e -> 6e400002-b5a3-f393-e0a9-e50e24dcca9e
+17:02:23.860 | WARNING | uuid_helpers.py: No read uuid specified, derived from service 6e400001-b5a3-f393-e0a9-e50e24dcca9e -> 6e400003-b5a3-f393-e0a9-e50e24dcca9e
+17:02:23.864 | INFO | ble_server.py: Service 6e400001-b5a3-f393-e0a9-e50e24dcca9e
+17:02:23.864 | INFO | ble_server.py: Write characteristic: 6e400002-b5a3-f393-e0a9-e50e24dcca9e: Nordic UART RX
+17:02:23.864 | INFO | ble_server.py: Read characteristic: 6e400003-b5a3-f393-e0a9-e50e24dcca9e: Nordic UART TX
+17:02:23.893 | INFO | ble_server.py: Server startup successful
+17:02:23.893 | INFO | main.py: Running main loop!
+```
+It automatically derives all characteristics when only the service uuid is specified, the run above is equivalent to `-w 6e400002-b5a3-f393-e0a9-e50e24dcca9e` `-r 6e400003-b5a3-f393-e0a9-e50e24dcca9e`. Also the other arguments (name, mtu, logs, ports, tcp, etc.) are supported.
+
+#### Connection
+Works like with any other server via the device address. Note that mac address might change in every session depending on OS/platform and currently it's not possible to set or display this value.
+
+Using ble-serial as client on the other machine can solve this with service based selection:
+
+```console
+$ ble-serial -g client -s 6e400001-b5a3-f393-e0a9-e50e24dcca9e
+...
+17:27:26.069 | WARNING | ble_client.py: Picking first device with matching service, consider passing a specific device address, especially if there could be multiple devices
+17:27:27.685 | INFO | ble_client.py: Trying to connect with [MAC]: BLE Serial Server 11296
+```
+
+With a custom service it's also highly unlikely to accidentally connect a wrong device. So maybe use something else than the standard Nordic UART service and the warning can be ignored.
+
### TCP socket server
Instead of the serial port emulation there is a also builtin raw tcp server since version 2.7:
```
@@ -381,16 +431,36 @@ usage: ble-autoconnect.py [-h] [-c CONFIG] [-v]
Service to automatically connect with devices that get available.
-optional arguments:
+options:
-h, --help show this help message and exit
-c CONFIG, --config CONFIG
Path to a INI file with device configs (default: autoconnect.ini)
-v, --verbose Increase log level from info to debug (default: False)
+ -m MIN_RSSI, --min-rssi MIN_RSSI
+ Ignore devices with weaker signal strength (default: -127)
+ -t TIMEOUT, --timeout TIMEOUT
+ Pause scan for seconds amount to let ble-serial start up (default: 10)
+```
+This continuously scans for devices and compares them with the configuration,
+then automatically starts up `ble-serial` (or other tools) if a known device is detected.
+Brings similar convenience as USB adapters, just turn the BLE device on and the serial port shows up on the PC.
+See the example `autoconnect.ini` for configuration.
+Starting with version 3.0 it can connect to multiple devices in parallel, make sure there are no port conflicts described [above](#multi-device-connection).
+Output from the managing script and all instances are printed to the same terminal in this case. It's possible to add a instance specific prefix to each log line
+with `--name GAP_NAME` and in .ini `name = your-name-here`.
+
+Example launch:
+```console
+[AUTOCONNECT] 2024-12-09 16:35:02,922 | INFO | 20:91:48:4C:4C:54 = UT61E - JK (RSSI: -76) Services=['0000ffe0-0000-1000-8000-00805f9b34fb', '0000b000-0000-1000-8000-00805f9b34fb']
+[AUTOCONNECT] 2024-12-09 16:35:02,922 | INFO | Found 20:91:48:4C:4C:54 in config!
+[AUTOCONNECT] 2024-12-09 16:35:02,922 | INFO | ['ble-serial', '--dev', '20:91:48:4C:4C:54', '--address-type', 'public', '--port', '/tmp/UT61E', '--name', 'your-name-here', '--timeout', '10', '--mtu', '20']
+[your-name-here] 16:35:02.993 | INFO | linux_pty.py: Port endpoint created on /tmp/UT61E -> /dev/pts/6
+[your-name-here] 16:35:02.993 | INFO | ble_client.py: Receiver set up
+[your-name-here] 16:35:04.252 | INFO | ble_client.py: Trying to connect with 20:91:48:4C:4C:54: UT61E - JK
+[your-name-here] 16:35:05.922 | INFO | ble_client.py: Device 20:91:48:4C:4C:54 connected
+...
```
-This continuously scans for devices and compares them with the configuration, it then automatically starts up `ble-serial` (or other tools) if a known device is detected.
-This should bring similar convenience like USB adapters, just turn the BLE device on and
-the serial port shows up on the PC. See the example `autoconnect.ini` for configuration.
On Linux you can also use the included systemd (user) service to auto start this on boot.
diff --git a/ble_serial/bluetooth/ble_interface.py b/ble_serial/bluetooth/ble_client.py
similarity index 89%
rename from ble_serial/bluetooth/ble_interface.py
rename to ble_serial/bluetooth/ble_client.py
index 0b215c5..f4e7b8a 100644
--- a/ble_serial/bluetooth/ble_interface.py
+++ b/ble_serial/bluetooth/ble_client.py
@@ -1,25 +1,27 @@
from bleak import BleakClient, BleakScanner
from bleak.backends.characteristic import BleakGATTCharacteristic
from ble_serial.bluetooth.constants import ble_chars
-import logging
-import asyncio
+from ble_serial.bluetooth.interface import BLE_interface
+import logging, asyncio
from typing import Optional, List
-class BLE_interface():
- def __init__(self, adapter: str, service: str):
+class BLE_client(BLE_interface):
+ def __init__(self, adapter: str, id: str = None):
self._send_queue = asyncio.Queue()
- self.scan_args = dict(adapter=adapter)
- if service:
- self.scan_args['service_uuids'] = [service]
+ self.adapter = adapter
+
+ async def connect(self, addr_str: str, addr_type: str, service_uuid: str, timeout: float):
+ scan_args = dict(adapter=self.adapter)
+ if service_uuid:
+ scan_args['service_uuids'] = [service_uuid]
- async def connect(self, addr_str: str, addr_type: str, timeout: float):
if addr_str:
- device = await BleakScanner.find_device_by_address(addr_str, timeout=timeout, **self.scan_args)
+ device = await BleakScanner.find_device_by_address(addr_str, timeout=timeout, **scan_args)
else:
logging.warning('Picking first device with matching service, '
'consider passing a specific device address, especially if there could be multiple devices')
- device = await BleakScanner.find_device_by_filter(lambda dev, ad: True, timeout=timeout, **self.scan_args)
+ device = await BleakScanner.find_device_by_filter(lambda dev, ad: True, timeout=timeout, **scan_args)
assert device, 'No matching device found!'
@@ -105,6 +107,10 @@ async def send_loop(self):
continue
logging.debug(f'Sending {data}')
await self.dev.write_gatt_char(self.write_char, data, self.write_response_required)
+
+ async def check_loop(self):
+ while True:
+ await asyncio.sleep(1)
def stop_loop(self):
logging.info('Stopping Bluetooth event loop')
diff --git a/ble_serial/bluetooth/ble_server.py b/ble_serial/bluetooth/ble_server.py
new file mode 100644
index 0000000..815ccdb
--- /dev/null
+++ b/ble_serial/bluetooth/ble_server.py
@@ -0,0 +1,131 @@
+from bless import BlessServer, BlessGATTCharacteristic
+from bless import GATTAttributePermissions, GATTCharacteristicProperties
+from ble_serial.bluetooth.interface import BLE_interface
+from ble_serial.bluetooth.uuid_helpers import check_fill_empty
+import os, logging, asyncio
+from typing import Optional
+
+class BLE_server(BLE_interface):
+ def __init__(self, adapter: str, gap_name: str):
+ self._send_queue = asyncio.Queue()
+ self.data_read_done = asyncio.Event()
+
+ # Workaround for bluez not sending constant names, PID always changes,
+ # if custom name has not been provided
+ if gap_name is None:
+ self.local_name = f'BLE Serial Server {os.getpid()}'
+ else:
+ self.local_name = gap_name
+
+ logging.info(f'Name/ID: {self.local_name}')
+
+ self.server = BlessServer(name=self.local_name, adapter=adapter) # loop=asyncio.get_event_loop())
+ self.server.read_request_func = self.handle_incoming_read
+ self.server.write_request_func = self.handle_incoming_write
+ self.connected = False
+
+ async def start(self, timeout: float):
+ # logging.info(f'Trying to start with {addr_str}')
+ #TODO: obtain adapter address
+ success = await self.server.start(timeout=timeout)
+ logging.info(f'Server startup {"failed!" if success == False else "successful"}')
+
+
+ async def setup_chars(self, service_uuid: str, write_uuid: str, read_uuid: str, mode: str, write_response_required: bool):
+ self.read_enabled = 'r' in mode
+ self.write_enabled = 'w' in mode
+
+ self.service_uuid = service_uuid
+ self.write_uuid = check_fill_empty(service_uuid, write_uuid, 'write')
+ self.read_uuid = check_fill_empty(service_uuid, read_uuid, 'read')
+
+ await self.server.add_new_service(self.service_uuid)
+ self.service = self.server.get_service(self.service_uuid)
+ logging.info(f'Service {self.service_uuid}')
+
+ if self.write_enabled:
+ # self.write_uuid = "0000ffe1-0000-1000-8000-00805f9b34fb"
+ char_flags = (
+ GATTCharacteristicProperties.write if write_response_required else
+ GATTCharacteristicProperties.write_without_response)
+ permissions = GATTAttributePermissions.readable | GATTAttributePermissions.writeable
+ await self.server.add_new_characteristic(self.service_uuid, self.write_uuid,
+ char_flags, None, permissions)
+
+ self.write_char = self.server.get_characteristic(self.write_uuid)
+ logging.info(f'Write characteristic: {str(self.write_char)}')
+ else:
+ logging.info('Writing disabled, no characteristic to setup')
+
+ if self.read_enabled:
+ char_flags = GATTCharacteristicProperties.read | GATTCharacteristicProperties.notify
+ permissions = GATTAttributePermissions.readable | GATTAttributePermissions.writeable
+ await self.server.add_new_characteristic(self.service_uuid, self.read_uuid,
+ char_flags, None, permissions)
+
+ self.read_char = self.server.get_characteristic(self.read_uuid)
+ logging.info(f'Read characteristic: {str(self.read_char)}')
+
+ self.data_read_done.set()
+ else:
+ logging.info('Reading disabled, no characteristic to setup')
+
+
+ def handle_incoming_read(self, char: BlessGATTCharacteristic) -> bytearray:
+ logging.debug('Client read data')
+ if self.read_char != char:
+ logging.warning('Read request received on wrong characteristic')
+ return None
+ self.data_read_done.set()
+ return self.read_char.value
+
+ def queue_send(self, data: bytes):
+ self._send_queue.put_nowait(data)
+
+ async def send_loop(self):
+ assert hasattr(self, '_cb'), 'Callback must be set before receive loop!'
+ while True:
+ data = await self._send_queue.get()
+ if data == None:
+ break # Let future end on shutdown
+ if not self.read_enabled:
+ logging.warning(f'Ignoring unexpected read data: {data}')
+ continue
+ logging.debug(f'Offering read {data}')
+ # Wait for current data to get read, then overwrite
+ # await self.data_read_done.wait()
+ self.read_char.value = data
+ # Mark as ready to read
+ self.data_read_done.clear()
+ self.server.update_value(self.service.uuid, self.read_char.uuid)
+ await self.server.stop()
+
+ async def check_loop(self):
+ while True:
+ await asyncio.sleep(1)
+
+ advertising = await self.server.is_advertising()
+ if self.connected != await self.server.is_connected():
+ self.connected = not self.connected
+ logging.info('New BLE client connected' if self.connected else 'BLE client disconnected')
+ logging.debug(f'{advertising=} {self.connected=}')
+
+ def stop_loop(self):
+ logging.info('Stopping Bluetooth event loop')
+ self._send_queue.put_nowait(None)
+
+ async def disconnect(self):
+ if hasattr(self, 'server'):
+ await self.server.stop()
+ logging.info('Bluetooth server stopped')
+
+ def set_receiver(self, callback):
+ self._cb = callback
+ logging.info('Listener set up')
+
+ def handle_incoming_write(self, char: BlessGATTCharacteristic, data: bytes):
+ logging.debug(f'Received write from {char}: {data}')
+ if not self.write_enabled:
+ logging.warning(f'Got unexpected write data, dropping: {data}')
+ return
+ self._cb(data)
diff --git a/ble_serial/bluetooth/interface.py b/ble_serial/bluetooth/interface.py
new file mode 100644
index 0000000..dadd36f
--- /dev/null
+++ b/ble_serial/bluetooth/interface.py
@@ -0,0 +1,26 @@
+from abc import ABC, abstractmethod
+
+class BLE_interface(ABC):
+ @abstractmethod
+ def __init__(self, adapter: str, instance_identifier: str):
+ pass
+
+ @abstractmethod
+ def set_receiver(self, callback):
+ pass
+
+ @abstractmethod
+ def queue_send(self, value: bytes):
+ pass
+
+ @abstractmethod
+ async def send_loop(self):
+ pass
+
+ @abstractmethod
+ def stop_loop(self):
+ pass
+
+ @abstractmethod
+ async def disconnect(self):
+ pass
\ No newline at end of file
diff --git a/ble_serial/bluetooth/uuid_helpers.py b/ble_serial/bluetooth/uuid_helpers.py
new file mode 100644
index 0000000..656225d
--- /dev/null
+++ b/ble_serial/bluetooth/uuid_helpers.py
@@ -0,0 +1,25 @@
+import logging
+from uuid import UUID
+
+server_offset_map = {
+ 'write': 1,
+ 'read': 2,
+}
+
+def compare_node(uuid1: str, uuid2: str) -> bool:
+ return UUID(uuid1).node == UUID(uuid2).node
+
+def check_fill_empty(service_uuid: str, uuid: str, typ: str) -> str:
+ if not uuid:
+ uuid = derive_chars_from_service(service_uuid, server_offset_map[typ])
+ logging.warning(f'No {typ} uuid specified, derived from service {service_uuid} -> {uuid}')
+
+ assert compare_node(service_uuid, uuid), f'Service and {typ} uuid are not from the same family'
+
+ return uuid
+
+def derive_chars_from_service(service_uuid: str, offset: int) -> str:
+ uid = UUID(service_uuid)
+ field_list = list(uid.fields)
+ field_list[0] += offset
+ return str(UUID(fields=field_list))
\ No newline at end of file
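A standalone sketch of the derivation `derive_chars_from_service` performs: bump the UUID's first field by the role offset (write = 1, read = 2). The expected outputs match the Nordic UART UUIDs shown in the README log lines above:

```python
from uuid import UUID

def derive(service_uuid: str, offset: int) -> str:
    fields = list(UUID(service_uuid).fields)
    fields[0] += offset  # the time_low field carries the role offset
    return str(UUID(fields=fields))

svc = "6e400001-b5a3-f393-e0a9-e50e24dcca9e"  # Nordic UART service
assert derive(svc, 1) == "6e400002-b5a3-f393-e0a9-e50e24dcca9e"  # write
assert derive(svc, 2) == "6e400003-b5a3-f393-e0a9-e50e24dcca9e"  # read
```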
diff --git a/ble_serial/cli.py b/ble_serial/cli.py
index a80e7b8..d33e0d1 100644
--- a/ble_serial/cli.py
+++ b/ble_serial/cli.py
@@ -1,7 +1,7 @@
from argparse import ArgumentParser, ArgumentDefaultsHelpFormatter, Namespace
from ble_serial import DEFAULT_PORT, DEFAULT_PORT_MSG
-def parse_args():
+def parse_args() -> Namespace:
parser = ArgumentParser(formatter_class=ArgumentDefaultsHelpFormatter,
description='Create virtual serial ports from BLE devices.')
@@ -17,21 +17,28 @@ def parse_args():
help='Max. bluetooth packet data size in bytes used for sending')
dev_group = parser.add_argument_group('device parameters')
+ dev_group.add_argument('-g', '--role', dest='gap_role', required=False, default='client', choices=['server', 'client'],
+ help='Operate as BLE role: client (BLE central), server (BLE peripheral)')
+ dev_group.add_argument('-n', '--name', dest='gap_name', required=False,
+ help='Custom display name in BLE server mode, uses "BLE Serial Server {PID}" otherwise. Prefix for logs lines in all modes.')
dev_group.add_argument('-d', '--dev', dest='device', required=False,
help='BLE device address to connect (hex format, can be separated by colons)')
dev_group.add_argument('-a', '--address-type', dest='addr_type', required=False, choices=['public', 'random'], default='public',
help='BLE address type, only relevant on Windows, ignored otherwise')
dev_group.add_argument('-s', '--service-uuid', dest='service_uuid', required=False,
- help='The service used for scanning of potential devices')
-
- dev_group.add_argument('-w', '--write-uuid', dest='write_uuid', required=False,
- help='The GATT characteristic to write the serial data, you might use "ble-scan -d" to find it out')
+ help='''In "client" mode - service UUID used for scanning of potential devices.
+ In "server" mode - service UUID used to provide read/write GATT characteristics. ''')
dev_group.add_argument('-r', '--read-uuid', dest='read_uuid', required=False,
- help='The GATT characteristic to subscribe to notifications to read the serial data')
+ help='''The GATT characteristic to subscribe to notifications to read the serial data.
+ If omitted, will be auto generated based on service UUID''')
+ dev_group.add_argument('-w', '--write-uuid', dest='write_uuid', required=False,
+ help='''The GATT characteristic to write the serial data, you might use "ble-scan -d" to find it out.
+ If omitted, will be auto generated based on service UUID''')
dev_group.add_argument('--permit', dest='mode', required=False, default='rw', choices=['ro', 'rw', 'wo'],
help='Restrict transfer direction on bluetooth: read only (ro), read+write (rw), write only (wo)')
dev_group.add_argument('--write-with-response', dest='write_with_response', required=False, action='store_true',
- help='Wait for a response from the remote device before sending more. Better data integrity, higher latency and less througput')
+ help='Wait for a response from the remote device before sending more. Better data integrity, higher latency and less throughput')
+
log_group = parser.add_argument_group('logging options')
log_group.add_argument('-l', '--log', dest='filename', required=False,
@@ -51,7 +58,21 @@ def parse_args():
args = parser.parse_args()
+ client_checks(parser, args)
+ server_checks(parser, args)
+
+ return args
+
+
+def client_checks(parser: ArgumentParser, args: Namespace):
+ if args.gap_role != 'client':
+ return # not applicable
if not args.device and not args.service_uuid:
parser.error('at least one of -d/--dev and -s/--service-uuid required')
- return args
\ No newline at end of file
+def server_checks(parser: ArgumentParser, args: Namespace):
+ if args.gap_role != 'server':
+ return # not applicable
+ if not args.service_uuid:
+ parser.error('Server role requires -s/--service-uuid')
+
diff --git a/ble_serial/log/console_log.py b/ble_serial/log/console_log.py
index 90f5c70..43c7cd8 100644
--- a/ble_serial/log/console_log.py
+++ b/ble_serial/log/console_log.py
@@ -1,9 +1,16 @@
import logging
import coloredlogs
-def setup_logger(verbosity: int):
- bleak_logger = logging.getLogger('bleak')
- bleak_logger.level = logging.DEBUG if verbosity > 1 else logging.INFO
+def _map_role_to_lib(role: str):
+ return {
+ 'client': 'bleak',
+ 'server': 'bless'
+ }[role]
+
+def setup_logger(verbosity: int, role: str, prefix_id: str):
+ ble_lib_name = _map_role_to_lib(role)
+ ble_logger = logging.getLogger(ble_lib_name)
+ ble_logger.level = logging.DEBUG if verbosity > 1 else logging.INFO
level_colors = {
'critical': {'bold': True, 'color': 'red'},
@@ -18,9 +25,11 @@ def setup_logger(verbosity: int):
'levelname': {'color': 'magenta'},
'filename': {'color': 'white', 'faint': True},
}
+
+ prefix = f'[{coloredlogs.ansi_wrap(prefix_id, bold=True)}] ' if prefix_id else ''
coloredlogs.install(
level=logging.DEBUG if verbosity > 0 else logging.INFO,
- fmt='%(asctime)s.%(msecs)03d | %(levelname)s | %(filename)s: %(message)s',
+ fmt=prefix+'%(asctime)s.%(msecs)03d | %(levelname)s | %(filename)s: %(message)s',
datefmt='%H:%M:%S',
level_styles=level_colors,
field_styles=field_colors,
diff --git a/ble_serial/main.py b/ble_serial/main.py
index 15cde90..2fa8596 100644
--- a/ble_serial/main.py
+++ b/ble_serial/main.py
@@ -3,7 +3,6 @@
from bleak.exc import BleakError
from ble_serial import platform_uart as UART
from ble_serial.ports.tcp_socket import TCP_Socket
-from ble_serial.bluetooth.ble_interface import BLE_interface
from ble_serial.log.fs_log import FS_log, Direction
from ble_serial.log.console_log import setup_logger
from ble_serial import cli
@@ -12,6 +11,12 @@ class Main():
def __init__(self, args: cli.Namespace):
self.args = args
+ if args.gap_role == 'client':
+ from ble_serial.bluetooth.ble_client import BLE_client as BLE
+ elif args.gap_role == 'server':
+ from ble_serial.bluetooth.ble_server import BLE_server as BLE
+ self.BLE_class = BLE
+
def start(self):
try:
logging.debug(f'Running: {self.args}')
@@ -31,7 +36,7 @@ async def _run(self):
else:
self.uart = UART(args.port, loop, args.mtu)
- self.bt = BLE_interface(args.adapter, args.service_uuid)
+ self.bt = self.BLE_class(args.adapter, args.gap_name)
if args.filename:
self.log = FS_log(args.filename, args.binlog)
@@ -42,12 +47,18 @@ async def _run(self):
self.uart.set_receiver(self.bt.queue_send)
self.uart.start()
- await self.bt.connect(args.device, args.addr_type, args.timeout)
- await self.bt.setup_chars(args.write_uuid, args.read_uuid, args.mode, args.write_with_response)
+
+ if args.gap_role == 'client':
+ await self.bt.connect(args.device, args.addr_type, args.service_uuid, args.timeout)
+ await self.bt.setup_chars(args.write_uuid, args.read_uuid, args.mode, args.write_with_response)
+ elif args.gap_role == 'server':
+ await self.bt.setup_chars(args.service_uuid, args.write_uuid, args.read_uuid, args.mode, args.write_with_response)
+ await self.bt.start(args.timeout)
logging.info('Running main loop!')
main_tasks = {
asyncio.create_task(self.bt.send_loop()),
+ asyncio.create_task(self.bt.check_loop()),
asyncio.create_task(self.uart.run_loop())
}
done, pending = await asyncio.wait(main_tasks, return_when=asyncio.FIRST_COMPLETED)
@@ -55,12 +66,13 @@ async def _run(self):
logging.debug(f'Pending Tasks: {[t._coro for t in pending]}')
except BleakError as e:
- logging.error(f'Bluetooth connection failed: {e}')
+ logging.error(f'Bluetooth connection failed')
+ logging.exception(e)
### KeyboardInterrupts are now received on asyncio.run()
# except KeyboardInterrupt:
# logging.info('Keyboard interrupt received')
except Exception as e:
- logging.error(f'Unexpected Error: {repr(e)}')
+ logging.exception(e)
finally:
logging.warning('Shutdown initiated')
if hasattr(self, 'uart'):
@@ -76,10 +88,12 @@ def excp_handler(self, loop: asyncio.AbstractEventLoop, context):
# Handles exception from other tasks (inside bleak disconnect, etc)
# loop.default_exception_handler(context)
logging.debug(f'Asyncio execption handler called {context["exception"]}')
+ logging.exception(context["exception"])
+
self.uart.stop_loop()
self.bt.stop_loop()
def launch():
args = cli.parse_args()
- setup_logger(args.verbose)
+ setup_logger(args.verbose, args.gap_role, args.gap_name)
Main(args).start()
\ No newline at end of file
diff --git a/examples/ble-server.py b/examples/ble-server.py
new file mode 100644
index 0000000..5b36a64
--- /dev/null
+++ b/examples/ble-server.py
@@ -0,0 +1,37 @@
+import asyncio
+import logging
+from ble_serial.bluetooth.ble_server import BLE_server
+
+def receive_callback(value: bytes):
+ print("Received:", value)
+
+async def hello_sender(ble: BLE_server):
+ while True:
+ await asyncio.sleep(3.0)
+ print("Sending...")
+ ble.queue_send(b"Hello world\n")
+
+async def main():
+ # At least service uuid requiered here, Nordic UART service for example
+ ADAPTER = "hci0"
+ SERVICE_UUID = '6E400001-B5A3-F393-E0A9-E50E24DCCA9E'
+ WRITE_UUID = None
+ READ_UUID = None
+ WRITE_WITH_RESPONSE = False
+
+ ble = BLE_server(ADAPTER, 'ExampleServer')
+ ble.set_receiver(receive_callback)
+
+ try:
+ await ble.setup_chars(SERVICE_UUID, WRITE_UUID, READ_UUID, 'rw', WRITE_WITH_RESPONSE)
+ await ble.start(0) # timeout does not matter
+
+ await asyncio.gather(ble.send_loop(), ble.check_loop(), hello_sender(ble))
+ finally:
+ ble.stop_loop()
+ await ble.disconnect()
+
+
+if __name__ == "__main__":
+ logging.basicConfig(level=logging.INFO)
+ asyncio.run(main())
\ No newline at end of file
diff --git a/examples/ble_scan_standalone.py b/examples/ble_scan_standalone.py
index 42b1864..0bc1cde 100644
--- a/examples/ble_scan_standalone.py
+++ b/examples/ble_scan_standalone.py
@@ -6,15 +6,16 @@ async def main():
ADAPTER = "hci0"
SCAN_TIME = 5 #seconds
SERVICE_UUID = None # optional filtering
+ VERBOSE = False
devices = await scanner.scan(ADAPTER, SCAN_TIME, SERVICE_UUID)
print() # newline
- scanner.print_list(devices)
-
- # manual indexing
- print(devices[0].name, devices[0].address)
+ scanner.print_list(devices, VERBOSE)
+ # manual indexing of devices dict
+ dev_list = list(devices.values())
+ print(dev_list[0])
### deep scan get's services/characteristics
DEVICE = "20:91:48:4C:4C:54"
diff --git a/examples/ble_standalone.py b/examples/ble_standalone.py
index 768ee39..a9d0b6e 100644
--- a/examples/ble_standalone.py
+++ b/examples/ble_standalone.py
@@ -1,11 +1,11 @@
import asyncio
import logging
-from ble_serial.bluetooth.ble_interface import BLE_interface
+from ble_serial.bluetooth.ble_client import BLE_client
def receive_callback(value: bytes):
print("Received:", value)
-async def hello_sender(ble: BLE_interface):
+async def hello_sender(ble: BLE_client):
while True:
await asyncio.sleep(3.0)
print("Sending...")
@@ -18,13 +18,14 @@ async def main():
WRITE_UUID = None
READ_UUID = None
DEVICE = "20:91:48:4C:4C:54"
+ WRITE_WITH_RESPONSE = False
- ble = BLE_interface(ADAPTER, SERVICE_UUID)
+ ble = BLE_client(ADAPTER, 'ID')
ble.set_receiver(receive_callback)
try:
- await ble.connect(DEVICE, "public", 10.0)
- await ble.setup_chars(WRITE_UUID, READ_UUID, "rw")
+ await ble.connect(DEVICE, "public", SERVICE_UUID, 10.0)
+ await ble.setup_chars(WRITE_UUID, READ_UUID, "rw", WRITE_WITH_RESPONSE)
await asyncio.gather(ble.send_loop(), hello_sender(ble))
finally:
diff --git a/examples/ble_standalone_sync_rx.py b/examples/ble_standalone_sync_rx.py
index 8020277..17d8a5a 100644
--- a/examples/ble_standalone_sync_rx.py
+++ b/examples/ble_standalone_sync_rx.py
@@ -1,6 +1,6 @@
import asyncio
import logging
-from ble_serial.bluetooth.ble_interface import BLE_interface
+from ble_serial.bluetooth.ble_client import BLE_client
rx_buffer = b''
rx_available = asyncio.Event()
@@ -11,7 +11,7 @@ def receive_callback(value: bytes):
rx_buffer = value
rx_available.set()
-async def sendble(ble: BLE_interface, cmd):
+async def sendble(ble: BLE_client, cmd):
#while True:
await asyncio.sleep(3.0)
@@ -24,7 +24,7 @@ async def sendble(ble: BLE_interface, cmd):
await asyncio.sleep(3.0)
-async def commander(ble: BLE_interface):
+async def commander(ble: BLE_client):
await sendble(ble, b'$C$')
await sendble(ble, b'$B$')
await ble.disconnect()
@@ -36,12 +36,13 @@ async def main():
WRITE_UUID = None
READ_UUID = None
DEVICE = "20:91:48:4C:4C:54"
+ WRITE_WITH_RESPONSE = False
- ble = BLE_interface(ADAPTER, SERVICE_UUID)
+ ble = BLE_client(ADAPTER, SERVICE_UUID)
ble.set_receiver(receive_callback)
- await ble.connect(DEVICE, "public", 10.0)
- await ble.setup_chars(WRITE_UUID, READ_UUID, "rw")
+ await ble.connect(DEVICE, "public", SERVICE_UUID, 10.0)
+ await ble.setup_chars(WRITE_UUID, READ_UUID, "rw", WRITE_WITH_RESPONSE)
await asyncio.gather(ble.send_loop(), commander(ble))
diff --git a/helper/autoconnect.ini b/helper/autoconnect.ini
index e49aca9..c55432e 100644
--- a/helper/autoconnect.ini
+++ b/helper/autoconnect.ini
@@ -19,4 +19,5 @@ port = /tmp/ttyBLE
dev = 20:91:48:4C:4C:54
address-type = public
port = /tmp/UT61E
+name = multimeter
verbose
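For the multi-device scenario raised in the issue at the end of this row, a sketch of an `autoconnect.ini` with two device sections, assuming (as the script's code suggests) each section is keyed by the device address, `executable` names the tool to launch, and the remaining keys become `ble-serial` arguments; all addresses, ports, and names here are placeholders:

```ini
[AA:BB:CC:DD:EE:01]
executable = ble-serial
port = /tmp/ttyBLE
name = device-one

[AA:BB:CC:DD:EE:02]
executable = ble-serial
port = /tmp/ttyBLE2
name = device-two
```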
diff --git a/helper/ble-autoconnect.py b/helper/ble-autoconnect.py
index b329a84..4ba3edd 100644
--- a/helper/ble-autoconnect.py
+++ b/helper/ble-autoconnect.py
@@ -7,15 +7,18 @@
import asyncio
from bleak import BleakScanner
from bleak.backends.device import BLEDevice
-
-import subprocess
import signal
import argparse
import configparser
import logging
-async def run_tool(conf_section: dict):
- await scanner.stop()
+async def run_tool(conf_section: dict, lock_id: str):
+ logging.info(f'{locked_devices}')
+ if lock_id in locked_devices:
+ return # already running for this device
+
+ locked_devices.append(lock_id)
+ loop.create_task(pause_scan(args.timeout))
params = [conf_section['executable']] # binary name before args
for key, val in conf_section.items():
@@ -26,30 +29,39 @@ async def run_tool(conf_section: dict):
logging.info(params)
# Run target, passthrough stdout/stderr
- proc = subprocess.run(params)
- logging.debug(f'-> target exit code: {proc.returncode}')
+ proc = await asyncio.subprocess.create_subprocess_exec(*params)
+ await proc.communicate()
+ logging.info(f'-> target exit code: {proc.returncode}')
# Restart scanner
- await scanner.start()
+ locked_devices.remove(lock_id)
-def detection_callback(device: BLEDevice, advertisement_data):
- logging.info(f'{device.address} = {device.name} (RSSI: {device.rssi})')
+def detection_callback(device: BLEDevice, adv_data):
+ logging.info(f'{device.address} = {adv_data.local_name} (RSSI: {adv_data.rssi}) Services={adv_data.service_uuids}')
if device.address in config:
section = config[device.address]
logging.info(f'Found {device.address} in config!')
- loop.create_task(run_tool(section))
+ if int(adv_data.rssi) <= args.min_rssi:
+ logging.info('Ignoring device because of low rssi')
+ return # device not actually availible
+ loop.create_task(run_tool(section, device.address))
else:
logging.debug('-> Unknown device')
+# Pause is needed to receive the advertisment in the recently started ble-serial otherwise autoconnect captures all
+async def pause_scan(secs: int):
+ await scanner.stop()
+ await asyncio.sleep(secs)
+ await scanner.start()
+
def stop(signal, stackframe=None):
logging.warning(f'signal {signal} received. Stopping scan!')
loop.create_task(scanner.stop())
loop.stop()
async def start_scan():
- scanner.register_detection_callback(detection_callback)
await scanner.start()
signal.signal(signal.SIGINT, stop)
@@ -62,17 +74,22 @@ async def start_scan():
help='Path to a INI file with device configs')
parser.add_argument('-v', '--verbose', dest='verbose', action='store_true',
help='Increase log level from info to debug')
+ parser.add_argument('-m', '--min-rssi', dest='min_rssi', default=-127, type=int,
+ help='Ignore devices with weaker signal strength')
+ parser.add_argument('-t', '--timeout', dest='timeout', default=10, type=int,
+ help='Pause scan for seconds amount to let ble-serial start up')
args = parser.parse_args()
- logging.basicConfig(format='[%(levelname)s] %(message)s',
+ logging.basicConfig(format='[AUTOCONNECT] %(asctime)s | %(levelname)s | %(message)s',
level=logging.DEBUG if args.verbose else logging.INFO)
config = configparser.ConfigParser(allow_no_value=True)
with open(args.config, 'r') as f: # do it like this to detect non existand files
config.read_file(f)
- scanner = BleakScanner()
+ scanner = BleakScanner(detection_callback)
+ locked_devices = [] # list of uuids
- loop = asyncio.get_event_loop()
+ loop = asyncio.new_event_loop()
loop.create_task(start_scan())
loop.run_forever()
diff --git a/setup.py b/setup.py
index 88d47f0..d70a336 100644
--- a/setup.py
+++ b/setup.py
@@ -43,6 +43,9 @@
],
python_requires='>=3.8',
install_requires=REQUIRES,
+ extras_require={
+ "server": 'bless >= 0.2.4',
+ },
entry_points={
'console_scripts': [
'ble-scan=ble_serial.scan.main:launch',

test_patch:
diff --git a/tests/.gitignore b/tests/.gitignore
new file mode 100644
index 0000000..2aeb006
--- /dev/null
+++ b/tests/.gitignore
@@ -0,0 +1,1 @@
+device_id.py
\ No newline at end of file
diff --git a/tests/endpoints.py b/tests/endpoints.py
new file mode 100644
index 0000000..da20596
--- /dev/null
+++ b/tests/endpoints.py
@@ -0,0 +1,8 @@
+class SerialPath:
+ uart = '/dev/ttyUSB0'
+ ble = '/tmp/ttyBLE'
+
+class IP_TCP:
+ rtl8761_usb = 'localhost'
+ intel8265_zenbook = '192.168.1.97'
+# default port 4444
\ No newline at end of file
diff --git a/tests/network_handler.py b/tests/network_handler.py
new file mode 100644
index 0000000..c6db499
--- /dev/null
+++ b/tests/network_handler.py
@@ -0,0 +1,51 @@
+import socket
+from time import sleep, perf_counter
+
+def read_tcp(addr: str, port: int, expected_size: int) -> dict:
+ buffer = bytearray()
+ timeout = 8.0
+
+ with socket.socket() as test_client:
+ test_client.settimeout(timeout)
+ test_client.connect((addr, port))
+ print(f'Connected to {test_client}')
+
+ t1 = perf_counter()
+ while True:
+ try:
+ buffer += test_client.recv(1024)
+ except TimeoutError:
+ break
+
+ buffer_size = len(buffer)
+ print(f'\rReceived {buffer_size} / {expected_size} = {buffer_size/expected_size*100:.2f} %', end='')
+ print()
+ total_time = perf_counter() - t1 - timeout # timeout is always included at the end
+
+ return {
+ 'total_time': total_time,
+ 'buffer': buffer
+ }
+
+def write_tcp(addr: str, port: int, data: bytes, chunk_size: int, delay: float):
+ data_len = len(data)
+ timeout = 1.0
+
+ with socket.socket() as test_client:
+ test_client.settimeout(timeout)
+ test_client.connect((addr, port))
+ print(f'Connected to {test_client}')
+
+ t1 = perf_counter()
+ for i in range(0, data_len, chunk_size):
+ start, end = (i, i+chunk_size)
+ test_client.send(data[start:end])
+ # print(f'Written {end} bytes', end='\n')
+ sleep(delay)
+
+ total_time = perf_counter()-t1
+ rate = data_len/total_time
+
+ print() # do not overwrite replaced lines
+ print(f'Completed write {data_len} bytes in {total_time:.3f} s')
+ print(f'Rate {rate:.2f} byte/s = {rate*8:.0f} bit/s = {rate*10:.0f} baud')
\ No newline at end of file
diff --git a/tests/process_handler.py b/tests/process_handler.py
new file mode 100644
index 0000000..4a4ddac
--- /dev/null
+++ b/tests/process_handler.py
@@ -0,0 +1,17 @@
+import subprocess
+import os
+import signal
+
+# Needs start after reset of set_module_baud()
+def run_ble_serial(mac, mtu, write_resp=False, tcp=False):
+ terminal = 'konsole -e'
+ binary = 'ble-serial'
+ write_with_resp = '--write-with-response' if write_resp else ''
+ tcp_port = '--expose-tcp-port 4444' if tcp else ''
+ return subprocess.run(f'{terminal} {binary} -d {mac} -v --mtu {mtu} {write_with_resp} {tcp_port}',
+ shell=True, check=True)
+
+def signal_serial_end():
+ pid = subprocess.check_output(['pgrep', 'ble-serial'])
+ # print(f'Got PID {pid}')
+ os.kill(int(pid), signal.SIGINT)
\ No newline at end of file
diff --git a/tests/requirements.txt b/tests/requirements.txt
index 266c24b..ded0393 100644
--- a/tests/requirements.txt
+++ b/tests/requirements.txt
@@ -1,3 +1,5 @@
+pytest
+pytest-html
pyserial >= 3.4.0
plotly
pandas
\ No newline at end of file
diff --git a/tests/results/log_socat_tcp.csv b/tests/results/log_socat_tcp.csv
new file mode 100644
index 0000000..510ead3
--- /dev/null
+++ b/tests/results/log_socat_tcp.csv
@@ -0,0 +1,19 @@
+dir,rated_baud,packet_size,delay,valid,loss_percent,rx_bits,rx_baud
+BLE >> UART,1,16,0,True,0.0,27640130,34550162
+UART >> BLE,1,16,0,True,0.0,29683695,37104618
+BLE >> UART,1,16,0.00016,True,0.0,580339,725424
+UART >> BLE,1,16,0.00016,True,0.0,581745,727182
+BLE >> UART,1,16,0.0032,True,0.0,39186,48983
+UART >> BLE,1,16,0.0032,True,0.0,39182,48978
+BLE >> UART,1,128,0,True,0.0,38745778,48432223
+UART >> BLE,1,128,0,True,0.0,31582955,39478693
+BLE >> UART,1,128,0.00128,True,0.0,761524,951905
+UART >> BLE,1,128,0.00128,True,0.0,761224,951530
+BLE >> UART,1,128,0.0256,True,0.0,39890,49863
+UART >> BLE,1,128,0.0256,True,0.0,39885,49856
+BLE >> UART,1,1024,0,True,0.0,47693701,59617126
+UART >> BLE,1,1024,0,True,0.0,42902308,53627886
+BLE >> UART,1,1024,0.01024,True,0.0,779711,974639
+UART >> BLE,1,1024,0.01024,True,0.0,793426,991782
+BLE >> UART,1,1024,0.2048,True,0.0,39964,49955
+UART >> BLE,1,1024,0.2048,True,0.0,39966,49957
diff --git a/tests/serial_handler.py b/tests/serial_handler.py
index 39a6d68..aae1e23 100644
--- a/tests/serial_handler.py
+++ b/tests/serial_handler.py
@@ -1,12 +1,10 @@
from serial import Serial
from time import sleep, perf_counter
-import subprocess
-import os
-import signal
-def read_serial(port: str, conn_baud: int, expected_data: bytes):
+def read_serial(port: str, conn_baud: int, expected_size: int):
buffer = bytearray()
- timeout = 1.0
+ timeout = 4.0
+
with Serial(port, conn_baud, timeout=timeout) as ser:
print(f'Connected to read serial {port}:{conn_baud}')
t1 = perf_counter()
@@ -15,17 +13,16 @@ def read_serial(port: str, conn_baud: int, expected_data: bytes):
if len(chunk) < 1:
break
buffer += chunk
- # print(f'Read {len(buffer)}')
- tt = perf_counter() - t1 - timeout # timeout is always included at the end
- rate = len(buffer)/tt
- print(f'Completed read {len(buffer)} bytes in {tt:.3f} s')
- print(f'Rate {rate:.2f} byte/s = {rate*8:.0f} bit/s = {rate*10:.0f} baud')
+ buffer_size = len(buffer)
+ print(f'\rReceived {buffer_size} / {expected_size} = {buffer_size/expected_size*100:.2f} %', end='')
+ total_time = perf_counter() - t1 - timeout # timeout is always included at the end
+ rate = len(buffer)/total_time
+ print()
+
return {
- 'valid': expected_data == buffer,
- 'loss_percent': (1 - len(buffer) / len(expected_data)) * 100,
- 'rx_bits': int(rate*8),
- 'rx_baud': int(rate*10),
+ 'total_time': total_time,
+ 'buffer': buffer
}
def write_serial(port: str, conn_baud: int, data: bytes, chunk_size: int, delay: float):
@@ -46,17 +43,3 @@ def write_serial(port: str, conn_baud: int, data: bytes, chunk_size: int, delay:
print(f'Completed write {data_len} bytes in {tt:.3f} s')
print(f'Rate {rate:.2f} byte/s = {rate*8:.0f} bit/s = {rate*10:.0f} baud')
-
-# Needs start after reset of set_module_baud()
-def run_ble_serial():
- terminal = 'konsole -e'
- binary = 'ble-serial'
- mac = '20:91:48:DF:76:D9'
- mtu = 20
- return subprocess.run(f'{terminal} {binary} -d {mac} -v --mtu {mtu}',
- shell=True, check=True)
-
-def signal_serial_end():
- pid = subprocess.check_output(['pgrep', 'ble-serial'])
- # print(f'Got PID {pid}')
- os.kill(int(pid), signal.SIGINT)
diff --git a/tests/test.py b/tests/test.py
index 7c39376..27eaefd 100644
--- a/tests/test.py
+++ b/tests/test.py
@@ -1,101 +1,104 @@
+import pytest
from concurrent.futures import ThreadPoolExecutor as TPE
-import csv
from time import sleep
-from hm11_at_config import set_module_baud
-from serial_handler import read_serial, write_serial, run_ble_serial, signal_serial_end
-
-# Interfaces
-PORT_UART = '/dev/ttyUSB0'
-PORT_BLE = '/tmp/ttyBLE'
-
-with open('../README.md', 'rb') as f:
- CONTENT = f.read()
- # print(CONTENT)
-
-# CONTENT = CONTENT[:1000]
-
-class Dir:
- _ports = [
- (PORT_BLE, PORT_UART),
- (PORT_UART, PORT_BLE),
- ]
-
- @classmethod
- def BLE_UART(cls):
- return cls(0)
-
- @classmethod
- def UART_BLE(cls):
- return cls(1)
-
- def __init__(self, dir: int):
- self.id = dir
- self.write = self._ports[dir][0]
- self.read = self._ports[dir][1]
-
- def __str__(self):
- return ('BLE >> UART', 'UART >> BLE')[self.id]
-
-class Log:
- def __init__(self, filename: str):
- fieldnames = ['dir', 'rated_baud', 'packet_size', 'delay',
- 'valid', 'loss_percent', 'rx_bits', 'rx_baud']
- self.csvfile = open(filename, 'w', newline='')
- self.writer = csv.DictWriter(self.csvfile, fieldnames=fieldnames)
- self.writer.writeheader()
-
- def write(self, data: dict):
- self.writer.writerow(data)
-
- def close(self):
- self.csvfile.close()
-
-def run_test(exc: TPE, log: Log, dir: Dir, baud: int, packet_size: int, delay: float):
- futw = executor.submit(write_serial, dir.write, baud, CONTENT, packet_size, delay)
- futr = executor.submit(read_serial, dir.read, baud, CONTENT)
-
- result = futr.result()
- result.update({
- 'dir': str(dir),
- 'rated_baud': baud,
- 'packet_size': packet_size,
- 'delay': delay,
- })
- log.write(result)
- print(result, end='\n\n')
-
-
-baud_to_test = [9600, 19200, 57600, 115200, 230400]
-prev = baud_to_test[0]
-
-# PACKET_SIZE = [4, 16, 64]
-PACKET_SIZE = [32]
-BYTE_DELAY = [0, 1/2000, 1/1000, 1/500] # bytes/sec
-
-if __name__ == "__main__":
- # Reset to start baud after fail
- # set_module_baud(PORT_UART, 19200, 9600)
- # os.remove(PORT_BLE)
-
- log = Log('results/log.csv')
-
- for baud in baud_to_test:
- print(f'\nTesting baud: {baud}')
-
- set_module_baud(PORT_UART, prev, baud)
- prev = baud
-
- with TPE(max_workers=3) as executor:
- futb = executor.submit(run_ble_serial)
- sleep(3)
-
- for size in PACKET_SIZE:
- for delay in BYTE_DELAY:
- run_test(executor, log, Dir.BLE_UART(), baud, size, size*delay)
- run_test(executor, log, Dir.UART_BLE(), baud, size, size*delay)
-
- signal_serial_end()
-
- set_module_baud(PORT_UART, prev, baud_to_test[0])
- log.close()
+from hm11_at_config import reset_baud, set_module_baud
+from serial_handler import read_serial, write_serial
+from network_handler import read_tcp, write_tcp
+from process_handler import run_ble_serial, signal_serial_end
+from tools import eval_rx, Log, gen_test_data
+
+from device_id import MacAddr
+from endpoints import SerialPath, IP_TCP
+
+
[email protected]
+def tpe():
+ return TPE(max_workers=3)
+
[email protected](params=[16*1024])
+def test_data(request):
+ return gen_test_data(request.param)
+
[email protected](scope="module", params=[57600])
+def baud(request):
+ return request.param
+
[email protected](scope="module")
+def hm10_serial(baud):
+ reset_baud(SerialPath.uart) # resets to 9600
+ set_module_baud(SerialPath.uart, 9600, baud)
+
+
[email protected](params=[20])
+def hm10_ble_client(tpe, request):
+ futb = tpe.submit(run_ble_serial, MacAddr.hm10_serial, request.param)
+ sleep(3) # wait for startup
+ yield
+ signal_serial_end()
+ sleep(3) # wait for teardown
+
[email protected](params=[20])
+def hm10_ble_client_tcp(tpe, request):
+ futb = tpe.submit(run_ble_serial, MacAddr.hm10_serial, request.param, tcp=True)
+ sleep(3) # wait for startup
+ yield
+ signal_serial_end()
+ sleep(3) # wait for teardown
+
+
[email protected](params=[64])
+def server_ble_client_remote_tcp(tpe, request):
+ futb = tpe.submit(run_ble_serial, MacAddr.intel8265_zenbook, request.param, tcp=True)
+ sleep(5) # wait for startup
+ yield
+ signal_serial_end()
+ sleep(3) # wait for teardown
+
+
[email protected](
+ "write_path, read_path", [
+ (SerialPath.uart, SerialPath.ble),
+ (SerialPath.ble, SerialPath.uart),
+])
+def test_uart_server(tpe: TPE, hm10_serial, baud, hm10_ble_client, test_data, write_path, read_path):
+ packet_size = 64
+ delay = packet_size*1/1000
+
+ futw = tpe.submit(write_serial, write_path, baud, test_data, packet_size, delay)
+ futr = tpe.submit(read_serial, read_path, baud, len(test_data))
+
+ result = eval_rx(**futr.result(), expected_data=test_data)
+ print(result)
+
+
+def test_uart_server_tcp(tpe: TPE, hm10_serial, baud, hm10_ble_client_tcp, test_data):
+ packet_size = 64
+ delay = packet_size*1/1000
+
+ futw = tpe.submit(write_serial, SerialPath.uart, baud, test_data, packet_size, delay)
+ futr = tpe.submit(read_tcp, IP_TCP.rtl8761_usb, 4444, len(test_data))
+ result = eval_rx(**futr.result(), expected_data=test_data)
+ print(result)
+ sleep(5)
+
+ futw = tpe.submit(write_tcp, IP_TCP.rtl8761_usb, 4444, test_data, packet_size, delay)
+ futr = tpe.submit(read_serial, SerialPath.uart, baud, len(test_data))
+ result = eval_rx(**futr.result(), expected_data=test_data)
+ print(result)
+
+
[email protected](
+ "write_ip, read_ip", [
+ (IP_TCP.rtl8761_usb, IP_TCP.intel8265_zenbook),
+ (IP_TCP.intel8265_zenbook, IP_TCP.rtl8761_usb),
+])
+def test_remote_server_tcp(tpe: TPE, server_ble_client_remote_tcp, test_data, write_ip, read_ip):
+ packet_size = 64
+ delay = packet_size*1/3600
+
+ futw = tpe.submit(write_tcp, write_ip, 4444, test_data, packet_size, delay)
+ futr = tpe.submit(read_tcp, read_ip, 4444, len(test_data))
+
+ result = eval_rx(**futr.result(), expected_data=test_data)
+ print(result)
\ No newline at end of file
diff --git a/tests/tools.py b/tests/tools.py
new file mode 100644
index 0000000..7354ee5
--- /dev/null
+++ b/tests/tools.py
@@ -0,0 +1,61 @@
+from concurrent.futures import ThreadPoolExecutor as TPE
+import csv
+
+def gen_test_data(byte_amount: int):
+ with open('../README.md', 'rb') as f:
+ content = f.read()
+ # print(CONTENT)
+
+ while byte_amount > len(content):
+ content += content
+ return content[:byte_amount]
+
+
+class Log:
+ def __init__(self, filename: str):
+ fieldnames = ['dir', 'rated_baud', 'packet_size', 'delay',
+ 'valid', 'loss_percent', 'rx_bits', 'rx_baud']
+ self.csvfile = open(filename, 'w', newline='')
+ self.writer = csv.DictWriter(self.csvfile, fieldnames=fieldnames)
+ self.writer.writeheader()
+
+ def write(self, data: dict):
+ self.writer.writerow(data)
+
+ def close(self):
+ self.csvfile.close()
+
+
+# def run_test(exc: TPE, log: Log, dir: Dir, baud: int, packet_size: int, delay: float):
+# futw = executor.submit(write_serial, dir.write, baud, CONTENT, packet_size, delay)
+# futr = executor.submit(read_serial, dir.read, baud, CONTENT)
+
+# result = futr.result()
+# result.update({
+# 'dir': str(dir),
+# 'rated_baud': baud,
+# 'packet_size': packet_size,
+# 'delay': delay,
+# })
+# log.write(result)
+# print(result, end='\n\n')
+
+
+def eval_rx(buffer: bytes, expected_data: bytes, total_time: float) -> dict:
+ rate = len(buffer)/total_time
+
+ print(f'Completed read {len(buffer)} bytes in {total_time:.3f} s')
+ print(f'Rate {rate:.2f} byte/s = {rate*8:.0f} bit/s = {rate*10:.0f} baud')
+
+ with open('/tmp/base.md', 'wb') as f:
+ f.write(expected_data)
+ with open('/tmp/buffer.md', 'wb') as f:
+ f.write(buffer)
+
+ assert expected_data == buffer
+ return {
+ 'valid': expected_data == buffer,
+ 'loss_percent': (1 - len(buffer) / len(expected_data)) * 100,
+ 'rx_bits': int(rate*8),
+ 'rx_baud': int(rate*10),
+ }
| Autoconnect w/ multiple devices?
I'm not sure if this is a bug or the intended behavior. I have two HM-10 devices active and have created a section in the autoconnect.ini file for each. One gets associated with the BLE virtual port and the other with BLE2. I have created both port pairs: BLE<-->COM7 and BLE2<-->COM8. Both work individually if I connect manually, but with the autoconnect.py script only one of them (seemingly at random) is ever connected. After connecting, it announces that it's entering the main loop, and that's it! Shouldn't it continue the scan and also pick up the other device? Or am I misunderstanding the purpose of the program?
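For concreteness, the setup described above implies an ini file along these lines — a minimal sketch where the section and key names are guesses, not the script's actual schema; only the BLE/BLE2 port names come from the report:

```python
# Hypothetical autoconnect.ini layout for the two-device setup; section and
# key names are assumptions made for illustration.
import configparser

cfg = configparser.ConfigParser()
cfg.read_string("""
[hm10-first]
port = BLE

[hm10-second]
port = BLE2
""")

# The behavior the report expects: iterate over every configured section and
# connect each one, rather than stopping after the first successful match.
for section in cfg.sections():
    print(section, "->", cfg[section]["port"])
```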
| 2022-06-12T16:15:03 |
|
alecthomas/importmagic | 17 | alecthomas__importmagic-17 | ['15'] | 27bf8d24a5e37cc334ef0027f5483d9edbdc10b0 | diff --git a/importmagic/importer.py b/importmagic/importer.py
index d1eb026..1d730f7 100644
--- a/importmagic/importer.py
+++ b/importmagic/importer.py
@@ -70,6 +70,11 @@ def __lt__(self, other):
class Imports(object):
+
+ _style = {'multiline': 'parentheses',
+ 'max_columns': 80,
+ }
+
def __init__(self, index, source):
self._imports = set()
self._imports_from = defaultdict(set)
@@ -78,6 +83,10 @@ def __init__(self, index, source):
self._index = index
self._parse(source)
+ @classmethod
+ def set_style(cls, **kwargs):
+ cls._style.update(kwargs)
+
def add_import(self, name, alias=None):
location = LOCATION_ORDER.index(self._index.location_for(name))
self._imports.add(Import(location, name, alias))
@@ -117,19 +126,37 @@ def get_update(self):
alias=' as {alias}'.format(alias=i.alias) if i.alias else ''
) for i in imports]
clauses.reverse()
+ line_len = len(line)
+ line_pieces = []
+ paren_used = False
while clauses:
clause = clauses.pop()
- if len(line) + len(clause) + 1 > 80:
- line += '\\\n'
- out.write(line)
+ next_len = line_len + len(clause) + 2
+ if next_len > self._style['max_columns']:
+ imported_items = ', '.join(line_pieces)
+ if self._style['multiline'] == 'parentheses':
+ line_tail = ',\n'
+ if not paren_used:
+ line += '('
+ paren_used = True
+ line_pieces.append('\n')
+ else:
+ # Use a backslash
+ line_tail = ', \\\n'
+ out.write(line + imported_items + line_tail)
line = ' '
- line += clause + (', ' if clauses else '')
+ line_len = len(line) + len(clause) + 2
+ line_pieces = [clause]
+ else:
+ line_pieces.append(clause)
+ line_len = next_len
+ line += ', '.join(line_pieces) + (')\n' if paren_used else '\n')
if line.strip():
- out.write(line + '\n')
+ out.write(line)
text = out.getvalue()
if text:
- groups.append(out.getvalue())
+ groups.append(text)
start = self._tokens[self._imports_begin][2][0] - 1
end = self._tokens[min(len(self._tokens) - 1, self._imports_end)][2][0] - 1
| diff --git a/importmagic/importer_test.py b/importmagic/importer_test.py
index ca9a3c7..a67e4b2 100644
--- a/importmagic/importer_test.py
+++ b/importmagic/importer_test.py
@@ -2,7 +2,7 @@
from textwrap import dedent
-from importmagic.importer import update_imports, get_update
+from importmagic.importer import Imports, get_update, update_imports
from importmagic.symbols import Scope
@@ -236,7 +236,8 @@ def test_from_import_as(index):
assert src == new_src.strip()
-def test_importer_wrapping(index):
+def test_importer_wrapping_escaped(index):
+ Imports.set_style(multiline='backslash', max_columns=80)
src = dedent('''
from injector import Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton
from waffle import stuff
@@ -256,6 +257,95 @@ def test_importer_wrapping(index):
new_src = update_imports(src, index, *scope.find_unresolved_and_unreferenced_symbols()).strip()
assert expected_src == new_src
+def test_importer_wrapping_escaped_longer(index):
+ Imports.set_style(multiline='backslash', max_columns=80)
+ src = dedent('''
+ from injector import Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, more, things, imported, foo, bar, baz, cux, lorem, ipsum
+ from waffle import stuff
+
+ Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, more, things, imported, foo, bar, baz, cux, lorem, ipsum, stuff
+ ''').strip()
+ expected_src = dedent('''
+ from injector import Binder, Injector, InstanceProvider, Key, MappingKey, \\
+ Module, Scope, ScopeDecorator, SequenceKey, bar, baz, cux, foo, imported, \\
+ inject, ipsum, lorem, more, provides, singleton, things
+ from waffle import stuff
+
+
+ Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, more, things, imported, foo, bar, baz, cux, lorem, ipsum, stuff
+ ''').strip()
+
+ scope = Scope.from_source(src)
+ new_src = update_imports(src, index, *scope.find_unresolved_and_unreferenced_symbols()).strip()
+ assert expected_src == new_src
+
+def test_importer_wrapping_parentheses(index):
+ Imports.set_style(multiline='parentheses', max_columns=80)
+ src = dedent('''
+ from injector import Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton
+ from waffle import stuff
+
+ Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, stuff
+ ''').strip()
+ expected_src = dedent('''
+ from injector import (Binder, Injector, InstanceProvider, Key, MappingKey,
+ Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton)
+ from waffle import stuff
+
+
+ Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, stuff
+ ''').strip()
+
+ scope = Scope.from_source(src)
+ new_src = update_imports(src, index, *scope.find_unresolved_and_unreferenced_symbols()).strip()
+ assert expected_src == new_src
+
+
+def test_importer_wrapping_parentheses_longer(index):
+ Imports.set_style(multiline='parentheses', max_columns=80)
+ src = dedent('''
+ from injector import Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, more, things, imported, foo, bar, baz, cux, lorem, ipsum
+ from waffle import stuff
+
+ Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, more, things, imported, foo, bar, baz, cux, lorem, ipsum, stuff
+ ''').strip()
+ expected_src = dedent('''
+ from injector import (Binder, Injector, InstanceProvider, Key, MappingKey,
+ Module, Scope, ScopeDecorator, SequenceKey, bar, baz, cux, foo, imported,
+ inject, ipsum, lorem, more, provides, singleton, things)
+ from waffle import stuff
+
+
+ Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, more, things, imported, foo, bar, baz, cux, lorem, ipsum, stuff
+ ''').strip()
+
+ scope = Scope.from_source(src)
+ new_src = update_imports(src, index, *scope.find_unresolved_and_unreferenced_symbols()).strip()
+ assert expected_src == new_src
+
+
+def test_importer_wrapping_colums(index):
+ Imports.set_style(multiline='parentheses', max_columns=120)
+ src = dedent('''
+ from injector import Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, more, things, imported, foo, bar, baz, cux, lorem, ipsum
+ from waffle import stuff
+
+ Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, more, things, imported, foo, bar, baz, cux, lorem, ipsum, stuff
+ ''').strip()
+ expected_src = dedent('''
+ from injector import (Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey,
+ bar, baz, cux, foo, imported, inject, ipsum, lorem, more, provides, singleton, things)
+ from waffle import stuff
+
+
+ Binder, Injector, InstanceProvider, Key, MappingKey, Module, Scope, ScopeDecorator, SequenceKey, inject, provides, singleton, more, things, imported, foo, bar, baz, cux, lorem, ipsum, stuff
+ ''').strip()
+
+ scope = Scope.from_source(src)
+ new_src = update_imports(src, index, *scope.find_unresolved_and_unreferenced_symbols()).strip()
+
+ assert expected_src == new_src
+
def test_importer_directives(index):
src = dedent('''
 | importmagic converts parenthesized multiline imports to backslash-continued style
It should keep whatever style is in use.
Thanks for your amazing work! :)
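To make the two styles concrete — the first form is what the file already uses, the second is what importmagic was rewriting it to; the patch above makes the preference configurable through the new `Imports.set_style` hook (its keys match the tests in the diff):

```python
# Parenthesized continuation (the style the file already uses):
from os.path import (abspath, basename, dirname, exists, isdir, isfile,
    join, realpath, relpath, split, splitext)

# Backslash continuation (the style importmagic was rewriting to):
from os.path import abspath, basename, dirname, exists, isdir, isfile, \
    join, realpath, relpath, split, splitext

# With the patch applied, the style and wrap column become configurable:
from importmagic.importer import Imports
Imports.set_style(multiline='parentheses', max_columns=80)
```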
| Agreed.
@birkenfeld are you planning on working on this anytime soon?
If not, I will try to do it when I have some spare time.
Sure, go ahead :)
| 2015-06-11T09:07:13 |
mstuttgart/brazilcep | 86 | mstuttgart__brazilcep-86 | ['40'] | df48160a1b249cbe1a37a223b527b9f3f1c53391 | diff --git a/.pylintrc b/.pylintrc
new file mode 100644
index 0000000..be83b06
--- /dev/null
+++ b/.pylintrc
@@ -0,0 +1,632 @@
+[MAIN]
+
+# Analyse import fallback blocks. This can be used to support both Python 2 and
+# 3 compatible code, which means that the block might have code that exists
+# only in one or another interpreter, leading to false positives when analysed.
+analyse-fallback-blocks=no
+
+# Clear in-memory caches upon conclusion of linting. Useful if running pylint
+# in a server-like mode.
+clear-cache-post-run=no
+
+# Load and enable all available extensions. Use --list-extensions to see a list
+# all available extensions.
+#enable-all-extensions=
+
+# In error mode, messages with a category besides ERROR or FATAL are
+# suppressed, and no reports are done by default. Error mode is compatible with
+# disabling specific errors.
+#errors-only=
+
+# Always return a 0 (non-error) status code, even if lint errors are found.
+# This is primarily useful in continuous integration scripts.
+#exit-zero=
+
+# A comma-separated list of package or module names from where C extensions may
+# be loaded. Extensions are loading into the active Python interpreter and may
+# run arbitrary code.
+extension-pkg-allow-list=
+
+# A comma-separated list of package or module names from where C extensions may
+# be loaded. Extensions are loading into the active Python interpreter and may
+# run arbitrary code. (This is an alternative name to extension-pkg-allow-list
+# for backward compatibility.)
+extension-pkg-whitelist=
+
+# Return non-zero exit code if any of these messages/categories are detected,
+# even if score is above --fail-under value. Syntax same as enable. Messages
+# specified are enabled, while categories only check already-enabled messages.
+fail-on=
+
+# Specify a score threshold under which the program will exit with error.
+fail-under=10
+
+# Interpret the stdin as a python script, whose filename needs to be passed as
+# the module_or_package argument.
+#from-stdin=
+
+# Files or directories to be skipped. They should be base names, not paths.
+ignore=CVS
+
+# Add files or directories matching the regular expressions patterns to the
+# ignore-list. The regex matches against paths and can be in Posix or Windows
+# format. Because '\\' represents the directory delimiter on Windows systems,
+# it can't be used as an escape character.
+ignore-paths=
+
+# Files or directories matching the regular expression patterns are skipped.
+# The regex matches against base names, not paths. The default value ignores
+# Emacs file locks
+ignore-patterns=^\.#
+
+# List of module names for which member attributes should not be checked
+# (useful for modules/projects where namespaces are manipulated during runtime
+# and thus existing member attributes cannot be deduced by static analysis). It
+# supports qualified module names, as well as Unix pattern matching.
+ignored-modules=
+
+# Python code to execute, usually for sys.path manipulation such as
+# pygtk.require().
+#init-hook=
+
+# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
+# number of processors available to use, and will cap the count on Windows to
+# avoid hangs.
+jobs=1
+
+# Control the amount of potential inferred values when inferring a single
+# object. This can help the performance when dealing with large functions or
+# complex, nested conditions.
+limit-inference-results=100
+
+# List of plugins (as comma separated values of python module names) to load,
+# usually to register additional checkers.
+load-plugins=
+
+# Pickle collected data for later comparisons.
+persistent=yes
+
+# Minimum Python version to use for version dependent checks. Will default to
+# the version used to run pylint.
+py-version=3.8
+
+# Discover python modules and packages in the file system subtree.
+recursive=no
+
+# Add paths to the list of the source roots. Supports globbing patterns. The
+# source root is an absolute path or a path relative to the current working
+# directory used to determine a package namespace for modules located under the
+# source root.
+source-roots=
+
+# When enabled, pylint would attempt to guess common misconfiguration and emit
+# user-friendly hints instead of false-positive error messages.
+suggestion-mode=yes
+
+# Allow loading of arbitrary C extensions. Extensions are imported into the
+# active Python interpreter and may run arbitrary code.
+unsafe-load-any-extension=no
+
+# In verbose mode, extra non-checker-related info will be displayed.
+#verbose=
+
+
+[BASIC]
+
+# Naming style matching correct argument names.
+argument-naming-style=snake_case
+
+# Regular expression matching correct argument names. Overrides argument-
+# naming-style. If left empty, argument names will be checked with the set
+# naming style.
+#argument-rgx=
+
+# Naming style matching correct attribute names.
+attr-naming-style=snake_case
+
+# Regular expression matching correct attribute names. Overrides attr-naming-
+# style. If left empty, attribute names will be checked with the set naming
+# style.
+#attr-rgx=
+
+# Bad variable names which should always be refused, separated by a comma.
+bad-names=foo,
+ bar,
+ baz,
+ toto,
+ tutu,
+ tata
+
+# Bad variable names regexes, separated by a comma. If names match any regex,
+# they will always be refused
+bad-names-rgxs=
+
+# Naming style matching correct class attribute names.
+class-attribute-naming-style=any
+
+# Regular expression matching correct class attribute names. Overrides class-
+# attribute-naming-style. If left empty, class attribute names will be checked
+# with the set naming style.
+#class-attribute-rgx=
+
+# Naming style matching correct class constant names.
+class-const-naming-style=UPPER_CASE
+
+# Regular expression matching correct class constant names. Overrides class-
+# const-naming-style. If left empty, class constant names will be checked with
+# the set naming style.
+#class-const-rgx=
+
+# Naming style matching correct class names.
+class-naming-style=PascalCase
+
+# Regular expression matching correct class names. Overrides class-naming-
+# style. If left empty, class names will be checked with the set naming style.
+#class-rgx=
+
+# Naming style matching correct constant names.
+const-naming-style=UPPER_CASE
+
+# Regular expression matching correct constant names. Overrides const-naming-
+# style. If left empty, constant names will be checked with the set naming
+# style.
+#const-rgx=
+
+# Minimum line length for functions/classes that require docstrings, shorter
+# ones are exempt.
+docstring-min-length=-1
+
+# Naming style matching correct function names.
+function-naming-style=snake_case
+
+# Regular expression matching correct function names. Overrides function-
+# naming-style. If left empty, function names will be checked with the set
+# naming style.
+#function-rgx=
+
+# Good variable names which should always be accepted, separated by a comma.
+good-names=i,
+ j,
+ k,
+ ex,
+ Run,
+ _
+
+# Good variable names regexes, separated by a comma. If names match any regex,
+# they will always be accepted
+good-names-rgxs=
+
+# Include a hint for the correct naming format with invalid-name.
+include-naming-hint=no
+
+# Naming style matching correct inline iteration names.
+inlinevar-naming-style=any
+
+# Regular expression matching correct inline iteration names. Overrides
+# inlinevar-naming-style. If left empty, inline iteration names will be checked
+# with the set naming style.
+#inlinevar-rgx=
+
+# Naming style matching correct method names.
+method-naming-style=snake_case
+
+# Regular expression matching correct method names. Overrides method-naming-
+# style. If left empty, method names will be checked with the set naming style.
+#method-rgx=
+
+# Naming style matching correct module names.
+module-naming-style=snake_case
+
+# Regular expression matching correct module names. Overrides module-naming-
+# style. If left empty, module names will be checked with the set naming style.
+#module-rgx=
+
+# Colon-delimited sets of names that determine each other's naming style when
+# the name regexes allow several styles.
+name-group=
+
+# Regular expression which should only match function or class names that do
+# not require a docstring.
+no-docstring-rgx=^_
+
+# List of decorators that produce properties, such as abc.abstractproperty. Add
+# to this list to register other decorators that produce valid properties.
+# These decorators are taken in consideration only for invalid-name.
+property-classes=abc.abstractproperty
+
+# Regular expression matching correct type alias names. If left empty, type
+# alias names will be checked with the set naming style.
+#typealias-rgx=
+
+# Regular expression matching correct type variable names. If left empty, type
+# variable names will be checked with the set naming style.
+#typevar-rgx=
+
+# Naming style matching correct variable names.
+variable-naming-style=snake_case
+
+# Regular expression matching correct variable names. Overrides variable-
+# naming-style. If left empty, variable names will be checked with the set
+# naming style.
+#variable-rgx=
+
+
+[CLASSES]
+
+# Warn about protected attribute access inside special methods
+check-protected-access-in-special-methods=no
+
+# List of method names used to declare (i.e. assign) instance attributes.
+defining-attr-methods=__init__,
+ __new__,
+ setUp,
+ asyncSetUp,
+ __post_init__
+
+# List of member names, which should be excluded from the protected access
+# warning.
+exclude-protected=_asdict,_fields,_replace,_source,_make,os._exit
+
+# List of valid names for the first argument in a class method.
+valid-classmethod-first-arg=cls
+
+# List of valid names for the first argument in a metaclass class method.
+valid-metaclass-classmethod-first-arg=mcs
+
+
+[DESIGN]
+
+# List of regular expressions of class ancestor names to ignore when counting
+# public methods (see R0903)
+exclude-too-few-public-methods=
+
+# List of qualified class names to ignore when counting class parents (see
+# R0901)
+ignored-parents=
+
+# Maximum number of arguments for function / method.
+max-args=5
+
+# Maximum number of attributes for a class (see R0902).
+max-attributes=7
+
+# Maximum number of boolean expressions in an if statement (see R0916).
+max-bool-expr=5
+
+# Maximum number of branch for function / method body.
+max-branches=12
+
+# Maximum number of locals for function / method body.
+max-locals=15
+
+# Maximum number of parents for a class (see R0901).
+max-parents=7
+
+# Maximum number of public methods for a class (see R0904).
+max-public-methods=20
+
+# Maximum number of return / yield for function / method body.
+max-returns=6
+
+# Maximum number of statements in function / method body.
+max-statements=50
+
+# Minimum number of public methods for a class (see R0903).
+min-public-methods=2
+
+
+[EXCEPTIONS]
+
+# Exceptions that will emit a warning when caught.
+overgeneral-exceptions=builtins.BaseException,builtins.Exception
+
+
+[FORMAT]
+
+# Expected format of line ending, e.g. empty (any line ending), LF or CRLF.
+expected-line-ending-format=
+
+# Regexp for a line that is allowed to be longer than the limit.
+ignore-long-lines=^\s*(# )?<?https?://\S+>?$
+
+# Number of spaces of indent required inside a hanging or continued line.
+indent-after-paren=4
+
+# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1
+# tab).
+indent-string=' '
+
+# Maximum number of characters on a single line.
+max-line-length=100
+
+# Maximum number of lines in a module.
+max-module-lines=1000
+
+# Allow the body of a class to be on the same line as the declaration if body
+# contains single statement.
+single-line-class-stmt=no
+
+# Allow the body of an if to be on the same line as the test if there is no
+# else.
+single-line-if-stmt=no
+
+
+[IMPORTS]
+
+# List of modules that can be imported at any level, not just the top level
+# one.
+allow-any-import-level=
+
+# Allow explicit reexports by alias from a package __init__.
+allow-reexport-from-package=no
+
+# Allow wildcard imports from modules that define __all__.
+allow-wildcard-with-all=no
+
+# Deprecated modules which should not be used, separated by a comma.
+deprecated-modules=
+
+# Output a graph (.gv or any supported image format) of external dependencies
+# to the given file (report RP0402 must not be disabled).
+ext-import-graph=
+
+# Output a graph (.gv or any supported image format) of all (i.e. internal and
+# external) dependencies to the given file (report RP0402 must not be
+# disabled).
+import-graph=
+
+# Output a graph (.gv or any supported image format) of internal dependencies
+# to the given file (report RP0402 must not be disabled).
+int-import-graph=
+
+# Force import order to recognize a module as part of the standard
+# compatibility libraries.
+known-standard-library=
+
+# Force import order to recognize a module as part of a third party library.
+known-third-party=enchant
+
+# Couples of modules and preferred modules, separated by a comma.
+preferred-modules=
+
+
+[LOGGING]
+
+# The type of string formatting that logging methods do. `old` means using %
+# formatting, `new` is for `{}` formatting.
+logging-format-style=old
+
+# Logging modules to check that the string format arguments are in logging
+# function parameter format.
+logging-modules=logging
+
+
+[MESSAGES CONTROL]
+
+# Only show warnings with the listed confidence levels. Leave empty to show
+# all. Valid levels: HIGH, CONTROL_FLOW, INFERENCE, INFERENCE_FAILURE,
+# UNDEFINED.
+confidence=HIGH,
+ CONTROL_FLOW,
+ INFERENCE,
+ INFERENCE_FAILURE,
+ UNDEFINED
+
+# Disable the message, report, category or checker with the given id(s). You
+# can either give multiple identifiers separated by comma (,) or put this
+# option multiple times (only on the command line, not in the configuration
+# file where it should appear only once). You can also use "--disable=all" to
+# disable everything first and then re-enable specific checks. For example, if
+# you want to run only the similarities checker, you can use "--disable=all
+# --enable=similarities". If you want to run only the classes checker, but have
+# no Warning level messages displayed, use "--disable=all --enable=classes
+# --disable=W".
+disable=raw-checker-failed,
+ bad-inline-option,
+ locally-disabled,
+ file-ignored,
+ suppressed-message,
+ useless-suppression,
+ deprecated-pragma,
+ use-symbolic-message-instead,
+ missing-timeout
+
+# Enable the message, report, category or checker with the given id(s). You can
+# either give multiple identifier separated by comma (,) or put this option
+# multiple time (only on the command line, not in the configuration file where
+# it should appear only once). See also the "--disable" option for examples.
+enable=c-extension-no-member
+
+
+[METHOD_ARGS]
+
+# List of qualified names (i.e., library.method) which require a timeout
+# parameter e.g. 'requests.api.get,requests.api.post'
+timeout-methods=requests.api.delete,requests.api.get,requests.api.head,requests.api.options,requests.api.patch,requests.api.post,requests.api.put,requests.api.request
+
+
+[MISCELLANEOUS]
+
+# List of note tags to take in consideration, separated by a comma.
+notes=FIXME,
+ XXX,
+ TODO
+
+# Regular expression of note tags to take in consideration.
+notes-rgx=
+
+
+[REFACTORING]
+
+# Maximum number of nested blocks for function / method body
+max-nested-blocks=5
+
+# Complete name of functions that never returns. When checking for
+# inconsistent-return-statements if a never returning function is called then
+# it will be considered as an explicit return statement and no message will be
+# printed.
+never-returning-functions=sys.exit,argparse.parse_error
+
+
+[REPORTS]
+
+# Python expression which should return a score less than or equal to 10. You
+# have access to the variables 'fatal', 'error', 'warning', 'refactor',
+# 'convention', and 'info' which contain the number of messages in each
+# category, as well as 'statement' which is the total number of statements
+# analyzed. This score is used by the global evaluation report (RP0004).
+evaluation=max(0, 0 if fatal else 10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10))
+
+# Template used to display messages. This is a python new-style format string
+# used to format the message information. See doc for all details.
+msg-template=
+
+# Set the output format. Available formats are text, parseable, colorized, json
+# and msvs (visual studio). You can also give a reporter class, e.g.
+# mypackage.mymodule.MyReporterClass.
+#output-format=
+
+# Tells whether to display a full report or only the messages.
+reports=no
+
+# Activate the evaluation score.
+score=yes
+
+
+[SIMILARITIES]
+
+# Comments are removed from the similarity computation
+ignore-comments=yes
+
+# Docstrings are removed from the similarity computation
+ignore-docstrings=yes
+
+# Imports are removed from the similarity computation
+ignore-imports=yes
+
+# Signatures are removed from the similarity computation
+ignore-signatures=yes
+
+# Minimum lines number of a similarity.
+min-similarity-lines=4
+
+
+[SPELLING]
+
+# Limits count of emitted suggestions for spelling mistakes.
+max-spelling-suggestions=4
+
+# Spelling dictionary name. No available dictionaries : You need to install
+# both the python package and the system dependency for enchant to work..
+spelling-dict=
+
+# List of comma separated words that should be considered directives if they
+# appear at the beginning of a comment and should not be checked.
+spelling-ignore-comment-directives=fmt: on,fmt: off,noqa:,noqa,nosec,isort:skip,mypy:
+
+# List of comma separated words that should not be checked.
+spelling-ignore-words=
+
+# A path to a file that contains the private dictionary; one word per line.
+spelling-private-dict-file=
+
+# Tells whether to store unknown words to the private dictionary (see the
+# --spelling-private-dict-file option) instead of raising a message.
+spelling-store-unknown-words=no
+
+
+[STRING]
+
+# This flag controls whether inconsistent-quotes generates a warning when the
+# character used as a quote delimiter is used inconsistently within a module.
+check-quote-consistency=no
+
+# This flag controls whether the implicit-str-concat should generate a warning
+# on implicit string concatenation in sequences defined over several lines.
+check-str-concat-over-line-jumps=no
+
+
+[TYPECHECK]
+
+# List of decorators that produce context managers, such as
+# contextlib.contextmanager. Add to this list to register other decorators that
+# produce valid context managers.
+contextmanager-decorators=contextlib.contextmanager
+
+# List of members which are set dynamically and missed by pylint inference
+# system, and so shouldn't trigger E1101 when accessed. Python regular
+# expressions are accepted.
+generated-members=
+
+# Tells whether to warn about missing members when the owner of the attribute
+# is inferred to be None.
+ignore-none=yes
+
+# This flag controls whether pylint should warn about no-member and similar
+# checks whenever an opaque object is returned when inferring. The inference
+# can return multiple potential results while evaluating a Python object, but
+# some branches might not be evaluated, which results in partial inference. In
+# that case, it might be useful to still emit no-member and other checks for
+# the rest of the inferred objects.
+ignore-on-opaque-inference=yes
+
+# List of symbolic message names to ignore for Mixin members.
+ignored-checks-for-mixins=no-member,
+ not-async-context-manager,
+ not-context-manager,
+ attribute-defined-outside-init
+
+# List of class names for which member attributes should not be checked (useful
+# for classes with dynamically set attributes). This supports the use of
+# qualified names.
+ignored-classes=optparse.Values,thread._local,_thread._local,argparse.Namespace
+
+# Show a hint with possible names when a member name was not found. The aspect
+# of finding the hint is based on edit distance.
+missing-member-hint=yes
+
+# The minimum edit distance a name should have in order to be considered a
+# similar match for a missing member name.
+missing-member-hint-distance=1
+
+# The total number of similar names that should be taken in consideration when
+# showing a hint for a missing member.
+missing-member-max-choices=1
+
+# Regex pattern to define which classes are considered mixins.
+mixin-class-rgx=.*[Mm]ixin
+
+# List of decorators that change the signature of a decorated function.
+signature-mutators=
+
+
+[VARIABLES]
+
+# List of additional names supposed to be defined in builtins. Remember that
+# you should avoid defining new builtins when possible.
+additional-builtins=
+
+# Tells whether unused global variables should be treated as a violation.
+allow-global-unused-variables=yes
+
+# List of names allowed to shadow builtins
+allowed-redefined-builtins=
+
+# List of strings which can identify a callback function by name. A callback
+# name must start or end with one of those strings.
+callbacks=cb_,
+ _cb
+
+# A regular expression matching the name of dummy variables (i.e. expected to
+# not be used).
+dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_
+
+# Argument names that match this expression will be ignored.
+ignored-argument-names=_.*|^ignored_|^unused_
+
+# Tells whether we should check for unused import in __init__ files.
+init-import=no
+
+# List of qualified module names which can have objects that can redefine
+# builtins.
+redefining-builtins-modules=six.moves,past.builtins,future.builtins,builtins,io
diff --git a/CHANGELOG.md b/CHANGELOG.md
index 7ea92d8..8fdac67 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,5 +1,10 @@
# Changelog
+## 6.1.0 (2023-10-01)
+
+- Add timeout parameters in requests
+- Update README typing errors
+
## 6.0.0 (2023-04-30)
- Rename lib to BrazilCEP
diff --git a/README.md b/README.md
index 6a7483d..1de9bbc 100644
--- a/README.md
+++ b/README.md
@@ -53,7 +53,6 @@ Currently supports several CEP API's:
- [ViaCEP](https://viacep.com.br)
- [ApiCEP (WideNet)](https://apicep.com)
-- [Correios (SIGEPWeb)](http://www.corporativo.correios.com.br/encomendas/sigepweb/doc/Manual_de_Implementacao_do_Web_Service_SIGEP_WEB.pdf)
> **BrazilCEP** is the new name of former **PyCEPCorreio** python library.
If you want to migrate the old code to the new version, please see the [migrate](https://brazilcep.readthedocs.io/en/latest/migrate/) section in docs.
diff --git a/brazilcep/apicep.py b/brazilcep/apicep.py
index b10c874..eb46ae9 100644
--- a/brazilcep/apicep.py
+++ b/brazilcep/apicep.py
@@ -17,18 +17,20 @@
URL = "https://ws.apicep.com/cep/{}.json"
-def fetch_address(cep):
+def fetch_address(cep, **kwargs):
"""Fetch VIACEP webservice for CEP address. VIACEP provide
- a REST API to query CEO requests.
+ a REST API to query CEP requests.
Args:
cep (str):CEP to be searched.
+ timeout (int): How many seconds to wait for the server to return data before giving up.
+ proxies (dict): Dictionary mapping protocol to the URL of the proxy.
Returns:
address (dict): respective address data from CEP.
"""
- response = requests.get(URL.format(cep), timeout=5)
+ response = requests.get(URL.format(cep), **kwargs) # pylint = missing-timeout
if response.status_code == 200:
# Transforma o objeto requests em um dict
diff --git a/brazilcep/client.py b/brazilcep/client.py
index 7d9ade3..bb93b98 100644
--- a/brazilcep/client.py
+++ b/brazilcep/client.py
@@ -8,13 +8,14 @@
:copyright: (c) 2023 by Michell Stuttgart.
:license: MIT, see LICENSE for more details.
"""
-
+import warnings
import enum
import re
-from . import apicep, correios, viacep
+from . import apicep, viacep
NUMBERS = re.compile(r"[^0-9]")
+DEFAULT_TIMEOUT = 5 # in seconds
class WebService(enum.Enum):
@@ -30,17 +31,19 @@ class WebService(enum.Enum):
services = {
- WebService.CORREIOS: correios.fetch_address,
+ WebService.CORREIOS: None,
WebService.VIACEP: viacep.fetch_address,
WebService.APICEP: apicep.fetch_address,
}
-def get_address_from_cep(cep, webservice=WebService.APICEP):
+def get_address_from_cep(cep, webservice=WebService.APICEP, timeout=None, proxies=None):
"""Returns the address corresponding to the zip (cep) code entered.
Args:
cep (str): CEP to be queried.
+ timeout (int): How many seconds to wait for the server to return data before giving up.
+ proxies (dict): Dictionary mapping protocol to the URL of the proxy.
Raises:
RequestError: When connection error occurs in CEP query
@@ -54,16 +57,33 @@ def get_address_from_cep(cep, webservice=WebService.APICEP):
"""
- if webservice not in (value for attribute, value in WebService.__dict__.items()):
+ if webservice not in (value for _, value in WebService.__dict__.items()):
raise KeyError(
"""Invalid webservice. Please use this options:
- WebService.CORREIOS, WebService.VIACEP, WebService.APICEP
+ WebService.VIACEP, WebService.APICEP
"""
)
- cep = _format_cep(cep)
+ if webservice == WebService.CORREIOS:
+ warnings.warn(
+ "CORREIOS support has been deprecated, and we intend to remove it"
+ " in a future release of BrazilCEP. Please use the WebService.VIACEP, WebService.APICEP"
+ " instead, as described in the documentation.",
+ DeprecationWarning,
+ )
+
+ # override deprecated option
+ webservice = WebService.APICEP
+
+ kwargs = {}
+
+ if timeout and isinstance(timeout, int):
+ kwargs["timeout"] = timeout
+
+ if proxies and isinstance(proxies, dict):
+ kwargs["proxies"] = proxies
- return services[webservice](_format_cep(cep))
+ return services[webservice](_format_cep(cep), **kwargs)
def _format_cep(cep):
diff --git a/brazilcep/correios.py b/brazilcep/correios.py
deleted file mode 100644
index 46e79e3..0000000
--- a/brazilcep/correios.py
+++ /dev/null
@@ -1,45 +0,0 @@
-"""
-brazilcep.correios
-~~~~~~~~~~~~~~~~
-
-This module implements the BrazilCEP Correios adapter.
-
-:copyright: (c) 2023 by Michell Stuttgart.
-:license: MIT, see LICENSE for more details.
-"""
-
-import zeep
-
-from . import exceptions
-
-URL = "https://apps.correios.com.br/SigepMasterJPA/AtendeClienteService/AtendeCliente?wsdl" # noqa
-
-
-def fetch_address(cep):
- """Fetch CORREIOS webservice for CEP address. CORREIOS provide
- a SOAP to query CEO requests.
-
- Args:
- cep (str):CEP to be searched.
-
- Returns:
- address (dict): respective address data from CEP.
- """
-
- try:
-
- client = zeep.Client(URL)
-
- address = client.service.consultaCEP(cep)
-
- return {
- "district": getattr(address, "bairro") or "",
- "cep": getattr(address, "cep") or "",
- "city": getattr(address, "cidade") or "",
- "street": getattr(address, "end") or "",
- "uf": getattr(address, "uf") or "",
- "complement": getattr(address, "complemento2") or "",
- }
-
- except zeep.exceptions.Fault as err:
- raise exceptions.BrazilCEPException(err)
diff --git a/brazilcep/viacep.py b/brazilcep/viacep.py
index f8aa76f..80f4f0a 100644
--- a/brazilcep/viacep.py
+++ b/brazilcep/viacep.py
@@ -17,18 +17,21 @@
URL = "http://www.viacep.com.br/ws/{}/json"
-def fetch_address(cep):
+def fetch_address(cep, **kwargs):
"""Fetch APICEP webservice for CEP address. APICEP provide
- a REST API to query CEO requests.
+ a REST API to query CEP requests.
Args:
cep (str):CEP to be searched.
+ timeout (int): How many seconds to wait for the server to return data before giving up.
+ proxies (dict): Dictionary mapping protocol to the URL of the proxy.
+
Returns:
address (dict): respective address data from CEP.
"""
- response = requests.get(URL.format(cep), timeout=5)
+ response = requests.get(URL.format(cep), **kwargs) # pylint = missing-timeout
if response.status_code == 200:
# Transforma o objeto requests em um dict
diff --git a/docs/CHANGELOG.md b/docs/CHANGELOG.md
index e5de155..474a708 100644
--- a/docs/CHANGELOG.md
+++ b/docs/CHANGELOG.md
@@ -1,5 +1,13 @@
# Changelog
+## 6.1.0 (2023-10-01)
+
+- Add timeout settings
+- Add proxy settings. From: https://github.com/mstuttgart/brazilcep/issues/40
+- Add real tests to APICEP and ViaCEP
+- Update docs and README
+- Deprecated 'Correios' webservice support
+
## 6.0.0 (2023-05-01)
- Rename lib to BrazilCEP
diff --git a/docs/index.md b/docs/index.md
index d55ac50..96393c7 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -60,6 +60,5 @@ Its objective is to provide a common query interface to all these search service
* Currently supports several CEP API's:
* [ViaCEP](https://viacep.com.br)
* [ApiCEP (WideNet)](https://apicep.com)
- * [Correios (SIGEPWeb)](http://www.corporativo.correios.com.br/encomendas/sigepweb/doc/Manual_de_Implementacao_do_Web_Service_SIGEP_WEB.pdf)
BrazilCEP started as a personal study project and evolved into a serious and open source project that is used by many developers on a daily basis.
diff --git a/docs/migrate.md b/docs/migrate.md
index 4c7b6ef..0298d41 100644
--- a/docs/migrate.md
+++ b/docs/migrate.md
@@ -10,7 +10,7 @@ It's simples migrate te code and require minimal steps.
First, rename the `import` statements from:
```python title="PyCEPCorreios"
-import pycepicorreios
+import pycepcorreios
```
to
diff --git a/docs/tutorial.md b/docs/tutorial.md
index deb87ef..0ffc1bb 100644
--- a/docs/tutorial.md
+++ b/docs/tutorial.md
@@ -35,6 +35,36 @@ get all the address information we need from this object:
The CEP always must be a string.
+## Timeout
+
+BrazilCEP also supports set a request timeout. Use the timeout option. The default timeout is 5 seconds:
+
+```python
+from brazilcep import get_address_from_cep
+
+# set timeout to 10 seconds
+get_address_from_cep('37503-130', timeout=10)
+
+```
+
+## Proxy
+
+BrazilCEP also supports proxy setings following *requests* pattern. For more details,
+please official *requests* doc [here](https://requests.readthedocs.io/en/latest/user/advanced/#proxies).
+
+```python
+from brazilcep import get_address_from_cep
+
+proxies = {
+ 'https': "00.00.000.000",
+ 'http': '00.00.000.000',
+}
+
+# set proxies
+get_address_from_cep('37503-130', proxies=proxies)
+
+```
+
## Unsing differents API's
!!! note
@@ -75,13 +105,6 @@ The possible values for the `webservice` parameter are:
* `Webservice.APICEP`
* `Webservice.VIACEP`
-* `Webservice.CORREIOS`
-
-!!! info
-
- The Correios CEP search service is an integral part of the SIGEPWeb service and
- to use it, it is necessary to have a contract with the Correios, as indicated
- in the Introduction chapter in the service `integration manual <>`_.
## Errors and Exceptions
diff --git a/noxfile.py b/noxfile.py
index d63bce4..9b0ecdc 100644
--- a/noxfile.py
+++ b/noxfile.py
@@ -1,6 +1,6 @@
-from nox import session
+import nox
-@session(python=['3.8', '3.9', '3.10', '3.11'], reuse_venv=True)
[email protected](python=['3.8', '3.9', '3.10', '3.11'], reuse_venv=True)
def test(session):
session.run('poetry', 'shell', external=True)
session.run('poetry', 'install', external=True)
diff --git a/pyproject.toml b/pyproject.toml
index 1298b55..bcb309c 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "brazilcep"
-version = "6.0.0"
+version = "6.1.0"
description = "Minimalist and easy-to-use python library designed to query CEP (brazilian zip codes) data"
authors = ["Michell Stuttgart <[email protected]>"]
license = "MIT"
@@ -8,7 +8,7 @@ readme = "README.md"
packages = [{include = "brazilcep"}]
include = ["CHANGELOG.md", "LICENSE"]
-keywords = ["correios", "viacep", "apicep", "cep", "brazil"]
+keywords = ["viacep", "apicep", "cep", "brazil"]
classifiers = [
"Topic :: Software Development :: Build Tools",
| diff --git a/.github/workflows/test-package.yml b/.github/workflows/test-package.yml
index 0d4d1fe..a16a546 100644
--- a/.github/workflows/test-package.yml
+++ b/.github/workflows/test-package.yml
@@ -13,7 +13,7 @@ jobs:
matrix:
python-version: ["3.8", "3.9", "3.10", "3.11"]
poetry-version: ["1.4.2"]
- os: [ubuntu-22.04, macos-latest, windows-latest]
+ os: [ubuntu-22.04, windows-latest]
runs-on: ${{ matrix.os }}
steps:
diff --git a/tests/test_apicep.py b/tests/test_apicep.py
index 2000470..1e49b23 100644
--- a/tests/test_apicep.py
+++ b/tests/test_apicep.py
@@ -1,8 +1,24 @@
+import os
+
import pytest
-import requests
from brazilcep import WebService, exceptions, get_address_from_cep
+IN_GITHUB_ACTIONS = os.getenv("GITHUB_ACTIONS") == "true"
+
+
[email protected](IN_GITHUB_ACTIONS, reason="Test doesn't work in Github Actions.")
+def test_fetch_address_success_real():
+ # Realizamos a consulta de CEP
+ address = get_address_from_cep("37.503-130", webservice=WebService.APICEP)
+
+ assert address["district"] == "Santo Antônio"
+ assert address["cep"] == "37503-130"
+ assert address["city"] == "Itajubá"
+ assert address["complement"] == ""
+ assert address["street"] == "Rua Geraldino Campista"
+ assert address["uf"] == "MG"
+
def test_fetch_address_success(requests_mock):
req_mock_text = """{
@@ -19,7 +35,9 @@ def test_fetch_address_success(requests_mock):
requests_mock.get("https://ws.apicep.com/cep/37503130.json", text=req_mock_text)
# Realizamos a consulta de CEP
- address = get_address_from_cep("37.503-130", webservice=WebService.APICEP)
+ address = get_address_from_cep(
+ "37.503-130", webservice=WebService.APICEP, timeout=5
+ )
assert address["district"] == "Santo Antônio"
assert address["cep"] == "37503-130"
@@ -41,8 +59,12 @@ def test_fetch_address_success(requests_mock):
requests_mock.get("https://ws.apicep.com/cep/99999999.json", text=req_mock_text)
+ proxies = {"https": "00.00.000.000", "http": "00.00.000.000"}
+
# Realizamos a consulta de CEP
- address = get_address_from_cep("99999-999", webservice=WebService.APICEP)
+ address = get_address_from_cep(
+ "99999-999", webservice=WebService.APICEP, timeout=5, proxies=proxies
+ )
assert address["district"] == ""
assert address["cep"] == "99999-999"
diff --git a/tests/test_client.py b/tests/test_client.py
index 8d1bdda..d76d102 100644
--- a/tests/test_client.py
+++ b/tests/test_client.py
@@ -1,6 +1,8 @@
+from warnings import catch_warnings
+
import pytest
-from brazilcep import get_address_from_cep
+from brazilcep import get_address_from_cep, WebService
from brazilcep.client import _format_cep
@@ -12,6 +14,27 @@ def test_search_error():
get_address_from_cep("37.503-130", webservice="VIACEP")
+def test_search_correios(requests_mock):
+ """Set mock get return"""
+ req_mock_text = """{
+ "status":200,
+ "ok":true,
+ "code":"37503-130",
+ "state":"MG",
+ "city":"Itajubá",
+ "district":"Santo Antônio",
+ "address":"Rua Geraldino Campista - até 214/215",
+ "statusText":"ok"
+ }"""
+
+ requests_mock.get("https://ws.apicep.com/cep/37503130.json", text=req_mock_text)
+
+ with catch_warnings(record=True) as warn:
+ get_address_from_cep("37503130", webservice=WebService.CORREIOS)
+ assert len(warn) == 1
+ assert issubclass(warn[0].category, DeprecationWarning)
+
+
def test_format_cep_success():
assert _format_cep("37.503-003") == "37503003"
assert _format_cep(" 37.503-003") == "37503003"
diff --git a/tests/test_correios.py b/tests/test_correios.py
deleted file mode 100644
index 7e145e0..0000000
--- a/tests/test_correios.py
+++ /dev/null
@@ -1,96 +0,0 @@
-from unittest import mock
-
-import pytest
-import zeep
-
-from brazilcep import WebService, exceptions, get_address_from_cep
-
-
[email protected]("zeep.Client")
-def test_fetch_address_success(mk):
- class MockClass:
- def __init__(self, dictionary):
- for k, v in dictionary.items():
- setattr(self, k, v)
-
- expected_address = {
- "bairro": "Santo Antônio",
- "cep": "37503130",
- "cidade": "Itajubá",
- "complemento2": "- até 214/215",
- "end": "Rua Geraldino Campista",
- "uf": "MG",
- "unidadesPostagem": [],
- }
-
- service_mk = mk.return_value.service
-
- # Criamos o mock para o valor de retorno
- service_mk.consultaCEP.return_value = MockClass(expected_address)
-
- # Realizamos a consulta de CEP
- address = get_address_from_cep("37503130", webservice=WebService.CORREIOS)
-
- assert address["district"] == "Santo Antônio"
- assert address["cep"] == "37503130"
- assert address["city"] == "Itajubá"
- assert address["complement"] == "- até 214/215"
- assert address["street"] == "Rua Geraldino Campista"
- assert address["uf"] == "MG"
-
- # Verifica se o metodo consultaCEP foi chamado
- # com os parametros corretos
- service_mk.consultaCEP.assert_called_with("37503130")
-
-
[email protected]("zeep.Client")
-def test_fetch_address_success_unique(mk):
- class MockClass:
- def __init__(self, dictionary):
- for k, v in dictionary.items():
- setattr(self, k, v)
-
- expected_address = {
- "bairro": "",
- "cep": "9999999",
- "cidade": "Sarandi",
- "complemento2": "",
- "end": "",
- "uf": "PR",
- "unidadesPostagem": [],
- }
-
- service_mk = mk.return_value.service
-
- # Criamos o mock para o valor de retorno
- service_mk.consultaCEP.return_value = MockClass(expected_address)
-
- # Realizamos a consulta de CEP
- address = get_address_from_cep("37503130", webservice=WebService.CORREIOS)
-
- assert address["district"] == ""
- assert address["cep"] == "9999999"
- assert address["city"] == "Sarandi"
- assert address["complement"] == ""
- assert address["street"] == ""
- assert address["uf"] == "PR"
-
- # Verifica se o metodo consultaCEP foi chamado
- # com os parametros corretos
- service_mk.consultaCEP.assert_called_with("37503130")
-
-
[email protected]("zeep.Client")
-def test_fetch_address_fail(mk):
- class MockClass:
- def __init__(self, dictionary):
- for k, v in dictionary.items():
- setattr(self, k, v)
-
- service_mk = mk.return_value.service
-
- # Criamos o mock para o valor de retorno
- service_mk.consultaCEP.side_effect = zeep.exceptions.Fault("error", 500)
-
- with pytest.raises(exceptions.BrazilCEPException):
- get_address_from_cep("37503130", webservice=WebService.CORREIOS)
diff --git a/tests/test_viacep.py b/tests/test_viacep.py
index 1035c56..86203fe 100644
--- a/tests/test_viacep.py
+++ b/tests/test_viacep.py
@@ -1,10 +1,27 @@
+import os
+
import pytest
-import requests
from brazilcep import WebService, exceptions, get_address_from_cep
+IN_GITHUB_ACTIONS = os.getenv("GITHUB_ACTIONS") == "true"
+
+
[email protected](IN_GITHUB_ACTIONS, reason="Test doesn't work in Github Actions.")
+def test_get_address_from_cep_success_real():
+ # Realizamos a consulta de CEP
+ address = get_address_from_cep("37.503-130", webservice=WebService.VIACEP)
+
+ assert address["district"] == "Santo Antônio"
+ assert address["cep"] == "37503-130"
+ assert address["city"] == "Itajubá"
+ assert address["complement"] == "até 214/215"
+ assert address["street"] == "Rua Geraldino Campista"
+ assert address["uf"] == "MG"
+
def test_get_address_from_cep_success(requests_mock):
+ """Set mock get return"""
req_mock_text = """{
\n "cep": "37503-130",
\n "logradouro": "Rua Geraldino Campista",
@@ -20,8 +37,12 @@ def test_get_address_from_cep_success(requests_mock):
requests_mock.get("http://www.viacep.com.br/ws/37503130/json", text=req_mock_text)
+ proxies = {"https": "00.00.000.000", "http": "00.00.000.000"}
+
# Realizamos a consulta de CEP
- address = get_address_from_cep("37.503-130", webservice=WebService.VIACEP)
+ address = get_address_from_cep(
+ "37.503-130", webservice=WebService.VIACEP, timeout=10, proxies=proxies
+ )
assert address["district"] == "Santo Antônio"
assert address["cep"] == "37503-130"
 | Add support for proxy settings
Sometimes it is necessary to use proxies other than the local ones. I would like to be able to set this when making requests to the correios API. Thanks!
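For reference, the interface the patch above ends up exposing (taken from docs/tutorial.md in the diff) looks like this:

```python
from brazilcep import get_address_from_cep

proxies = {
    'https': "00.00.000.000",
    'http': '00.00.000.000',
}

# timeout is in seconds; both keyword arguments are optional
get_address_from_cep('37503-130', timeout=10, proxies=proxies)
```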
 | @nicolasassi thanks. I will work on this feature.
Nice! I think it would be a fun improvement to your already awesome solution.
As I was in a rush I did it myself, but I'll paste my code here as it may help someone while you are developing the feature:
```
import requests
import zeep

# route all HTTP(S) traffic through the proxy
p = {'https': "00.00.000.000", 'http': '00.00.000.000'}
s = requests.session()
s.proxies.update(p)

# hand the proxied session to zeep's transport
t = zeep.Transport(session=s, timeout=20)
url = "https://apps.correios.com.br/SigepMasterJPA/AtendeClienteService/AtendeCliente?wsdl"
client = zeep.Client(url, transport=t)
endereco = client.service.consultaCEP('22795175')
``` | 2023-10-01T13:30:26 |
allenporter/flux-local | 856 | allenporter__flux-local-856 | ['855'] | d5e968783dab0a2377a3df6219f6c841cfaf764a | diff --git a/flux_local/helm.py b/flux_local/helm.py
index 8d711b4e..e312123c 100644
--- a/flux_local/helm.py
+++ b/flux_local/helm.py
@@ -266,6 +266,10 @@ async def template(
]
args.extend(self._flags)
args.extend(options.template_args)
+ if release.disable_openapi_validation:
+ args.append("--disable-openapi-validation")
+ if release.disable_schema_validation:
+ args.append("--skip-schema-validation")
if release.chart.version:
args.extend(
[
diff --git a/flux_local/manifest.py b/flux_local/manifest.py
index bbfd2fbf..4e0c0843 100644
--- a/flux_local/manifest.py
+++ b/flux_local/manifest.py
@@ -231,6 +231,16 @@ class HelmRelease(BaseManifest):
labels: dict[str, str] | None = field(metadata={"serialize": "omit"}, default=None)
"""A list of labels on the HelmRelease."""
+ disable_schema_validation: bool = field(
+ metadata={"serialize": "omit"}, default=False
+ )
+ """Prevents Helm from validating the values against the JSON Schema."""
+
+ disable_openapi_validation: bool = field(
+ metadata={"serialize": "omit"}, default=False
+ )
+ """Prevents Helm from validating the values against the Kubernetes OpenAPI Schema."""
+
@classmethod
def parse_doc(cls, doc: dict[str, Any]) -> "HelmRelease":
"""Parse a HelmRelease from a kubernetes resource object."""
@@ -248,6 +258,16 @@ def parse_doc(cls, doc: dict[str, Any]) -> "HelmRelease":
values_from = [
ValuesReference.from_dict(subdoc) for subdoc in values_from_dict
]
+ disable_schema_validation = any(
+ bag.get("disableSchemaValidation")
+ for key in ("install", "upgrade")
+ if (bag := spec.get(key)) is not None
+ )
+ disable_openapi_validation = any(
+ bag.get("disableOpenAPIValidation")
+ for key in ("install", "upgrade")
+ if (bag := spec.get(key)) is not None
+ )
return HelmRelease(
name=name,
namespace=namespace,
@@ -255,6 +275,8 @@ def parse_doc(cls, doc: dict[str, Any]) -> "HelmRelease":
values=spec.get("values"),
values_from=values_from,
labels=metadata.get("labels"),
+ disable_schema_validation=disable_schema_validation,
+ disable_openapi_validation=disable_openapi_validation,
)
@property
| diff --git a/tests/testdata/cluster9/apps/podinfo/podinfo.yaml b/tests/testdata/cluster9/apps/podinfo/podinfo.yaml
index 41e9ebf0..1e707987 100644
--- a/tests/testdata/cluster9/apps/podinfo/podinfo.yaml
+++ b/tests/testdata/cluster9/apps/podinfo/podinfo.yaml
@@ -10,5 +10,9 @@ spec:
kind: OCIRepository
name: podinfo
namespace: default
+ install:
+ disableOpenAPIValidation: true
+ upgrade:
+ disableSchemaValidation: true
values:
replicaCount: 2
diff --git a/tests/tool/__snapshots__/test_build.ambr b/tests/tool/__snapshots__/test_build.ambr
index d388b8c3..64e8de50 100644
--- a/tests/tool/__snapshots__/test_build.ambr
+++ b/tests/tool/__snapshots__/test_build.ambr
@@ -1712,7 +1712,11 @@
kind: OCIRepository
name: podinfo
namespace: default
+ install:
+ disableOpenAPIValidation: true
interval: 10m
+ upgrade:
+ disableSchemaValidation: true
values:
replicaCount: 2
@@ -6601,7 +6605,11 @@
kind: OCIRepository
name: podinfo
namespace: default
+ install:
+ disableOpenAPIValidation: true
interval: 10m
+ upgrade:
+ disableSchemaValidation: true
values:
replicaCount: 2
| Add support for helmrelease `.spec.[upgrade|install].disableSchemaValidation`
external-dns released a change that broke schema validation, see
https://github.com/onedr0p/home-ops/pull/9027
It would be nice if flux-local could read this value and disable schema validation.
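A minimal sketch of the mapping the patch above introduces — either the `install` or the `upgrade` block can switch a flag on, and the result is forwarded to `helm template`:

```python
# Mirrors the parsing added to flux_local/manifest.py in the diff above.
spec = {
    "install": {"disableOpenAPIValidation": True},
    "upgrade": {"disableSchemaValidation": True},
}

disable_schema_validation = any(
    bag.get("disableSchemaValidation")
    for key in ("install", "upgrade")
    if (bag := spec.get(key)) is not None
)
disable_openapi_validation = any(
    bag.get("disableOpenAPIValidation")
    for key in ("install", "upgrade")
    if (bag := spec.get(key)) is not None
)

# flux_local/helm.py then appends the corresponding helm flags:
args = []
if disable_openapi_validation:
    args.append("--disable-openapi-validation")
if disable_schema_validation:
    args.append("--skip-schema-validation")
print(args)
```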
| Assuming this means setting `--skip-schema-validation`. We can also support `--disable-openapi-validation` too while we're here.
That would be nice too! | 2025-03-21T13:38:30 |
serverless-operations/jeffy | 31 | serverless-operations__jeffy-31 | ['20'] | 0ffcb5715f10711187a85d901b127ab96bfb0b4a | diff --git a/README.md b/README.md
index 0a6c193..185ea2e 100644
--- a/README.md
+++ b/README.md
@@ -30,11 +30,14 @@ Mainly, Jeffy is focusing on three things.
* 2.1. [common](#common)
* 2.2. [rest_api](#rest_api)
* 2.3. [sqs](#sqs)
- * 2.4. [sns](#sns)
- * 2.5. [kinesis_streams](#kinesis_streams)
- * 2.6. [dynamodb_streams](#dynamodb_streams)
- * 2.7. [s3](#s3)
- * 2.8. [schedule](#schedule)
+ * 2.4. [sqs_raw](#sqs_raw)
+ * 2.5. [sns](#sns)
+ * 2.6. [sns_raw](#sns_raw)
+ * 2.7. [kinesis_streams](#kinesis_streams)
+ * 2.8. [kinesis_streams_raw](#kinesis_streams_raw)
+ * 2.9. [dynamodb_streams](#dynamodb_streams)
+ * 2.10. [s3](#s3)
+ * 2.11. [schedule](#schedule)
* 3. [SDK](#SDK)
* 3.1. [Kinesis Clinent](#KinesisClinent)
* 3.2. [SNS Client](#SNSClient)
@@ -271,7 +274,21 @@ def handler(event, context):
"""
```
-### 2.4. <a name='sns'></a>sns
+### 2.4. <a name='sqs_raw'></a>sqs_raw
+Decorator for sqs raw event (with all metadatas). Automaticlly parse `"event.Records"` list from SQS event source and pass the each records to main process of Lambda.
+
+Default encoding is `jeffy.encoding.json.JsonEncoding`.
+
+```python
+from jeffy.framework import get_app
+app = get_app()
+
[email protected]_raw()
+def handler(event, context):
+ return event['body']
+```
+
+### 2.5. <a name='sns'></a>sns
Decorator for sns event. Automaticlly parse `event.Records` list from SNS event source to each items for making it easy to treat it inside main process of Lambda.
Default encoding is `jeffy.encoding.json.JsonEncoding`.
@@ -295,7 +312,21 @@ def handler(event, context):
"""
```
-### 2.5. <a name='kinesis_streams'></a>kinesis_streams
+### 2.6. <a name='sns_raw'></a>sns_raw
+Decorator for SNS raw events (with all metadata). Automatically parses the `event.Records` list from the SNS event source and passes each record to the main process of the Lambda.
+
+Default encoding is `jeffy.encoding.json.JsonEncoding`.
+
+```python
+from jeffy.framework import get_app
+app = get_app()
+
[email protected]_raw()
+def handler(event, context):
+ return event['Sns']['Message']
+```
+
+### 2.7. <a name='kinesis_streams'></a>kinesis_streams
Decorator for kinesis stream event. Automatically parses the `event.Records` list from the Kinesis event source into items and base64-decodes them, making it easy to handle them inside the main process of the Lambda.
Default encoding is `jeffy.encoding.json.JsonEncoding`.
@@ -319,7 +350,21 @@ def handler(event, context):
"""
```
-### 2.6. <a name='dynamodb_streams'></a>dynamodb_streams
+### 2.8. <a name='kinesis_streams_raw'></a>kinesis_streams_raw
+Decorator for Kinesis Data Streams raw events (with all metadata). Automatically parses the `event.Records` list from the Kinesis Data Streams event source and passes each record to the main process of the Lambda.
+
+Default encoding is `jeffy.encoding.json.JsonEncoding`.
+
+```python
+from jeffy.framework import get_app
+app = get_app()
+
[email protected]_raw()
+def handler(event, context):
+ return event['kinesis']['data']
+```
+
+### 2.9. <a name='dynamodb_streams'></a>dynamodb_streams
Decorator for dynamodb stream event. Automatically parses the `event.Records` list from the Dynamodb event source into items, making it easy to handle them inside the main process of the Lambda.
```python
@@ -341,7 +386,7 @@ def handler(event, context):
"""
```
-### 2.7. <a name='s3'></a>s3
+### 2.10. <a name='s3'></a>s3
Decorator for S3 event. Automatically parses the body stream from the triggering S3 object and passes the S3 bucket and key name to the Lambda.
**This handler requires `s3:GetObject` permission.**
@@ -361,7 +406,7 @@ def handler(event, context):
event['metadata'] # object metadata
```
-### 2.8. <a name='schedule'></a>schedule
+### 2.11. <a name='schedule'></a>schedule
Decorator for schedule event. Just captures the correlation id before the main Lambda process; does nothing other than that.
```python
diff --git a/jeffy/handlers/sns.py b/jeffy/handlers/sns.py
index 694e219..f1b7158 100644
--- a/jeffy/handlers/sns.py
+++ b/jeffy/handlers/sns.py
@@ -22,9 +22,8 @@ def sns(
Usage::
>>> from jeffy.framework import get_app
- >>> from jeffy.encoding.json import JsonEncoding
>>> app = get_app()
- >>> @app.handlers.sns(encoding=JsonEncoding())
+ >>> @app.handlers.sns()
... def handler(event, context):
... return event['body']['foo']
"""
@@ -47,3 +46,41 @@ def wrapper(event, context): # type: ignore
return ret
return wrapper
return _sns
+
+ def sns_raw(
+ self,
+ encoding: Encoding = JsonEncoding(),
+ validator: Validator = NoneValidator()
+ ) -> Callable:
+ """
+ Decorator for SNS raw events (with all metadata).
+
+ Automatically divides 'Records' and passes each record to the main process of the Lambda.
+
+ Usage::
+ >>> from jeffy.framework import get_app
+ >>> app = get_app()
+ >>> @app.handlers.sns_raw()
+ ... def handler(event, context):
+ ... return event['Sns']['Message']
+ """
+ def _sns_raw(func: Callable): # type: ignore
+ @functools.wraps(func)
+ def wrapper(event, context): # type: ignore
+ ret = []
+ for record in event['Records']:
+ message = encoding.decode(record['Sns']['Message'].encode('utf-8'))
+ validator.validate(message)
+ self.capture_correlation_id(message)
+ record['Sns']['Message'] = message
+ try:
+ self.app.logger.info(message)
+ result = func(record, context)
+ self.app.logger.info(result)
+ ret.append(result)
+ except Exception as e:
+ self.app.logger.exception(e)
+ raise e
+ return ret
+ return wrapper
+ return _sns_raw
diff --git a/jeffy/handlers/sqs.py b/jeffy/handlers/sqs.py
index f662b25..6a43a05 100644
--- a/jeffy/handlers/sqs.py
+++ b/jeffy/handlers/sqs.py
@@ -47,3 +47,41 @@ def wrapper(event, context): # type: ignore
return ret
return wrapper
return _sqs
+
+ def sqs_raw(
+ self,
+ encoding: Encoding = JsonEncoding(),
+ validator: Validator = NoneValidator()
+ ) -> Callable:
+ """
+ Decorator for SQS raw events (with all metadata).
+
+ Automatically divides 'Records' and passes each record to the main process of the Lambda.
+
+ Usage::
+ >>> from jeffy.framework import get_app
+ >>> app = get_app()
+ >>> @app.handlers.sqs_raw()
+ ... def handler(event, context):
+ ... return event['body']
+ """
+ def _sqs_raw(func: Callable): # type: ignore
+ @functools.wraps(func)
+ def wrapper(event, context): # type: ignore
+ ret = []
+ for record in event['Records']:
+ message = encoding.decode(record['body'].encode('utf-8'))
+ validator.validate(message)
+ self.capture_correlation_id(message)
+ record['body'] = message
+ try:
+ self.app.logger.info(message)
+ result = func(record, context)
+ self.app.logger.info(result)
+ ret.append(result)
+ except Exception as e:
+ self.app.logger.exception(e)
+ raise e
+ return ret
+ return wrapper
+ return _sqs_raw
diff --git a/jeffy/handlers/streams.py b/jeffy/handlers/streams.py
index 2bd9af2..bd87259 100644
--- a/jeffy/handlers/streams.py
+++ b/jeffy/handlers/streams.py
@@ -60,9 +60,8 @@ def kinesis_streams(
Usage::
>>> from jeffy.framework import get_app
- >>> from jeffy.encoding.json import JsonEncoding
>>> app = get_app()
- >>> @app.handlers.kinesis_streams(encoding=JsonEncoding())
+ >>> @app.handlers.kinesis_streams()
... def handler(event, context):
... return event['body']['foo']
"""
@@ -86,3 +85,42 @@ def wrapper(event, context): # type: ignore
return ret
return wrapper
return _kinesis_streams
+
+ def kinesis_streams_raw(
+ self,
+ encoding: Encoding = JsonEncoding(),
+ validator: Validator = NoneValidator()
+ ) -> Callable:
+ """
+ Decorator for Kinesis stream raw events (with all metadata).
+
+ Automatically divides 'Records' and passes each record to the main process of the Lambda.
+
+ Usage::
+ >>> from jeffy.framework import get_app
+ >>> app = get_app()
+ >>> @app.handlers.kinesis_streams_raw()
+ ... def handler(event, context):
+ ... return event['kinesis']['data']
+ """
+
+ def _kinesis_streams_raw(func: Callable) -> Callable: # type: ignore
+ @functools.wraps(func)
+ def wrapper(event, context): # type: ignore
+ ret = []
+ for record in event['Records']:
+ message = encoding.decode(base64.b64decode(record['kinesis']['data']))
+ validator.validate(message)
+ self.capture_correlation_id(message)
+ record['kinesis']['data'] = message
+ try:
+ self.app.logger.info(message)
+ result = func(record, context)
+ self.app.logger.info(result)
+ ret.append(result)
+ except Exception as e:
+ self.app.logger.exception(e)
+ raise e
+ return ret
+ return wrapper
+ return _kinesis_streams_raw
diff --git a/jeffy/settings.py b/jeffy/settings.py
index 67ed9a4..a68c3ac 100644
--- a/jeffy/settings.py
+++ b/jeffy/settings.py
@@ -22,6 +22,8 @@ def __init__(
----------
logger: logging.Logger
Logger
+ handlers: List[logging.Handler]
+ Logging handlers
log_level: int = logging.INFO
Log level
correlation_attr_name: str = 'correlation_id'
| diff --git a/tests/jeffy/handlers/test_handlers.py b/tests/jeffy/handlers/test_handlers.py
index f62b6f1..375ea70 100644
--- a/tests/jeffy/handlers/test_handlers.py
+++ b/tests/jeffy/handlers/test_handlers.py
@@ -81,6 +81,26 @@ def test_kinesis_streams_error(self, handlers, mocker):
{'Records': [{'kinesis': {'data': base64.b64encode(json.dumps({'bar': 'buz'}).encode('utf-8'))}}]},
None)
+ def test_kinesis_streams_raw(self, handlers, mocker):
+ """It can process kinesis stream events."""
+ mock = mocker.Mock(return_value='foo')
+ kinesis_streams_raw = handlers.kinesis_streams_raw()
+ _kinesis_streams_raw = kinesis_streams_raw(mock)
+ assert _kinesis_streams_raw(
+ {'Records': [{'kinesis': {'data': base64.b64encode(json.dumps({'bar': 'buz'}).encode('utf-8'))}}]},
+ None
+ ) == ['foo']
+
+ def test_kinesis_streams_raw_error(self, handlers, mocker):
+ """It raises a exception."""
+ mock = mocker.Mock(side_effect=Exception('foo'))
+ kinesis_streams_raw = handlers.kinesis_streams_raw()
+ _kinesis_streams_raw = kinesis_streams_raw(mock)
+ with pytest.raises(Exception):
+ _kinesis_streams_raw(
+ {'Records': [{'kinesis': {'data': base64.b64encode(json.dumps({'bar': 'buz'}).encode('utf-8'))}}]},
+ None)
+
def test_sqs(self, handlers, mocker):
"""It can process sqs events."""
mock = mocker.Mock(return_value='foo')
@@ -99,6 +119,24 @@ def test_sqs_error(self, handlers, mocker):
with pytest.raises(Exception):
_sqs({'Records': [{'body': json.dumps({'bar': 'buz'})}]}, None)
+ def test_sqs_raw(self, handlers, mocker):
+ """It can process sqs events."""
+ mock = mocker.Mock(return_value='foo')
+ sqs_raw = handlers.sqs_raw()
+ _sqs_raw = sqs_raw(mock)
+ assert _sqs_raw(
+ {'Records': [{'body': json.dumps({'bar': 'buz'})}]},
+ None
+ ) == ['foo']
+
+ def test_sqs_raw_error(self, handlers, mocker):
+ """It raises a exception."""
+ mock = mocker.Mock(side_effect=Exception('foo'))
+ sqs_raw = handlers.sqs_raw()
+ _sqs_raw = sqs_raw(mock)
+ with pytest.raises(Exception):
+ _sqs_raw({'Records': [{'body': json.dumps({'bar': 'buz'})}]}, None)
+
def test_sns(self, handlers, mocker):
"""It can process sns events."""
mock = mocker.Mock(return_value='foo')
@@ -117,6 +155,24 @@ def test_sns_error(self, handlers, mocker):
with pytest.raises(Exception):
_sns({'Records': [{'Sns': {'Message': json.dumps({'bar': 'buz'})}}]}, None)
+ def test_sns_raw(self, handlers, mocker):
+ """It can process sns events."""
+ mock = mocker.Mock(return_value='foo')
+ sns_raw = handlers.sns_raw()
+ _sns_raw = sns_raw(mock)
+ assert _sns_raw(
+ {'Records': [{'Sns': {'Message': json.dumps({'bar': 'buz'})}}]},
+ None
+ ) == ['foo']
+
+ def test_sns_raw_error(self, handlers, mocker):
+ """It raises a exception."""
+ mock = mocker.Mock(side_effect=Exception('foo'))
+ sns_raw = handlers.sns_raw()
+ _sns_raw = sns_raw(mock)
+ with pytest.raises(Exception):
+ _sns_raw({'Records': [{'Sns': {'Message': json.dumps({'bar': 'buz'})}}]}, None)
+
def test_schedule(self, handlers, mocker):
"""It can process sns events."""
mock = mocker.Mock(return_value='foo')
| [Feature] with_metadata handler for sns, sqs and kinesis_stream
Currently, the SNS, SQS, and Kinesis handlers only handle the message body.
However, sometimes we need the other metadata, such as the Message ID.
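For illustration, a raw-mode handler could expose the whole record while still decoding the payload in place; a minimal sketch (the `messageId` key is the standard field of a Lambda SQS record, and `sqs_raw` is the decorator added by this PR):

```python
from jeffy.framework import get_app

app = get_app()

@app.handlers.sqs_raw()
def handler(record, context):
    message_id = record['messageId']  # SQS metadata we cannot reach today
    payload = record['body']          # payload, already decoded by the decorator
    return message_id, payload
```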
| 2020-07-17T03:28:42 |
|
nephila/djangocms-installer | 378 | nephila__djangocms-installer-378 | ['376'] | ba8d61159a568d3cdcb96084af934737a2b6eed8 | diff --git a/README.rst b/README.rst
index 5863b88f..eb82a43e 100644
--- a/README.rst
+++ b/README.rst
@@ -16,7 +16,7 @@ project.
Refer to `django CMS Tutorial`_ on how to properly setup your first django CMS project.
-.. warning:: Version 2.0 dropped support for Python 2.7, django CMS < 3.7 and Django < 2.2.
+.. warning:: Version 2.0 dropped support for Python 2.7, 3.5, django CMS < 3.7 and Django < 2.2.
More 1.2.x versions may be released after 1.2 is out in case important bugfixes are needed.
Usage
@@ -68,11 +68,13 @@ Supported versions
The current supported version matrix is the following:
-+----------------+--------------+--------------+
-| | Django 2.2 | Django 3.0 |
-+----------------+--------------+--------------+
-| django CMS 3.7 | Supported | Supported |
-+----------------+--------------+--------------+
++----------------+--------------+--------------+---------------+
+| | Django 2.2 | Django 3.0 | Django 3.1 |
++----------------+--------------+--------------+---------------+
+| django CMS 3.7 | Supported | Supported | Not supported |
++----------------+--------------+--------------+---------------+
+| django CMS 3.8 | Supported | Supported | Supported |
++----------------+--------------+--------------+---------------+
See `version 1.2`_ for older Django / django CMS versions support
@@ -93,7 +95,7 @@ might work, but are not officially supported.
Windows support
---------------
-The installer is tested on Windows 7 with Python versions 3.4.2 and 2.7.8 installed using
+The installer is tested on Windows 10 with Python version 3.8.6 installed using
official MSI packages available at http://python.org.
Please check that the ``.py`` extension is associated correctly with Python interpreter::
diff --git a/changes/376.feature b/changes/376.feature
new file mode 100644
index 00000000..1d8a8297
--- /dev/null
+++ b/changes/376.feature
@@ -0,0 +1,1 @@
+Add support for Django 3.1 / django CMS 3.8
diff --git a/djangocms_installer/config/__init__.py b/djangocms_installer/config/__init__.py
index eae91c22..34629689 100644
--- a/djangocms_installer/config/__init__.py
+++ b/djangocms_installer/config/__init__.py
@@ -433,6 +433,8 @@ def parse(args):
requirements.extend(data.REQUIREMENTS["django-2.2"])
elif django_version == "3.0":
requirements.extend(data.REQUIREMENTS["django-3.0"])
+ elif django_version == "3.1":
+ requirements.extend(data.REQUIREMENTS["django-3.1"])
requirements.extend(data.REQUIREMENTS["default"])
diff --git a/djangocms_installer/config/data.py b/djangocms_installer/config/data.py
index 3401e233..7a1af16f 100644
--- a/djangocms_installer/config/data.py
+++ b/djangocms_installer/config/data.py
@@ -1,4 +1,3 @@
-import sys
import time
bust = {"bust": time.time()}
@@ -22,22 +21,18 @@
DJANGOCMS_RC = "https://github.com/divio/django-cms/archive/release/3.7.x.zip?{bust}".format(**bust)
DJANGOCMS_BETA = DJANGOCMS_RC
DJANGOCMS_37 = "django-cms>=3.7,<3.8"
+DJANGOCMS_38 = "django-cms>=3.8,<3.9"
-DJANGOCMS_SUPPORTED = ("3.7", "stable", "lts", "develop", "rc")
-DJANGOCMS_STABLE = "3.7"
-DJANGOCMS_LTS = "3.7"
+DJANGOCMS_SUPPORTED = ("3.8", "3.7", "stable", "lts", "develop")
+DJANGOCMS_STABLE = "3.8"
+DJANGOCMS_LTS = "3.8"
DJANGOCMS_DEFAULT = DJANGOCMS_STABLE
DJANGO_DEVELOP = "https://github.com/django/django/archive/master.zip?{bust}".format(**bust)
DJANGO_BETA = "https://github.com/django/django/archive/master.zip?{bust}".format(**bust)
-if sys.version_info >= (3, 6):
- DJANGO_SUPPORTED = ("2.2", "3.0", "stable", "lts")
- DJANGO_STABLE = "3.0"
- DJANGO_LTS = "2.2"
-else:
- DJANGO_SUPPORTED = ("2.2", "stable", "lts")
- DJANGO_STABLE = "2.2"
- DJANGO_LTS = "2.2"
+DJANGO_SUPPORTED = ("3.1", "3.0", "2.2", "stable", "lts")
+DJANGO_STABLE = "3.1"
+DJANGO_LTS = "2.2"
DJANGO_DEFAULT = DJANGO_STABLE
@@ -57,12 +52,14 @@
}
VERSION_MATRIX = {
"3.7": ("2.2", "3.0"),
- DJANGOCMS_BETA: ("2.2", "3.0"),
- DJANGOCMS_RC: ("2.2", "3.0"),
- DJANGOCMS_DEVELOP: ("2.2", "3.0"),
+ "3.8": ("2.2", "3.0", "3.1"),
+ DJANGOCMS_BETA: ("2.2", "3.1"),
+ DJANGOCMS_RC: ("2.2", "3.1"),
+ DJANGOCMS_DEVELOP: ("2.2", "3.1"),
}
PACKAGE_MATRIX = {
"3.7": DJANGOCMS_37,
+ "3.8": DJANGOCMS_38,
DJANGOCMS_RC: DJANGOCMS_RC,
DJANGOCMS_BETA: DJANGOCMS_BETA,
DJANGOCMS_DEVELOP: DJANGOCMS_DEVELOP,
@@ -72,22 +69,34 @@
"default": ["html5lib>=1.0.1", "Pillow>=3.0", "six", "pytz"],
"django-2.2": ["django-classy-tags>=0.9", "django-sekizai>=1.0", "django-mptt>0.9"],
"django-3.0": ["django-classy-tags>=0.9", "django-sekizai>=1.0", "django-mptt>0.9"],
- "cms-3.7": ["djangocms-admin-style>=1.5,<1.6", "django-treebeard>=4.0,<5.0"],
+ "django-3.1": ["django-classy-tags>=2.0", "django-sekizai>=2.0", "django-mptt>0.9"],
+ "cms-3.7": ["djangocms-admin-style>=2.0,<3.0", "django-treebeard>=4.0,<5.0"],
+ "cms-3.8": ["djangocms-admin-style>=2.0,<3.0", "django-treebeard>=4.0,<5.0"],
"cms-master": [
"https://github.com/divio/djangocms-admin-style/archive/master.zip?{bust}".format(**bust),
"django-treebeard>=4.0,<5.0",
],
"plugins-3.7": [
- "djangocms-text-ckeditor>=3.7,<4.0",
- "djangocms-link>=2.5,<2.7",
- "djangocms-icon>=1.4,<1.6",
- "djangocms-style>=2.2,<2.4",
- "djangocms-googlemap>=1.3,<1.5",
- "djangocms-snippet>=2.2,<2.4",
- "djangocms-video>=2.1,<2.4",
- "djangocms-file>=2.3,<2.5",
- "djangocms-picture>=2.3,<2.5",
- "djangocms-bootstrap4>=1.5,<1.7",
+ "djangocms-text-ckeditor>=4.0,<5.0",
+ "djangocms-link>=3.0,<4.0",
+ "djangocms-icon>=2.0,<3.0",
+ "djangocms-style>=3.0,<4.0",
+ "djangocms-googlemap>=2.0,<3.0",
+ "djangocms-video>=3.0,<4.0",
+ "djangocms-file>=3.0,<4.0",
+ "djangocms-picture>=3.0,<4.0",
+ "djangocms-bootstrap4>=2.0,<3.0",
+ ],
+ "plugins-3.8": [
+ "djangocms-text-ckeditor>=4.0,<5.0",
+ "djangocms-link>=3.0,<4.0",
+ "djangocms-icon>=2.0,<3.0",
+ "djangocms-style>=3.0,<4.0",
+ "djangocms-googlemap>=2.0,<3.0",
+ "djangocms-video>=3.0,<4.0",
+ "djangocms-file>=3.0,<4.0",
+ "djangocms-picture>=3.0,<4.0",
+ "djangocms-bootstrap4>=2.0,<3.0",
],
"plugins-master": [
"https://github.com/divio/djangocms-text-ckeditor/archive/master.zip?{bust}" "".format(**bust),
@@ -130,7 +139,6 @@
* djangocms-file (File plugin)
* djangocms-picture (Image plugin)
* djangocms-style (Style plugin)
- * djangocms-snippet (Snippet plugin)
* djangocms-googlemap (GoogleMap plugin)
* djangocms-video (Video plugin)
"""
diff --git a/djangocms_installer/config/settings.py b/djangocms_installer/config/settings.py
index 658be1e6..e4d3a5f4 100644
--- a/djangocms_installer/config/settings.py
+++ b/djangocms_installer/config/settings.py
@@ -79,7 +79,6 @@
"djangocms_link",
"djangocms_picture",
"djangocms_style",
- "djangocms_snippet",
"djangocms_googlemap",
"djangocms_video",
)
diff --git a/docs/libraries.rst b/docs/libraries.rst
index a7d4241b..a51d5fcf 100644
--- a/docs/libraries.rst
+++ b/docs/libraries.rst
@@ -17,8 +17,8 @@ Libraries you would want to check:
The actual package name may vary depending on the platform / distribution you
are using; you should make sure you have the library headers file installed
-(mostly contained in package with `-dev` in its name: e.g. `libjpeg-dev` for
-`libjpeg` library).
+(mostly contained in package with ``-dev`` in its name: e.g. ``libjpeg-dev`` for
+``libjpeg`` library).
Examples
^^^^^^^^
diff --git a/docs/reference.rst b/docs/reference.rst
index ccdd2534..2466468d 100644
--- a/docs/reference.rst
+++ b/docs/reference.rst
@@ -50,7 +50,7 @@ The following arguments can be overridden in :ref:`wizard_mode`
.. note:: Django ``stable`` keyword is expanded to latest released Django version supported by django CMS
.. note:: Django ``lts`` keyword is expanded to latest released Django LTS supported by django CMS
.. note:: django-cms ``stable`` keyword is expanded to latest released django-cms version
-.. note:: django-cms ``lts`` keyword is expanded to latest released django-cms LTS version
+.. note:: django-cms ``lts`` keyword is expanded to latest released django-cms LTS version, or latest stable if LTS is not supported
.. warning:: if an unsupported combination of Django and django CMS version is selected, the
wizard exits reporting the error.
diff --git a/setup.cfg b/setup.cfg
index 4b8e093d..ae105c7a 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -48,6 +48,7 @@ classifiers =
Framework :: Django,
Framework :: Django :: 2.2,
Framework :: Django :: 3.0,
+ Framework :: Django :: 3.1,
Topic :: Software Development
[options]
| diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml
index 33bc4435..6ef19a0c 100644
--- a/.github/workflows/test.yml
+++ b/.github/workflows/test.yml
@@ -8,7 +8,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
- python-version: [3.8, 3.7, 3.6, 3.5]
+ python-version: [3.9, 3.8, 3.7, 3.6]
steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
diff --git a/tests/base.py b/tests/base.py
index 180ee5c4..96a773aa 100644
--- a/tests/base.py
+++ b/tests/base.py
@@ -123,8 +123,8 @@ def get_stable_django(latest=False, lts=False):
:param lts: Latest lts version
"""
if latest and not sys.version_info < (3, 6) and not lts:
- dj_ver = "3.0"
- match = "Django<3.1"
+ dj_ver = "3.1"
+ match = "Django<3.2"
else:
dj_ver = "2.2"
match = "Django<2.3"
@@ -137,6 +137,6 @@ def get_stable_djangocms():
Takes into account arguments and python version.
"""
- dj_ver = "3.7"
- match = "django-cms<3.8"
+ dj_ver = "3.8"
+ match = "django-cms<3.9"
return dj_ver, match
diff --git a/tests/config.py b/tests/config.py
index 3a27e85e..7638c2f1 100644
--- a/tests/config.py
+++ b/tests/config.py
@@ -26,7 +26,7 @@ def test_default_config(self):
self.assertEqual(conf_data.project_name, "example_prj")
- self.assertEqual(conf_data.cms_version, "3.7")
+ self.assertEqual(conf_data.cms_version, "3.8")
self.assertEqual(conf_data.django_version, dj_version)
self.assertEqual(conf_data.i18n, "yes")
self.assertEqual(conf_data.reversion, "yes")
@@ -361,17 +361,15 @@ def test_latest_version(self):
self.assertEqual(less_than_version("3"), "3.1")
self.assertEqual(less_than_version("3.0.1"), "3.1.1")
- @unittest.skipIf(sys.version_info[0] < 3, reason="django 2+ only supports python 3")
def test_supported_versions(self):
dj_version, dj_match = get_stable_django(latest=True)
- self.assertEqual(supported_versions("stable", "stable"), (dj_version, "3.7"))
+ self.assertEqual(supported_versions("stable", "stable"), (dj_version, "3.8"))
self.assertEqual(supported_versions("stable", "3.1.10"), (dj_version, None))
- self.assertEqual(supported_versions("stable", "rc"), (dj_version, data.DJANGOCMS_RC))
self.assertEqual(supported_versions("stable", "beta"), (dj_version, data.DJANGOCMS_BETA))
self.assertEqual(supported_versions("stable", "develop"), (dj_version, data.DJANGOCMS_DEVELOP))
- self.assertEqual(supported_versions("lts", "rc"), ("2.2", data.DJANGOCMS_RC))
- self.assertEqual(supported_versions("lts", "lts"), ("2.2", "3.7"))
+ self.assertEqual(supported_versions("lts", "stable"), ("2.2", "3.8"))
+ self.assertEqual(supported_versions("lts", "lts"), ("2.2", "3.8"))
with self.assertRaises(RuntimeError):
supported_versions("stable", "2.4"), ("1.5", "2.4")
@@ -405,8 +403,8 @@ def test_requirements(self):
self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_37) > -1)
self.assertTrue(conf_data.requirements.find(dj_match) > -1)
self.assertFalse(conf_data.requirements.find("django-reversion") > -1)
- self.assertTrue(conf_data.requirements.find("djangocms-text-ckeditor>=3.7,<4.0") > -1)
- self.assertTrue(conf_data.requirements.find("djangocms-admin-style>=1.5") > -1)
+ self.assertTrue(conf_data.requirements.find("djangocms-text-ckeditor>=4.0") > -1)
+ self.assertTrue(conf_data.requirements.find("djangocms-admin-style>=2.0") > -1)
self.assertTrue(conf_data.requirements.find("django-filer") > -1)
self.assertTrue(conf_data.requirements.find("cmsplugin-filer") == -1)
self.assertTrue(conf_data.requirements.find("djangocms-file") > -1)
@@ -427,12 +425,12 @@ def test_requirements(self):
]
)
- self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_37) > -1)
+ self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_38) > -1)
self.assertTrue(conf_data.requirements.find(dj_match) > -1)
self.assertFalse(conf_data.requirements.find("django-reversion") > -1)
self.assertTrue(conf_data.requirements.find("cmsplugin-filer") == -1)
self.assertTrue(conf_data.requirements.find("djangocms-admin-style") > -1)
- self.assertTrue(conf_data.requirements.find("djangocms-text-ckeditor>=3.7,<4.0") > -1)
+ self.assertTrue(conf_data.requirements.find("djangocms-text-ckeditor>=4.0") > -1)
self.assertTrue(conf_data.requirements.find("djangocms-bootstrap4") > -1)
self.assertTrue(conf_data.requirements.find("djangocms-file") > -1)
self.assertTrue(conf_data.requirements.find("djangocms-flash") == -1)
@@ -562,7 +560,7 @@ def test_requirements(self):
self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_37) > -1)
self.assertTrue(conf_data.requirements.find(dj_match) > -1)
self.assertFalse(conf_data.requirements.find("django-reversion") > -1)
- self.assertTrue(conf_data.requirements.find("djangocms-text-ckeditor>=3.7") > -1)
+ self.assertTrue(conf_data.requirements.find("djangocms-text-ckeditor>=4.0") > -1)
self.assertTrue(conf_data.requirements.find("djangocms-admin-style") > -1)
self.assertTrue(conf_data.requirements.find("pytz") > -1)
@@ -585,7 +583,7 @@ def test_requirements(self):
self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_37) > -1)
self.assertTrue(conf_data.requirements.find(dj_match) > -1)
self.assertFalse(conf_data.requirements.find("django-reversion") > -1)
- self.assertTrue(conf_data.requirements.find("djangocms-text-ckeditor>=3.7") > -1)
+ self.assertTrue(conf_data.requirements.find("djangocms-text-ckeditor>=4.0") > -1)
self.assertTrue(conf_data.requirements.find("djangocms-admin-style") > -1)
self.assertTrue(conf_data.requirements.find("pytz") > -1)
@@ -672,20 +670,17 @@ def test_requirements(self):
"-p" + self.project_dir,
"example_prj",
]
- if sys.version_info < (3, 5,):
- with self.assertRaises(SystemExit):
- conf_data = config.parse(requirements_21)
- else:
- conf_data = config.parse(requirements_21)
-
- self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_DEVELOP) > -1)
- self.assertTrue(conf_data.requirements.find(dj_match) > -1)
- self.assertFalse(conf_data.requirements.find("django-reversion") > -1)
- self.assertTrue(conf_data.requirements.find("djangocms-text-ckeditor") == -1)
- self.assertTrue(conf_data.requirements.find("djangocms-admin-style/archive/master.zip") > -1)
- self.assertTrue(conf_data.requirements.find("djangocms-teaser/archive/master.zip") == -1)
- self.assertTrue(conf_data.requirements.find("south") == -1)
- self.assertTrue(conf_data.requirements.find("psycopg2") > -1)
+
+ conf_data = config.parse(requirements_21)
+
+ self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_DEVELOP) > -1)
+ self.assertTrue(conf_data.requirements.find(dj_match) > -1)
+ self.assertFalse(conf_data.requirements.find("django-reversion") > -1)
+ self.assertTrue(conf_data.requirements.find("djangocms-text-ckeditor") == -1)
+ self.assertTrue(conf_data.requirements.find("djangocms-admin-style/archive/master.zip") > -1)
+ self.assertTrue(conf_data.requirements.find("djangocms-teaser/archive/master.zip") == -1)
+ self.assertTrue(conf_data.requirements.find("south") == -1)
+ self.assertTrue(conf_data.requirements.find("psycopg2") > -1)
dj_version, dj_match = get_stable_django(lts=True)
conf_data = config.parse(
@@ -699,7 +694,7 @@ def test_requirements(self):
]
)
- self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_37) > -1)
+ self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_38) > -1)
self.assertTrue(conf_data.requirements.find(dj_match) > -1)
dj_version, dj_match = get_stable_django(latest=True)
@@ -714,7 +709,7 @@ def test_requirements(self):
]
)
- self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_37) > -1)
+ self.assertTrue(conf_data.requirements.find(config.data.DJANGOCMS_38) > -1)
self.assertTrue(conf_data.requirements.find(dj_match) > -1)
def test_bootstrap(self):
@@ -930,7 +925,6 @@ def unused(self, config_data):
if hasattr(config_data, "requirements"):
delattr(config_data, "requirements")
- @unittest.skipIf(sys.version_info[0] < 3, reason="django 2+ only supports python 3")
def test_parse_config_file(self, *args):
"""Tests .config.__init__._parse_config_file function."""
dj_version, __ = get_stable_django(latest=True)
@@ -955,14 +949,14 @@ def test_parse_config_file(self, *args):
("django_version", dj_lts_version),
),
),
- ("config-03.ini", None, (("cms_version", "3.7"), ("i18n", "no"), ("django_version", dj_version),)),
- ("config-04.ini", None, (("cms_version", "3.7"), ("use_timezone", "no"))),
- ("config-05.ini", None, (("cms_version", "3.7"), ("timezone", "Europe/London"))),
- ("config-06.ini", None, (("cms_version", "3.7"), ("reversion", "no"))),
+ ("config-03.ini", None, (("cms_version", "3.8"), ("i18n", "no"), ("django_version", dj_version),)),
+ ("config-04.ini", None, (("cms_version", "3.8"), ("use_timezone", "no"))),
+ ("config-05.ini", None, (("cms_version", "3.8"), ("timezone", "Europe/London"))),
+ ("config-06.ini", None, (("cms_version", "3.8"), ("reversion", "no"))),
(
"config-07.ini",
None,
- (("cms_version", "3.7"), ("permissions", "no"), ("django_version", dj_lts_version)),
+ (("cms_version", "3.8"), ("permissions", "no"), ("django_version", dj_lts_version)),
),
(
"config-08.ini",
@@ -973,14 +967,14 @@ def test_parse_config_file(self, *args):
"config-09.ini",
None,
(
- ("cms_version", "3.7"),
+ ("cms_version", "3.8"),
("i18n", "yes"),
("languages", ["en", "ru"]),
("django_version", dj_lts_version),
),
),
("config-10.ini", "django_version", dj_lts_version),
- ("config-11.ini", "project_directory", "/test/me"),
+ ("config-11.ini", None, (("project_directory", "/test/me"), ("cms_version", "3.7"))),
("config-12.ini", None, (("bootstrap", True), ("django_version", dj_lts_version))),
("config-13.ini", "templates", "."),
("config-14.ini", "starting_page", True),
diff --git a/tests/django.py b/tests/django.py
index dee6b286..c6a2048e 100644
--- a/tests/django.py
+++ b/tests/django.py
@@ -256,19 +256,13 @@ def test_patch_django_22_37(self):
self.assertEqual(len(re.findall("MEDIA_ROOT =", settings)), 1)
self.assertEqual(len(re.findall("STATICFILES_DIRS", settings)), 1)
- @unittest.skipIf(
- sys.version_info[:2] not in ((3, 6), (3, 7), (3, 8),),
- reason="django 3.0 only supports python 3.6, 3.7 and 3.8",
- )
def test_patch_django_30_develop(self):
- dj_version, dj_match = get_stable_django(latest=True)
-
extra_path = os.path.join(os.path.dirname(__file__), "data", "extra_settings.py")
params = [
"--db=sqlite://localhost/test.db",
"--lang=en",
"--extra-settings=%s" % extra_path,
- "--django-version=%s" % dj_version,
+ "--django-version=3.0",
"-f",
"--cms-version=develop",
"--timezone=Europe/Moscow",
@@ -294,7 +288,7 @@ def test_patch_django_30_develop(self):
self.assertFalse(getattr(project.settings, "TEMPLATES_DIR", False))
self.assertTrue(config.get_settings().APPHOOK_RELOAD_MIDDLEWARE_CLASS in project.settings.MIDDLEWARE)
- def test_patch_django_22_rc(self):
+ def test_patch_django_22_38(self):
dj_version, dj_match = get_stable_django(lts=True)
extra_path = os.path.join(os.path.dirname(__file__), "data", "extra_settings.py")
@@ -304,14 +298,48 @@ def test_patch_django_22_rc(self):
"--extra-settings=%s" % extra_path,
"--django-version=%s" % dj_version,
"-f",
- "--cms-version=rc",
+ "--cms-version=3.8",
+ "--timezone=Europe/Moscow",
+ "-q",
+ "-u",
+ "-zno",
+ "--i18n=no",
+ "-p" + self.project_dir,
+ "test_patch_django_22_38",
+ ]
+ config_data = config.parse(params)
+ install.requirements(config_data.requirements)
+ django.create_project(config_data)
+ django.patch_settings(config_data)
+ django.copy_files(config_data)
+ # settings is importable even in non django environment
+ sys.path.append(config_data.project_directory)
+
+ project = __import__(config_data.project_name, globals(), locals(), ["settings"])
+
+ # checking for django options
+ self.assertTrue(project.settings.TEMPLATES)
+ self.assertFalse(getattr(project.settings, "TEMPLATES_DIR", False))
+ self.assertTrue(config.get_settings().APPHOOK_RELOAD_MIDDLEWARE_CLASS in project.settings.MIDDLEWARE)
+
+ def test_patch_django_31_38(self):
+ dj_version, dj_match = get_stable_django(lts=True)
+
+ extra_path = os.path.join(os.path.dirname(__file__), "data", "extra_settings.py")
+ params = [
+ "--db=sqlite://localhost/test.db",
+ "--lang=en",
+ "--extra-settings=%s" % extra_path,
+ "--django-version=3.1",
+ "-f",
+ "--cms-version=3.8",
"--timezone=Europe/Moscow",
"-q",
"-u",
"-zno",
"--i18n=no",
"-p" + self.project_dir,
- "test_patch_django_22_rc",
+ "test_patch_django_22_38",
]
config_data = config.parse(params)
install.requirements(config_data.requirements)
diff --git a/tests/fixtures/configs/config-01.ini b/tests/fixtures/configs/config-01.ini
index 6f8768e5..6cc3f714 100644
--- a/tests/fixtures/configs/config-01.ini
+++ b/tests/fixtures/configs/config-01.ini
@@ -7,7 +7,7 @@ reversion = yes
permissions = yes
languages =
django-version = stable
-cms-version = 3.7
+cms-version = 3.8
parent-dir = .
bootstrap = no
templates = no
diff --git a/tests/main.py b/tests/main.py
index d1f9160f..c197c783 100644
--- a/tests/main.py
+++ b/tests/main.py
@@ -41,7 +41,6 @@ def test_requirements_invocation(self):
self.assertTrue(stdout.find("djangocms-link") > -1)
self.assertTrue(stdout.find("djangocms-picture") > -1)
self.assertTrue(stdout.find("djangocms-style") > -1)
- self.assertTrue(stdout.find("djangocms-snippet") > -1)
self.assertTrue(stdout.find("cmsplugin-filer") == -1)
self.assertTrue(stdout.find("djangocms-teaser") == -1)
self.assertTrue(stdout.find("djangocms-video") > -1)
| Add django CMS 3.8
Add support for django CMS 3.8
| 2020-11-14T22:27:23 |
|
infobloxopen/infoblox-client | 288 | infobloxopen__infoblox-client-288 | ['281'] | 2c9dd6a3aaab8408f0e3131232cbb8da30b1be92 | diff --git a/infoblox_client/exceptions.py b/infoblox_client/exceptions.py
index 61610c6c..2f01c132 100644
--- a/infoblox_client/exceptions.py
+++ b/infoblox_client/exceptions.py
@@ -125,3 +125,8 @@ class InfobloxTimeoutError(InfobloxException):
class InfobloxGridTemporaryUnavailable(InfobloxException):
message = "Cannot perform operation %(operation)s with ref %(ref)s: " \
"%(content)s [code %(code)s]"
+
+
+class InfobloxFetchGotMultipleObjects(BaseExc):
+ message = "Fetch got multiple objects from the API. Unable to " \
+ "deserialize multiple API objects into one InfobloxObject."
diff --git a/infoblox_client/objects.py b/infoblox_client/objects.py
index 2fa636f9..7fa646ea 100644
--- a/infoblox_client/objects.py
+++ b/infoblox_client/objects.py
@@ -383,7 +383,14 @@ def fetch(self, only_ref=False):
"""Fetch object from NIOS by _ref or searchfields
Update existent object with fields returned from NIOS
- Return True on successful object fetch
+
+ Returns:
+ True if the object was successfully fetched; False otherwise.
+
+ Raises:
+ InfobloxFetchGotMultipleObjects:
+ If fetch got multiple objects from the API and is unable to
+ deserialize the API response into a single InfobloxObject.
"""
if self.ref:
reply = self.connector.get_object(
@@ -398,6 +405,10 @@ def fetch(self, only_ref=False):
search_dict,
return_fields=return_fields)
if reply:
+ if len(reply) > 1:
+ LOG.debug("Fetch got multiple objects from the API. Reply: %s",
+ reply)
+ raise ib_ex.InfobloxFetchGotMultipleObjects()
self.update_from_dict(reply[0], only_ref=only_ref)
return True
return False
@@ -12740,7 +12751,7 @@ class ARecord(ARecordBase):
'ms_ad_user_data', 'name', 'reclaimable',
'remove_associated_ptr', 'shared_record_group', 'ttl',
'use_ttl', 'view', 'zone']
- _search_for_update_fields = ['ipv4addr', 'view']
+ _search_for_update_fields = ['ipv4addr', 'name', 'view']
_updateable_search_fields = ['comment', 'creator', 'ddns_principal',
'ipv4addr', 'name']
_all_searchable_fields = ['comment', 'creator', 'ddns_principal',
@@ -12826,7 +12837,7 @@ class AAAARecord(ARecordBase):
'ms_ad_user_data', 'name', 'reclaimable',
'remove_associated_ptr', 'shared_record_group', 'ttl',
'use_ttl', 'view', 'zone']
- _search_for_update_fields = ['ipv6addr', 'view']
+ _search_for_update_fields = ['ipv6addr', 'name', 'view']
_updateable_search_fields = ['comment', 'creator', 'ddns_principal']
_all_searchable_fields = ['comment', 'creator', 'ddns_principal',
'ipv6addr', 'name', 'reclaimable', 'view',
diff --git a/tox.ini b/tox.ini
index d029f4b3..f6a45663 100644
--- a/tox.ini
+++ b/tox.ini
@@ -13,7 +13,7 @@ passenv = *
whitelist_externals=mkdir
commands =
mkdir -p {[setup]results}
- nosetests \
+ nosetests tests \
--with-xunit \
--xunit-file {[setup]results}/nose.xml \
--with-coverage \
| diff --git a/e2e_tests/README.md b/e2e_tests/README.md
new file mode 100644
index 00000000..3e1e8f3d
--- /dev/null
+++ b/e2e_tests/README.md
@@ -0,0 +1,19 @@
+# End-to-end tests
+
+This set of tests is intended for functionality validation on an existing WAPI
+instance.
+
+## How to run E2E tests
+
+First, you should export the connection parameters of an accessible, running `WAPI` instance.
+Then you can use the `unittest` module or any other test runner to run the tests
+in the `e2e_tests` directory.
+
+```bash
+export WAPI_HOST=<WAPI HOST IP> WAPI_USER=<WAPI USERNAME> WAPI_PASS=<WAPI PASSWORD>
+python3 -m unittest e2e_tests.test_objects
+```
+
+## Warning
+
+Please don't run these tests on a production WAPI instance.
diff --git a/e2e_tests/__init__.py b/e2e_tests/__init__.py
new file mode 100644
index 00000000..e69de29b
diff --git a/e2e_tests/connector_facade.py b/e2e_tests/connector_facade.py
new file mode 100644
index 00000000..49941e6c
--- /dev/null
+++ b/e2e_tests/connector_facade.py
@@ -0,0 +1,37 @@
+from collections import deque
+
+from infoblox_client.connector import Connector
+
+
+class E2EConnectorFacade(Connector):
+ """
+ Connector class facade for end-to-end tests.
+
+ This facade will remember all created objects, and then sweep those objects
+ after the test is done.
+ """
+
+ def __init__(self, options):
+ self.__delete_queue = deque()
+ super(E2EConnectorFacade, self).__init__(options)
+
+ def create_object(self, obj_type, payload, return_fields=None):
+ resp = super(E2EConnectorFacade, self).create_object(obj_type,
+ payload,
+ return_fields)
+ self.__delete_queue.append(resp['_ref'])
+ return resp
+
+ def delete_object(self, ref, delete_arguments=None):
+ self.__delete_queue.remove(ref)
+ return super(E2EConnectorFacade, self).delete_object(ref,
+ delete_arguments)
+
+ def sweep_objects(self):
+ """
+ Sweep all objects created by the connector.
+ """
+ while self.__delete_queue:
+ super(E2EConnectorFacade, self).delete_object(
+ self.__delete_queue.pop()
+ )
diff --git a/e2e_tests/test_objects.py b/e2e_tests/test_objects.py
new file mode 100644
index 00000000..ff0ad737
--- /dev/null
+++ b/e2e_tests/test_objects.py
@@ -0,0 +1,69 @@
+import os
+import unittest
+
+from e2e_tests.connector_facade import E2EConnectorFacade
+from infoblox_client.objects import ARecord, DNSZone, AAAARecord
+
+
+class TestObjectsE2E(unittest.TestCase):
+ def setUp(self):
+ opts = {
+ 'host': os.environ['WAPI_HOST'],
+ 'username': os.environ['WAPI_USER'],
+ 'password': os.environ['WAPI_PASS'],
+ }
+
+ self.connector = E2EConnectorFacade(opts)
+
+ def tearDown(self):
+ self.connector.sweep_objects()
+
+ def test_create_alias_a_record(self):
+ """Create two A records with different names, but pointing to the same
+ ipv4addr"""
+ DNSZone.create(self.connector,
+ view='default',
+ fqdn="e2e-test.com")
+
+ alias1, created = ARecord.create_check_exists(
+ self.connector,
+ view='default',
+ ipv4addr="192.168.1.25",
+ name='alias1.e2e-test.com',
+ )
+ self.assertTrue(created)
+
+ alias2, created = ARecord.create_check_exists(
+ self.connector,
+ view='default',
+ ipv4addr="192.168.1.25",
+ name='alias2.e2e-test.com',
+ )
+ self.assertTrue(created)
+ self.assertNotEqual(alias1._ref, alias2._ref)
+
+ def test_create_alias_aaaa_record(self):
+ """Create two AAAA records with different names, but pointing to the
+ same ipv6addr"""
+ DNSZone.create(self.connector,
+ view='default',
+ fqdn="e2e-test.com")
+
+ alias1, created = AAAARecord.create_check_exists(
+ self.connector,
+ view='default',
+ ipv6addr="aaaa:bbbb:cccc:dddd::",
+ name='alias1.e2e-test.com',
+ )
+ self.assertTrue(created)
+
+ alias2, created = AAAARecord.create_check_exists(
+ self.connector,
+ view='default',
+ ipv6addr="aaaa:bbbb:cccc:dddd::",
+ name='alias2.e2e-test.com',
+ )
+ self.assertTrue(created)
+ self.assertNotEqual(alias1._ref, alias2._ref)
+
+
diff --git a/tests/test_object_manager.py b/tests/test_object_manager.py
index a2f83cd9..28635d72 100644
--- a/tests/test_object_manager.py
+++ b/tests/test_object_manager.py
@@ -448,16 +448,13 @@ def test_bind_names_with_a_record(self):
ibom.bind_name_with_record_a(dns_view_name, ip, name,
bind_list, extattrs)
- exp_for_a = {'ipv4addr': ip, 'view': dns_view_name}
+ exp_for_a = {'name': name, 'ipv4addr': ip, 'view': dns_view_name}
exp_for_ptr = {'ptrdname': name, 'view': dns_view_name,
'ipv4addr': ip}
calls = [mock.call('record:a', exp_for_a, return_fields=mock.ANY),
mock.call('record:ptr', exp_for_ptr, return_fields=mock.ANY)]
connector.get_object.assert_has_calls(calls)
- exp_for_a['name'] = name
- exp_for_ptr['ptrdname'] = name
-
create_calls = [mock.call('record:a', exp_for_a, mock.ANY),
mock.call('record:ptr', exp_for_ptr, mock.ANY)]
connector.create_object.assert_has_calls(create_calls)
diff --git a/tests/test_objects.py b/tests/test_objects.py
index 36e7b8e2..217d4a7c 100644
--- a/tests/test_objects.py
+++ b/tests/test_objects.py
@@ -19,6 +19,7 @@
import mock
from infoblox_client import objects
+import infoblox_client.exceptions as ib_ex
REC = 'ZG5zLmJpbmRfbXgkLjQuY29tLm15X3pvbmUuZGVtby5teC5kZW1vLm15X3pvbmUuY29tLjE'
DEFAULT_HOST_RECORD = {
@@ -482,15 +483,40 @@ def test_update_from_dict(self):
self.assertEqual('192.168.1.0/24', net.network)
self.assertEqual(None, net.network_view)
+ def test_create_fails_on_multiple_api_objects(self):
+ """
+ If multiple objects are returned by the API, create should raise
+ exception.
+ """
+ a_records = [{'_ref': 'record:a/Awsdrefsasdwqoijvoriibtrni',
+ 'ip': '192.168.1.52',
+ 'name': 'record1'},
+ {'_ref': 'record:a/Awsdrefsasdwqoijvoriibtrna',
+ 'ip': '192.168.1.52',
+ 'name': 'record2'}]
+
+ connector = self._mock_connector(get_object=a_records)
+
+ with self.assertRaises(ib_ex.InfobloxFetchGotMultipleObjects):
+ objects.ARecordBase.create(connector,
+ ip='192.168.1.52',
+ view='view')
+
+ connector.get_object.assert_called_once_with(
+ 'record:a',
+ {'view': 'view', 'ipv4addr': '192.168.1.52'},
+ return_fields=[])
+
def test_update_fields_on_create(self):
a_record = [{'_ref': 'record:a/Awsdrefsasdwqoijvoriibtrni',
'ip': '192.168.1.52',
- 'name': 'other_name'}]
+ 'name': 'a_record',
+ 'comment': 'test_comment'}]
connector = self._mock_connector(get_object=a_record)
objects.ARecordBase.create(connector,
ip='192.168.1.52',
- name='some-new_name',
view='view',
+ comment='new_test_comment',
update_if_exists=True)
connector.get_object.assert_called_once_with(
'record:a',
@@ -498,18 +524,19 @@ def test_update_fields_on_create(self):
return_fields=[])
connector.update_object.assert_called_once_with(
a_record[0]['_ref'],
- {'name': 'some-new_name', 'ipv4addr': '192.168.1.52'},
+ {'ipv4addr': '192.168.1.52', 'comment': 'new_test_comment'},
mock.ANY)
def test_update_fields_on_create_v6(self):
aaaa_record = [{'_ref': 'record:aaaa/Awsdrefsasdwqoijvoriibtrni',
'ip': '2001:610:240:22::c100:68b',
- 'name': 'other_name'}]
+ 'name': 'aaaa_record',
+ 'comment': "test_comment"}]
connector = self._mock_connector(get_object=aaaa_record)
objects.ARecordBase.create(connector,
ip='2001:610:240:22::c100:68b',
- name='some-new_name',
view='view',
+ comment='new_test_comment',
update_if_exists=True)
connector.get_object.assert_called_once_with(
'record:aaaa',
@@ -517,7 +544,7 @@ def test_update_fields_on_create_v6(self):
return_fields=[])
connector.update_object.assert_called_once_with(
aaaa_record[0]['_ref'],
- {'name': 'some-new_name'},
+ {'comment': 'new_test_comment'},
mock.ANY)
def test_ip_version(self):
| A record create when parent domain doesn't exist false positive
Found this issue while testing. If an A record is submitted for a parent domain that doesn't exist, it would normally fail; however, if the IP address you are submitting the record for already has existing records, the client returns a reference to one of those objects instead.
Installed version:
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=16.04
DISTRIB_CODENAME=xenial
DISTRIB_DESCRIPTION="Ubuntu 16.04.7 LTS"
infoblox-client: 0.5.0
wapi_version tested: 2.6.1 and 2.9.7
Example:
```
record = objects.ARecord.create(ib_session, name='abc.domaindoesnotexist.com', ipv4addr='10.10.10.1', view='internal')
print(record.ref)
'record:a/ZG5zLmJpbmRfYSQuX2RlZmF1bHQuY29tLm1ldGxpZmUsdTAzNjUwOTAxMDItdmxhbjk4My1ncnAwLDEwLjEwLjEwLjE:abc.domainexists.com/internal'
```
IP **10.10.10.1** already existed in IPAM with an A record object.
Domain **domaindoesnotexist.com** does not exist in DNS.
Repeating the same process with an IP address that does not have an existing A record object returns the expected exception:
```
record = objects.ARecord.create(ib_session, name='abc.domaindoesnotexist.com', ipv4addr='10.10.10.2', view='internal')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "venv/lib/python3.6/site-packages/infoblox_client/objects.py", line 337, in create
**kwargs))
File "venv/lib/python3.6/site-packages/infoblox_client/objects.py", line 317, in create_check_exists
local_obj.return_fields)
File "venv/lib/python3.6/site-packages/infoblox_client/connector.py", line 50, in callee
return func(*args, **kwargs)
File "venv/lib/python3.6/site-packages/infoblox_client/connector.py", line 388, in create_object
code=r.status_code)
infoblox_client.exceptions.InfobloxCannotCreateObject: Cannot create 'record:a' object(s): b'{ "Error": "AdmConDataError: None (IBDataConflictError: IB.Data.Conflict:The action is not allowed. A parent was not found.)", \n "code": "Client.Ibap.Data.Conflict", \n "text": "The action is not allowed. A parent was not found."\n}' [code 400]
```
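The behaviour follows from the existence check that `create` performs before creating anything; a sketch of that lookup (values taken from the example above; the search-field list mirrors `ARecord._search_for_update_fields` in infoblox-client 0.5.0):

```python
# create_check_exists searches on ipv4addr and view only; 'name' is not
# part of the lookup, so any A record for that IP in the view matches.
search_fields = {'ipv4addr': '10.10.10.1', 'view': 'internal'}
reply = ib_session.get_object('record:a', search_fields, return_fields=[])
# A match short-circuits creation, so the mismatched record's _ref is
# returned and the missing parent zone is never checked.
```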
| 2021-10-12T09:15:57 |
|
infobloxopen/infoblox-client | 295 | infobloxopen__infoblox-client-295 | ['294'] | b6af5306673744ee1de9285ef6b1cd90e0838f6e | diff --git a/infoblox_client/objects.py b/infoblox_client/objects.py
index 6ca7bd1f..3c817b61 100644
--- a/infoblox_client/objects.py
+++ b/infoblox_client/objects.py
@@ -639,7 +639,7 @@ class Dhcpddns(SubObjects):
class Dhcpmember(SubObjects):
- _fields = ['ipv4addr', 'ipv6addr', 'name']
+ _fields = ['_struct', 'ipv4addr', 'ipv6addr', 'name']
class Dhcpoption(SubObjects):
@@ -7533,7 +7533,7 @@ class NetworkV4(Network):
_custom_field_processing = {
'logic_filter_rules': Logicfilterrule.from_dict,
- 'members': Msdhcpserver.from_dict,
+ 'members': Dhcpmember.from_dict,
'options': Dhcpoption.from_dict,
'vlans': Vlanlink.from_dict,
'zone_associations': Zoneassociation.from_dict,
@@ -8518,7 +8518,7 @@ class NetworkTemplateV4(NetworkTemplate):
_custom_field_processing = {
'logic_filter_rules': Logicfilterrule.from_dict,
- 'members': Msdhcpserver.from_dict,
+ 'members': Dhcpmember.from_dict,
'options': Dhcpoption.from_dict,
}
| diff --git a/tests/test_objects.py b/tests/test_objects.py
index 217d4a7c..bf8f4fab 100644
--- a/tests/test_objects.py
+++ b/tests/test_objects.py
@@ -121,6 +121,64 @@ def test_search_network_v6_using_network_field(self):
extattrs=None, force_proxy=False, return_fields=mock.ANY,
max_results=None)
+ def test_search_network_with_grid_dhcp_members(self):
+ found = {
+ '_ref': 'network/ZG5zLm5ldHdvcmskMTAuMC4zMi4wLzI0LzA:10.0.32.0/24/default',
+ 'members': [
+ {'_struct': 'dhcpmember', 'ipv4addr': '192.168.10.67', 'name': 'dhcp01.example.com'},
+ {'_struct': 'dhcpmember', 'ipv4addr': '192.168.11.67', 'name': 'dhcp02.example.com'}
+ ]
+ }
+ connector = self._mock_connector(get_object=[found])
+
+ network = objects.Network.search(
+ connector,
+ network_view='some-view',
+ network='10.0.32.0/24',
+ return_fields=['members']
+ )
+ connector.get_object.assert_called_once_with(
+ 'network',
+ {'network_view': 'some-view', 'network': '10.0.32.0/24'},
+ extattrs=None, force_proxy=False, return_fields=['members'],
+ max_results=None
+ )
+ self.assertIsInstance(network.members[0], objects.Dhcpmember)
+ self.assertIsInstance(network.members[1], objects.Dhcpmember)
+ self.assertEqual('dhcpmember', network.members[0]._struct)
+ self.assertEqual('dhcpmember', network.members[1]._struct)
+ self.assertEqual('192.168.10.67', network.members[0].ipv4addr)
+ self.assertEqual('192.168.11.67', network.members[1].ipv4addr)
+
+ def test_search_network_with_ms_dhcp_members(self):
+ found = {
+ '_ref': 'network/ZG5zLm5ldHdvcmskMTAuMC4zMi4wLzI0LzA:10.0.32.0/24/default',
+ 'members': [
+ {'_struct': 'msdhcpserver', 'ipv4addr': '192.168.10.67', 'name': 'dhcp01.example.com'},
+ {'_struct': 'msdhcpserver', 'ipv4addr': '192.168.11.67', 'name': 'dhcp02.example.com'}
+ ]
+ }
+ connector = self._mock_connector(get_object=[found])
+
+ network = objects.Network.search(
+ connector,
+ network_view='some-view',
+ network='10.0.32.0/24',
+ return_fields=['members']
+ )
+ connector.get_object.assert_called_once_with(
+ 'network',
+ {'network_view': 'some-view', 'network': '10.0.32.0/24'},
+ extattrs=None, force_proxy=False, return_fields=['members'],
+ max_results=None
+ )
+ self.assertIsInstance(network.members[0], objects.Dhcpmember)
+ self.assertIsInstance(network.members[1], objects.Dhcpmember)
+ self.assertEqual('msdhcpserver', network.members[0]._struct)
+ self.assertEqual('msdhcpserver', network.members[1]._struct)
+ self.assertEqual('192.168.10.67', network.members[0].ipv4addr)
+ self.assertEqual('192.168.11.67', network.members[1].ipv4addr)
+
def test_search_network_with_results(self):
found = {"_ref": "network/ZG5zLm5ldHdvcmskMTAuMzkuMTEuMC8yNC8w"
":10.39.11.0/24/default",
| _custom_field_processing bug in NetworkV4 fails to properly serialize members
A high-level API call to NetworkV4 fails to serialize members properly. This has been confirmed on infoblox-client 0.5.0 as well as _master_, and was tested and confirmed on NIOS v8.5.3 and v8.5.4.
Steps to reproduce:
1. create a DHCP-enabled network 192.168.1.0/24 and associate two (2) Grid DHCP Member servers (NOT Microsoft Servers)
2. perform the following call in the Infoblox client:
`res = objects.Network.search(conn, return_fields=['members'], network='192.168.1.0/24')`
The following is returned:
<pre>
NetworkV4: members="[Msdhcpserver: ipv4addr="192.168.1.5", name="dhcp01.example.com",
Msdhcpserver: ipv4addr="192.168.1.6", name="dhcp02.example.com"]",
_ref="network/ZG5zLm5ldHdvcmskMTAuMC4zMi4wLzI0LzA:192.168.1.0/24/default"
</pre>
There are a few issues with this:
* _struct is missing from each member entry; its value should be either 'msdhcpserver' or 'dhcpmember'
* It's interpreting the members as Msdhcpserver objects 100% of the time instead of using _struct to determine the type of DHCP server that is associated with the network
This issue is also seen with the NetworkTemplate.
This issue is only related to the high-level calls in objects.py and does NOT occur when using the low-level connector calls.
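With a corrected `_custom_field_processing` mapping, the same call should yield `Dhcpmember` objects that keep their `_struct` discriminator; a hedged sketch of the expected result (values mirror the reproduction above):

```python
res = objects.Network.search(conn, return_fields=['members'],
                             network='192.168.1.0/24')
member = res.members[0]
assert isinstance(member, objects.Dhcpmember)  # not Msdhcpserver
assert member._struct == 'dhcpmember'          # Grid member type preserved
assert member.ipv4addr == '192.168.1.5'
```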
| 2021-12-08T18:18:59 |