Compare commits


45 commits

Author SHA1 Message Date
Inada Naoki
c1ecd23dbf
drop Python 3.9 (#656) 2025-10-09 18:33:23 +09:00
Inada Naoki
af45640970
cython: freethreading_compatible (#654)
```
$ v3/bin/python -VV
Python 3.14.0 free-threading build (main, Oct  7 2025, 15:35:12) [Clang 20.1.4 ]

$ v3/bin/python -c 'import sys,msgpack; print(sys._is_gil_enabled())'
False
```
2025-10-09 15:53:08 +09:00
Inada Naoki
c2546eabc4
update setuptools requirements to >=78.1.1 (#653)
https://github.com/advisories/GHSA-5rjg-fvgr-3xxf
2025-10-09 15:28:01 +09:00
Inada Naoki
ef4f83df16
relax setuptools version (#652) 2025-10-09 13:00:46 +09:00
Inada Naoki
19b5d33ded
release v1.1.2 (#649) 2025-10-08 17:56:20 +09:00
TW
0f3c4be465
README: fix typos and grammar (#648) 2025-10-08 16:10:46 +09:00
MS-GITS
c2a9f1fda5
ci: add support for building windows on arm wheels (#643) 2025-09-26 13:17:17 +09:00
Inada Naoki
d9873dab04
ci: update cibuildwheel and drop Python 3.8 (#642) 2025-07-26 10:59:05 +09:00
Inada Naoki
42f056f3cf v1.1.1 2025-06-13 15:41:08 +09:00
Inada Naoki
e6445d3b92 v1.1.1rc1 2025-06-06 09:56:15 +09:00
Inada Naoki
fe9e620a60
upload to PyPI on create a release (#639) 2025-06-02 14:46:53 +09:00
Inada Naoki
cdc7644503
update cibuildwheel to v2.23.3 (#638) 2025-06-01 16:56:44 +09:00
Inada Naoki
868aa2cd83
update Cython to 3.1.1 (#637) 2025-05-31 12:45:06 +09:00
Edgar Ramírez Mondragón
0eeabfb453
Add Python 3.13 trove classifier (#626) 2024-10-08 18:04:56 +09:00
Inada Naoki
4587393b1a
release v1.1.0 (#622) 2024-09-10 01:58:00 +09:00
Thomas A Caswell
20a2b8eaa2
use PyLong_* instead of PyInt_* (#620)
Commit 9af421163cb8081414be347038dee7a82b29e8dd in Cython removed the backward-compatibility `#define`.
2024-08-21 14:56:00 +09:00
Inada Naoki
9d0c7f2f9c
Release v1.1.0rc2 (#619) 2024-08-19 20:36:26 +09:00
Inada Naoki
9e26d80ab2
update cibuildwheel to 2.20.0 (#618) 2024-08-19 17:56:01 +09:00
Inada Naoki
6e11368f5d
update Cython to 3.0.11 (#617) 2024-08-19 17:35:16 +09:00
Inada Naoki
0b1c47b06b
do not install cython as build dependency (#610)
Users cannot cythonize during `pip install msgpack`,
so remove Cython from the build dependencies.

If a user needs to use a different Cython version, they should download the sdist, unpack it,
cythonize manually, and run `pip install .`.
2024-05-07 22:01:54 +09:00
Inada Naoki
9cea8b6da2
Release v1.1.0rc1 (#609) 2024-05-07 20:49:23 +09:00
Inada Naoki
33e0e86f4e
Cleanup code and pyproject (#608)
* use isort
* fallback: use BytesIO instead of StringIO. We had dropped Python 2
already.
2024-05-06 11:46:31 +09:00
Inada Naoki
e0f0e145f1
better error checks (#607)
* check buffer exports
* add error messages
2024-05-06 03:33:48 +09:00
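The `_check_exports()` guard added here raises `BufferError` when a Packer with live buffer exports would be mutated. A minimal sketch of that behaviour, assuming the Cython extension module (`msgpack._cmsgpack`) is in use:

```
import msgpack

packer = msgpack.Packer(autoreset=False)
packer.pack("hello")

view = memoryview(packer)   # export the internal buffer
try:
    packer.reset()          # mutating a Packer with a live export now raises
except BufferError as exc:
    print(exc)
view.release()
packer.reset()              # fine once the export is released
```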
Inada Naoki
e1068087e0
cython: better exception handling (#606)
- use `except -1` instead of manual error handling
- use `PyUnicode_AsUTF8AndSize()`
- use `_pack()` and `_pack_inner()` instead of `while True:`
2024-05-06 02:13:12 +09:00
Inada Naoki
3da5818a3a
update readme (#605) 2024-05-06 02:12:46 +09:00
Inada Naoki
72e65feb0e
packer: add buf_size option (#604)
And change the default buffer size to 256KiB.

Signed-off-by: Rodrigo Tobar <rtobar@icrar.org>
Co-authored-by: Rodrigo Tobar <rtobar@icrar.org>
2024-05-06 00:49:12 +09:00
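A minimal sketch of the new option: pre-sizing the internal buffer (4 MiB here is an arbitrary illustrative value) avoids repeated reallocation when the packed size is roughly known in advance.

```
import msgpack

# Pre-size the internal buffer when the output size is roughly known.
packer = msgpack.Packer(buf_size=4 * 1024 * 1024)
payload = packer.pack([b"x" * 1024] * 3000)   # ~3 MiB packed without regrowing the buffer
```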
Inada Naoki
bf2413f915 ignore msgpack/*.c 2024-05-06 00:30:07 +09:00
Inada Naoki
a97b31437d
Remove unused code (#603) 2024-05-06 00:13:59 +09:00
Inada Naoki
52f8bc2e55
implement buffer protocol (#602)
Fix #479
2024-05-05 23:14:27 +09:00
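With the buffer protocol implemented, the packed bytes can be read without an extra copy. A minimal sketch, again assuming the Cython extension module:

```
import msgpack

packer = msgpack.Packer(autoreset=False)
packer.pack({"a": 1})
packer.pack([1, 2, 3])

with memoryview(packer) as view:   # Packer now supports the buffer protocol
    data = bytes(view)             # copy out only when needed

assert data == msgpack.packb({"a": 1}) + msgpack.packb([1, 2, 3])
```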
Inada Naoki
526ec9c923
update cibuildwheel to 2.17 (#601) 2024-05-04 16:49:22 +09:00
Inada Naoki
b389ccf2f7
update README (#561) 2024-05-04 16:10:37 +09:00
Inada Naoki
3e9a2a7419
Stop using c++ (#600)
Python 3.13a6+ & C++ & Cython cause compile errors on some compilers.
2024-05-04 16:01:48 +09:00
Inada Naoki
0602baf3ea
update Cython and setuptools (#599) 2024-05-03 18:20:09 +09:00
Inada Naoki
2eca765533
use ruff instead of black (#598) 2024-05-03 15:17:54 +09:00
hakan akyürek
e77672200b
Avoid using floating points during timestamp-datetime conversions (#591) 2024-04-20 07:46:30 +09:00
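The result is an exact round-trip between `Timestamp` and microsecond-precision `datetime` values; a minimal sketch:

```
from datetime import datetime, timezone
from msgpack import Timestamp

dt = datetime(2024, 4, 20, 7, 46, 30, 123456, tzinfo=timezone.utc)
ts = Timestamp.from_datetime(dt)
assert ts.nanoseconds == 123_456_000
assert ts.to_datetime() == dt   # no floating-point rounding error
```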
Inada Naoki
9aedf8ed7f
Release v1.0.8 (#583) 2024-03-01 20:35:28 +09:00
Inada Naoki
bf7bf88ad0
ci: update workflows (#582) 2024-03-01 20:09:55 +09:00
Inada Naoki
039022cecb
update Cython (#581) 2024-03-01 19:24:06 +09:00
Inada Naoki
140864249f
exclude C/Cython files from wheel (#577) 2023-12-20 20:46:04 +09:00
Inada Naoki
c78026102c
doc: use sphinx-rtd-theme (#575) 2023-11-15 23:34:32 +09:00
Inada Naoki
2982e9ff72
release v1.0.7 (#569) 2023-09-28 17:31:52 +09:00
Inada Naoki
acd0684392
do not fallback on build error (#568) 2023-09-28 15:25:10 +09:00
Inada Naoki
ecf03748c7
remove inline macro for msvc (#567) 2023-09-28 15:03:16 +09:00
Inada Naoki
b1b0edaeed
release v1.0.6 (#564) 2023-09-21 14:58:37 +09:00
Inada Naoki
e1d3d5d5c3
update actions (#563) 2023-09-15 12:02:06 +09:00
44 changed files with 705 additions and 928 deletions


@ -10,23 +10,24 @@ jobs:
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Python - name: Setup Python
uses: actions/setup-python@v4 uses: actions/setup-python@v5
with: with:
python-version: '3.x' python-version: '3.x'
architecture: 'x64' cache: "pip"
cache-dependency-path: |
- name: Checkout requirements.txt
uses: actions/checkout@v3 docs/requirements.txt
- name: Build - name: Build
shell: bash
run: | run: |
pip install -r requirements.txt pip install -r requirements.txt
make cython make cython
pip install .
- name: Sphinx Documentation Generator - name: Sphinx Documentation Generator
run: | run: |
pip install tox pip install -r docs/requirements.txt
tox -e sphinx make docs


@ -1,25 +1,22 @@
name: Black name: lint
on: ["push", "pull_request"] on: ["push", "pull_request"]
jobs: jobs:
black: lint:
# We want to run on external PRs, but not on our own internal PRs as they'll be run # We want to run on external PRs, but not on our own internal PRs as they'll be run
# by the push to the branch. # by the push to the branch.
if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository if: github.event_name == 'push' || github.event.pull_request.head.repo.full_name != github.repository
runs-on: ubuntu-latest runs-on: ubuntu-latest
steps: steps:
- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: '3.x'
architecture: 'x64'
- name: Checkout - name: Checkout
uses: actions/checkout@v3 uses: actions/checkout@v4
- name: Black Code Formatter - name: ruff check
run: | run: |
pip install black==22.3.0 pipx run ruff check --diff msgpack/ test/ setup.py
black -S --diff --check msgpack/ test/ setup.py
- name: ruff format
run: |
pipx run ruff format --diff msgpack/ test/ setup.py


@ -9,27 +9,33 @@ jobs:
test: test:
strategy: strategy:
matrix: matrix:
os: ["ubuntu-latest", "windows-latest", "macos-latest"] os: ["ubuntu-latest", "windows-latest", "windows-11-arm", "macos-latest"]
py: ["3.12", "3.11", "3.10", "3.9", "3.8"] py: ["3.14", "3.14t", "3.13", "3.12", "3.11", "3.10"]
exclude:
- os: windows-11-arm
py: "3.10"
runs-on: ${{ matrix.os }} runs-on: ${{ matrix.os }}
name: Run test with Python ${{ matrix.py }} on ${{ matrix.os }} name: Run test with Python ${{ matrix.py }} on ${{ matrix.os }}
steps: steps:
- name: Checkout - name: Checkout
uses: actions/checkout@v3 uses: actions/checkout@v5
- name: Set up Python - name: Set up Python
uses: actions/setup-python@v4 uses: actions/setup-python@v6
with: with:
python-version: ${{ matrix.py }} python-version: ${{ matrix.py }}
allow-prereleases: true allow-prereleases: true
cache: "pip" cache: "pip"
- name: Prepare
shell: bash
run: |
python -m pip install -r requirements.txt pytest
- name: Build - name: Build
shell: bash shell: bash
run: | run: |
pip install -r requirements.txt pytest
make cython make cython
pip install . pip install .
@ -42,3 +48,14 @@ jobs:
shell: bash shell: bash
run: | run: |
MSGPACK_PUREPYTHON=1 pytest -v test MSGPACK_PUREPYTHON=1 pytest -v test
- name: build packages
shell: bash
run: |
python -m build -nv
- name: upload packages
uses: actions/upload-artifact@v4
with:
name: dist-${{ matrix.os }}-${{ matrix.py }}
path: dist


@ -1,50 +1,88 @@
name: Build Wheels name: Build sdist and Wheels
on: on:
push: push:
branches: [main] branches: [main]
create: release:
types:
- published
workflow_dispatch:
jobs: jobs:
build_wheels: build_wheels:
strategy: strategy:
matrix: matrix:
os: ["ubuntu-latest", "windows-latest", "macos-latest"] # macos-13 is for intel
os: ["ubuntu-24.04", "ubuntu-24.04-arm", "windows-latest", "windows-11-arm", "macos-13", "macos-latest"]
runs-on: ${{ matrix.os }} runs-on: ${{ matrix.os }}
name: Build wheels on ${{ matrix.os }} name: Build wheels on ${{ matrix.os }}
steps: steps:
- name: Checkout - uses: actions/checkout@v5
uses: actions/checkout@v3 - uses: actions/setup-python@v6
- name: Set up QEMU
if: runner.os == 'Linux'
uses: docker/setup-qemu-action@v1
with:
platforms: arm64
- name: Set up Python 3.x
uses: actions/setup-python@v4
with: with:
python-version: "3.x" python-version: "3.x"
cache: "pip" cache: "pip"
- name: Cythonize
- name: Prepare
shell: bash shell: bash
run: | run: |
pip install -r requirements.txt pip install -r requirements.txt
make cython make cython
- name: Build - name: Build
uses: pypa/cibuildwheel@v2.15.0 uses: pypa/cibuildwheel@v3.2.0
env: env:
CIBW_TEST_REQUIRES: "pytest" CIBW_TEST_REQUIRES: "pytest"
CIBW_TEST_COMMAND: "pytest {package}/test" CIBW_TEST_COMMAND: "pytest {package}/test"
CIBW_ARCHS_LINUX: auto aarch64 CIBW_SKIP: "pp* cp38-* cp39-* cp310-win_arm64"
CIBW_ARCHS_MACOS: x86_64 universal2 arm64
CIBW_SKIP: pp* - name: Build sdist
if: runner.os == 'Linux' && runner.arch == 'X64'
run: |
pip install build
python -m build -s -o wheelhouse
- name: Upload Wheels to artifact - name: Upload Wheels to artifact
uses: actions/upload-artifact@v1 uses: actions/upload-artifact@v4
with: with:
name: Wheels name: wheels-${{ matrix.os }}
path: wheelhouse path: wheelhouse
# combine all wheels into one artifact
combine_wheels:
needs: [build_wheels]
runs-on: ubuntu-latest
steps:
- uses: actions/download-artifact@v4
with:
# unpacks all CIBW artifacts into dist/
pattern: wheels-*
path: dist
merge-multiple: true
- name: Upload Wheels to artifact
uses: actions/upload-artifact@v4
with:
name: wheels-all
path: dist
# https://github.com/pypa/cibuildwheel/blob/main/examples/github-deploy.yml
upload_pypi:
needs: [build_wheels]
runs-on: ubuntu-latest
environment: pypi
permissions:
id-token: write
if: github.event_name == 'release' && github.event.action == 'published'
# or, alternatively, upload to PyPI on every tag starting with 'v' (remove on: release above to use this)
# if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags/v')
steps:
- uses: actions/download-artifact@v4
with:
# unpacks all CIBW artifacts into dist/
pattern: wheels-*
path: dist
merge-multiple: true
- uses: pypa/gh-action-pypi-publish@release/v1
#with:
# To test: repository-url: https://test.pypi.org/legacy/

.gitignore

@ -2,11 +2,13 @@ MANIFEST
build/* build/*
dist/* dist/*
.tox .tox
.python-version
*.pyc *.pyc
*.pyo *.pyo
*.so *.so
*~ *~
msgpack/__version__.py msgpack/__version__.py
msgpack/*.c
msgpack/*.cpp msgpack/*.cpp
*.egg-info *.egg-info
/venv /venv


@ -18,6 +18,7 @@ python:
install: install:
- method: pip - method: pip
path: . path: .
- requirements: docs/requirements.txt
sphinx: sphinx:
configuration: docs/conf.py configuration: docs/conf.py


@ -1,7 +1,91 @@
1.0.6rc1 1.1.2
=====
Release Date: 2025-10-08
This release does not change source code. It updates only building wheels:
* Update Cython to v3.1.4
* Update cibuildwheel to v3.2.0
* Drop Python 3.8
* Add Python 3.14
* Add windows-arm
1.1.1
=====
Release Date: 2025-06-13
* No change from 1.1.1rc1.
1.1.1rc1
======== ========
Release Date: 2023-09-13 Release Date: 2025-06-06
* Update Cython to 3.1.1 and cibuildwheel to 2.23.3.
1.1.0
=====
Release Date: 2024-09-10
* use ``PyLong_*`` instead of ``PyInt_*`` for compatibility with
future Cython. (#620)
1.1.0rc2
========
Release Date: 2024-08-19
* Update Cython to 3.0.11 for better Python 3.13 support.
* Update cibuildwheel to 2.20.0 to build Python 3.13 wheels.
1.1.0rc1
========
Release Date: 2024-05-07
* Update Cython to 3.0.10 to reduce C warnings and future support for Python 3.13.
* Stop using C++ mode in Cython to reduce compile error on some compilers.
* ``Packer()`` has ``buf_size`` option to specify initial size of
internal buffer to reduce reallocation.
* The default internal buffer size of ``Packer()`` is reduced from
1MiB to 256KiB to optimize for common use cases. Use ``buf_size``
if you are packing large data.
* ``Timestamp.to_datetime()`` and ``Timestamp.from_datetime()`` become
more accurate by avoiding floating point calculations. (#591)
* The Cython code for ``Unpacker`` has been slightly rewritten for maintainability.
* The fallback implementation of ``Packer()`` and ``Unpacker()`` now uses keyword-only
arguments to improve compatibility with the Cython implementation.
1.0.8
=====
Release Date: 2024-03-01
* Update Cython to 3.0.8. This fixes memory leak when iterating
``Unpacker`` object on Python 3.12.
* Do not include C/Cython files in binary wheels.
1.0.7
=====
Release Date: 2023-09-28
* Fix build error of extension module on Windows. (#567)
* ``setup.py`` doesn't skip build error of extension module. (#568)
1.0.6
=====
Release Date: 2023-09-21
.. note::
v1.0.6 Wheels for Windows don't contain extension module.
Please upgrade to v1.0.7 or newer.
* Add Python 3.12 wheels (#517) * Add Python 3.12 wheels (#517)
* Remove Python 2.7, 3.6, and 3.7 support * Remove Python 2.7, 3.6, and 3.7 support
@ -107,7 +191,7 @@ Important changes
* unpacker: Default value of input limits are smaller than before to avoid DoS attack. * unpacker: Default value of input limits are smaller than before to avoid DoS attack.
If you need to handle large data, you need to specify limits manually. (#319) If you need to handle large data, you need to specify limits manually. (#319)
* Unpacker doesn't wrap underlaying ``ValueError`` (including ``UnicodeError``) into * Unpacker doesn't wrap underlying ``ValueError`` (including ``UnicodeError``) into
``UnpackValueError``. If you want to catch all exception during unpack, you need ``UnpackValueError``. If you want to catch all exception during unpack, you need
to use ``try ... except Exception`` with minimum try code block. (#323, #233) to use ``try ... except Exception`` with minimum try code block. (#323, #233)


@ -1,13 +1,5 @@
# Developer's note # Developer's note
## Wheels
Wheels for macOS and Linux are built on Travis and AppVeyr, in
[methane/msgpack-wheels](https://github.com/methane/msgpack-wheels) repository.
Wheels for Windows are built on Github Actions in this repository.
### Build ### Build
``` ```


@ -1,5 +1,5 @@
include setup.py include setup.py
include COPYING include COPYING
include README.md include README.md
recursive-include msgpack *.h *.c *.pyx *.cpp recursive-include msgpack *.h *.c *.pyx
recursive-include test *.py recursive-include test *.py


@ -4,9 +4,17 @@ PYTHON_SOURCES = msgpack test setup.py
all: cython all: cython
python setup.py build_ext -i -f python setup.py build_ext -i -f
.PHONY: black .PHONY: format
black: format:
black $(PYTHON_SOURCES) ruff format $(PYTHON_SOURCES)
.PHONY: lint
lint:
ruff check $(PYTHON_SOURCES)
.PHONY: doc
doc:
cd docs && sphinx-build -n -v -W --keep-going -b html -d doctrees . html
.PHONY: pyupgrade .PHONY: pyupgrade
pyupgrade: pyupgrade:
@ -14,7 +22,7 @@ pyupgrade:
.PHONY: cython .PHONY: cython
cython: cython:
cython --cplus msgpack/_cmsgpack.pyx cython msgpack/_cmsgpack.pyx
.PHONY: test .PHONY: test
test: cython test: cython

README.md

@ -3,60 +3,13 @@
[![Build Status](https://github.com/msgpack/msgpack-python/actions/workflows/wheel.yml/badge.svg)](https://github.com/msgpack/msgpack-python/actions/workflows/wheel.yml) [![Build Status](https://github.com/msgpack/msgpack-python/actions/workflows/wheel.yml/badge.svg)](https://github.com/msgpack/msgpack-python/actions/workflows/wheel.yml)
[![Documentation Status](https://readthedocs.org/projects/msgpack-python/badge/?version=latest)](https://msgpack-python.readthedocs.io/en/latest/?badge=latest) [![Documentation Status](https://readthedocs.org/projects/msgpack-python/badge/?version=latest)](https://msgpack-python.readthedocs.io/en/latest/?badge=latest)
## What's this ## What is this?
[MessagePack](https://msgpack.org/) is an efficient binary serialization format. [MessagePack](https://msgpack.org/) is an efficient binary serialization format.
It lets you exchange data among multiple languages like JSON. It lets you exchange data among multiple languages like JSON.
But it's faster and smaller. But it's faster and smaller.
This package provides CPython bindings for reading and writing MessagePack data. This package provides CPython bindings for reading and writing MessagePack data.
## Very important notes for existing users
### PyPI package name
Package name on PyPI was changed from `msgpack-python` to `msgpack` from 0.5.
When upgrading from msgpack-0.4 or earlier, do `pip uninstall msgpack-python` before
`pip install -U msgpack`.
### Compatibility with the old format
You can use `use_bin_type=False` option to pack `bytes`
object into raw type in the old msgpack spec, instead of bin type in new msgpack spec.
You can unpack old msgpack format using `raw=True` option.
It unpacks str (raw) type in msgpack into Python bytes.
See note below for detail.
### Major breaking changes in msgpack 1.0
* Python 2
* The extension module does not support Python 2 anymore.
The pure Python implementation (`msgpack.fallback`) is used for Python 2.
* Packer
* `use_bin_type=True` by default. bytes are encoded in bin type in msgpack.
**If you are still using Python 2, you must use unicode for all string types.**
You can use `use_bin_type=False` to encode into old msgpack format.
* `encoding` option is removed. UTF-8 is used always.
* Unpacker
* `raw=False` by default. It assumes str types are valid UTF-8 string
and decode them to Python str (unicode) object.
* `encoding` option is removed. You can use `raw=True` to support old format.
* Default value of `max_buffer_size` is changed from 0 to 100 MiB.
* Default value of `strict_map_key` is changed to True to avoid hashdos.
You need to pass `strict_map_key=False` if you have data which contain map keys
which type is not bytes or str.
## Install ## Install
``` ```
@ -65,55 +18,38 @@ $ pip install msgpack
### Pure Python implementation ### Pure Python implementation
The extension module in msgpack (`msgpack._cmsgpack`) does not support The extension module in msgpack (`msgpack._cmsgpack`) does not support PyPy.
Python 2 and PyPy.
But msgpack provides a pure Python implementation (`msgpack.fallback`)
for PyPy and Python 2.
But msgpack provides a pure Python implementation (`msgpack.fallback`) for PyPy.
### Windows ### Windows
When you can't use a binary distribution, you need to install Visual Studio If you can't use a binary distribution, you need to install Visual Studio
or Windows SDK on Windows. or the Windows SDK on Windows.
Without extension, using pure Python implementation on CPython runs slowly. Without the extension, the pure Python implementation on CPython runs slowly.
## How to use ## How to use
NOTE: In examples below, I use `raw=False` and `use_bin_type=True` for users
using msgpack < 1.0. These options are default from msgpack 1.0 so you can omit them.
### One-shot pack & unpack ### One-shot pack & unpack
Use `packb` for packing and `unpackb` for unpacking. Use `packb` for packing and `unpackb` for unpacking.
msgpack provides `dumps` and `loads` as an alias for compatibility with msgpack provides `dumps` and `loads` as aliases for compatibility with
`json` and `pickle`. `json` and `pickle`.
`pack` and `dump` packs to a file-like object. `pack` and `dump` pack to a file-like object.
`unpack` and `load` unpacks from a file-like object. `unpack` and `load` unpack from a file-like object.
```pycon ```pycon
>>> import msgpack >>> import msgpack
>>> msgpack.packb([1, 2, 3], use_bin_type=True) >>> msgpack.packb([1, 2, 3])
'\x93\x01\x02\x03' '\x93\x01\x02\x03'
>>> msgpack.unpackb(_, raw=False) >>> msgpack.unpackb(_)
[1, 2, 3] [1, 2, 3]
``` ```
`unpack` unpacks msgpack's array to Python's list, but can also unpack to tuple: Read the docstring for options.
```pycon
>>> msgpack.unpackb(b'\x93\x01\x02\x03', use_list=False, raw=False)
(1, 2, 3)
```
You should always specify the `use_list` keyword argument for backward compatibility.
See performance issues relating to `use_list option`_ below.
Read the docstring for other options.
### Streaming unpacking ### Streaming unpacking
@ -127,17 +63,17 @@ from io import BytesIO
buf = BytesIO() buf = BytesIO()
for i in range(100): for i in range(100):
buf.write(msgpack.packb(i, use_bin_type=True)) buf.write(msgpack.packb(i))
buf.seek(0) buf.seek(0)
unpacker = msgpack.Unpacker(buf, raw=False) unpacker = msgpack.Unpacker(buf)
for unpacked in unpacker: for unpacked in unpacker:
print(unpacked) print(unpacked)
``` ```
### Packing/unpacking of custom data type ### Packing/unpacking of custom data types
It is also possible to pack/unpack custom data types. Here is an example for It is also possible to pack/unpack custom data types. Here is an example for
`datetime.datetime`. `datetime.datetime`.
@ -162,14 +98,17 @@ def encode_datetime(obj):
return obj return obj
packed_dict = msgpack.packb(useful_dict, default=encode_datetime, use_bin_type=True) packed_dict = msgpack.packb(useful_dict, default=encode_datetime)
this_dict_again = msgpack.unpackb(packed_dict, object_hook=decode_datetime, raw=False) this_dict_again = msgpack.unpackb(packed_dict, object_hook=decode_datetime)
``` ```
`Unpacker`'s `object_hook` callback receives a dict; the `Unpacker`'s `object_hook` callback receives a dict; the
`object_pairs_hook` callback may instead be used to receive a list of `object_pairs_hook` callback may instead be used to receive a list of
key-value pairs. key-value pairs.
NOTE: msgpack can encode datetime with tzinfo into standard ext type for now.
See `datetime` option in `Packer` docstring.
### Extended types ### Extended types
@ -191,8 +130,8 @@ It is also possible to pack/unpack custom data types using the **ext** type.
... return ExtType(code, data) ... return ExtType(code, data)
... ...
>>> data = array.array('d', [1.2, 3.4]) >>> data = array.array('d', [1.2, 3.4])
>>> packed = msgpack.packb(data, default=default, use_bin_type=True) >>> packed = msgpack.packb(data, default=default)
>>> unpacked = msgpack.unpackb(packed, ext_hook=ext_hook, raw=False) >>> unpacked = msgpack.unpackb(packed, ext_hook=ext_hook)
>>> data == unpacked >>> data == unpacked
True True
``` ```
@ -201,8 +140,8 @@ True
### Advanced unpacking control ### Advanced unpacking control
As an alternative to iteration, `Unpacker` objects provide `unpack`, As an alternative to iteration, `Unpacker` objects provide `unpack`,
`skip`, `read_array_header` and `read_map_header` methods. The former two `skip`, `read_array_header`, and `read_map_header` methods. The former two
read an entire message from the stream, respectively de-serialising and returning read an entire message from the stream, respectively deserializing and returning
the result, or ignoring it. The latter two methods return the number of elements the result, or ignoring it. The latter two methods return the number of elements
in the upcoming container, so that each element in an array, or key-value pair in the upcoming container, so that each element in an array, or key-value pair
in a map, can be unpacked or skipped individually. in a map, can be unpacked or skipped individually.
@ -210,7 +149,7 @@ in a map, can be unpacked or skipped individually.
## Notes ## Notes
### string and binary type ### String and binary types in the old MessagePack spec
Early versions of msgpack didn't distinguish string and binary types. Early versions of msgpack didn't distinguish string and binary types.
The type for representing both string and binary types was named **raw**. The type for representing both string and binary types was named **raw**.
@ -228,7 +167,7 @@ and `raw=True` options.
### ext type ### ext type
To use the **ext** type, pass `msgpack.ExtType` object to packer. To use the **ext** type, pass a `msgpack.ExtType` object to the packer.
```pycon ```pycon
>>> import msgpack >>> import msgpack
@ -242,24 +181,62 @@ You can use it with `default` and `ext_hook`. See below.
### Security ### Security
To unpacking data received from unreliable source, msgpack provides When unpacking data received from an unreliable source, msgpack provides
two security options. two security options.
`max_buffer_size` (default: `100*1024*1024`) limits the internal buffer size. `max_buffer_size` (default: `100*1024*1024`) limits the internal buffer size.
It is used to limit the preallocated list size too. It is also used to limit preallocated list sizes.
`strict_map_key` (default: `True`) limits the type of map keys to bytes and str. `strict_map_key` (default: `True`) limits the type of map keys to bytes and str.
While msgpack spec doesn't limit the types of the map keys, While the MessagePack spec doesn't limit map key types,
there is a risk of the hashdos. there is a risk of a hash DoS.
If you need to support other types for map keys, use `strict_map_key=False`. If you need to support other types for map keys, use `strict_map_key=False`.
### Performance tips ### Performance tips
CPython's GC starts when growing allocated object. CPython's GC starts when the number of allocated objects grows.
This means unpacking may cause useless GC. This means unpacking may trigger unnecessary GC.
You can use `gc.disable()` when unpacking large message. You can use `gc.disable()` when unpacking a large message.
List is the default sequence type of Python. A list is the default sequence type in Python.
But tuple is lighter than list. However, a tuple is lighter than a list.
You can use `use_list=False` while unpacking when performance is important. You can use `use_list=False` while unpacking when performance is important.
## Major breaking changes in the history
### msgpack 0.5
The package name on PyPI was changed from `msgpack-python` to `msgpack` in 0.5.
When upgrading from msgpack-0.4 or earlier, do `pip uninstall msgpack-python` before
`pip install -U msgpack`.
### msgpack 1.0
* Python 2 support
* The extension module no longer supports Python 2.
The pure Python implementation (`msgpack.fallback`) is used for Python 2.
* msgpack 1.0.6 drops official support of Python 2.7, as pip and
GitHub Action "setup-python" no longer supports Python 2.7.
* Packer
* Packer uses `use_bin_type=True` by default.
Bytes are encoded in the bin type in MessagePack.
* The `encoding` option is removed. UTF-8 is always used.
* Unpacker
* Unpacker uses `raw=False` by default. It assumes str values are valid UTF-8 strings
and decodes them to Python str (Unicode) objects.
* `encoding` option is removed. You can use `raw=True` to support old format (e.g. unpack into bytes, not str).
* The default value of `max_buffer_size` is changed from 0 to 100 MiB to avoid DoS attacks.
You need to pass `max_buffer_size=0` if you have large but safe data.
* The default value of `strict_map_key` is changed to True to avoid hash DoS.
You need to pass `strict_map_key=False` if you have data that contain map keys
whose type is neither bytes nor str.


@ -1,5 +1,3 @@
# -*- coding: utf-8 -*-
#
# msgpack documentation build configuration file, created by # msgpack documentation build configuration file, created by
# sphinx-quickstart on Sun Feb 24 14:20:50 2013. # sphinx-quickstart on Sun Feb 24 14:20:50 2013.
# #
@ -91,7 +89,7 @@ pygments_style = "sphinx"
# The theme to use for HTML and HTML Help pages. See the documentation for # The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes. # a list of builtin themes.
html_theme = "sphinxdoc" html_theme = "sphinx_rtd_theme"
# Theme options are theme-specific and customize the look and feel of a theme # Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the # further. For a list of options available for each theme, see the

docs/requirements.txt

@ -0,0 +1,2 @@
sphinx~=7.3.7
sphinx-rtd-theme~=2.0.0


@ -1,20 +1,20 @@
from .exceptions import * # ruff: noqa: F401
from .ext import ExtType, Timestamp
import os import os
from .exceptions import * # noqa: F403
from .ext import ExtType, Timestamp
version = (1, 0, 6, "rc", 1) version = (1, 1, 2)
__version__ = "1.0.6rc1" __version__ = "1.1.2"
if os.environ.get("MSGPACK_PUREPYTHON"): if os.environ.get("MSGPACK_PUREPYTHON"):
from .fallback import Packer, unpackb, Unpacker from .fallback import Packer, Unpacker, unpackb
else: else:
try: try:
from ._cmsgpack import Packer, unpackb, Unpacker from ._cmsgpack import Packer, Unpacker, unpackb
except ImportError: except ImportError:
from .fallback import Packer, unpackb, Unpacker from .fallback import Packer, Unpacker, unpackb
def pack(o, stream, **kwargs): def pack(o, stream, **kwargs):


@ -1,5 +1,6 @@
# coding: utf-8
#cython: embedsignature=True, c_string_encoding=ascii, language_level=3 #cython: embedsignature=True, c_string_encoding=ascii, language_level=3
#cython: freethreading_compatible = True
import cython
from cpython.datetime cimport import_datetime, datetime_new from cpython.datetime cimport import_datetime, datetime_new
import_datetime() import_datetime()


@ -1,5 +1,3 @@
# coding: utf-8
from cpython cimport * from cpython cimport *
from cpython.bytearray cimport PyByteArray_Check, PyByteArray_CheckExact from cpython.bytearray cimport PyByteArray_Check, PyByteArray_CheckExact
from cpython.datetime cimport ( from cpython.datetime cimport (
@ -16,8 +14,6 @@ from .ext import ExtType, Timestamp
cdef extern from "Python.h": cdef extern from "Python.h":
int PyMemoryView_Check(object obj) int PyMemoryView_Check(object obj)
char* PyUnicode_AsUTF8AndSize(object obj, Py_ssize_t *l) except NULL
cdef extern from "pack.h": cdef extern from "pack.h":
struct msgpack_packer: struct msgpack_packer:
@ -26,26 +22,21 @@ cdef extern from "pack.h":
size_t buf_size size_t buf_size
bint use_bin_type bint use_bin_type
int msgpack_pack_int(msgpack_packer* pk, int d) int msgpack_pack_nil(msgpack_packer* pk) except -1
int msgpack_pack_nil(msgpack_packer* pk) int msgpack_pack_true(msgpack_packer* pk) except -1
int msgpack_pack_true(msgpack_packer* pk) int msgpack_pack_false(msgpack_packer* pk) except -1
int msgpack_pack_false(msgpack_packer* pk) int msgpack_pack_long_long(msgpack_packer* pk, long long d) except -1
int msgpack_pack_long(msgpack_packer* pk, long d) int msgpack_pack_unsigned_long_long(msgpack_packer* pk, unsigned long long d) except -1
int msgpack_pack_long_long(msgpack_packer* pk, long long d) int msgpack_pack_float(msgpack_packer* pk, float d) except -1
int msgpack_pack_unsigned_long_long(msgpack_packer* pk, unsigned long long d) int msgpack_pack_double(msgpack_packer* pk, double d) except -1
int msgpack_pack_float(msgpack_packer* pk, float d) int msgpack_pack_array(msgpack_packer* pk, size_t l) except -1
int msgpack_pack_double(msgpack_packer* pk, double d) int msgpack_pack_map(msgpack_packer* pk, size_t l) except -1
int msgpack_pack_array(msgpack_packer* pk, size_t l) int msgpack_pack_raw(msgpack_packer* pk, size_t l) except -1
int msgpack_pack_map(msgpack_packer* pk, size_t l) int msgpack_pack_bin(msgpack_packer* pk, size_t l) except -1
int msgpack_pack_raw(msgpack_packer* pk, size_t l) int msgpack_pack_raw_body(msgpack_packer* pk, char* body, size_t l) except -1
int msgpack_pack_bin(msgpack_packer* pk, size_t l) int msgpack_pack_ext(msgpack_packer* pk, char typecode, size_t l) except -1
int msgpack_pack_raw_body(msgpack_packer* pk, char* body, size_t l) int msgpack_pack_timestamp(msgpack_packer* x, long long seconds, unsigned long nanoseconds) except -1
int msgpack_pack_ext(msgpack_packer* pk, char typecode, size_t l)
int msgpack_pack_timestamp(msgpack_packer* x, long long seconds, unsigned long nanoseconds);
int msgpack_pack_unicode(msgpack_packer* pk, object o, long long limit)
cdef extern from "buff_converter.h":
object buff_to_buff(char *, Py_ssize_t)
cdef int DEFAULT_RECURSE_LIMIT=511 cdef int DEFAULT_RECURSE_LIMIT=511
cdef long long ITEM_LIMIT = (2**32)-1 cdef long long ITEM_LIMIT = (2**32)-1
@ -59,7 +50,7 @@ cdef inline int PyBytesLike_CheckExact(object o):
return PyBytes_CheckExact(o) or PyByteArray_CheckExact(o) return PyBytes_CheckExact(o) or PyByteArray_CheckExact(o)
cdef class Packer(object): cdef class Packer:
""" """
MessagePack Packer MessagePack Packer
@ -103,27 +94,44 @@ cdef class Packer(object):
:param str unicode_errors: :param str unicode_errors:
The error handler for encoding unicode. (default: 'strict') The error handler for encoding unicode. (default: 'strict')
DO NOT USE THIS!! This option is kept for very specific usage. DO NOT USE THIS!! This option is kept for very specific usage.
:param int buf_size:
The size of the internal buffer. (default: 256*1024)
Useful if serialisation size can be correctly estimated,
avoid unnecessary reallocations.
""" """
cdef msgpack_packer pk cdef msgpack_packer pk
cdef object _default cdef object _default
cdef object _berrors cdef object _berrors
cdef const char *unicode_errors cdef const char *unicode_errors
cdef size_t exports # number of exported buffers
cdef bint strict_types cdef bint strict_types
cdef bint use_float cdef bint use_float
cdef bint autoreset cdef bint autoreset
cdef bint datetime cdef bint datetime
def __cinit__(self): def __cinit__(self, buf_size=256*1024, **_kwargs):
cdef int buf_size = 1024*1024
self.pk.buf = <char*> PyMem_Malloc(buf_size) self.pk.buf = <char*> PyMem_Malloc(buf_size)
if self.pk.buf == NULL: if self.pk.buf == NULL:
raise MemoryError("Unable to allocate internal buffer.") raise MemoryError("Unable to allocate internal buffer.")
self.pk.buf_size = buf_size self.pk.buf_size = buf_size
self.pk.length = 0 self.pk.length = 0
self.exports = 0
def __dealloc__(self):
PyMem_Free(self.pk.buf)
self.pk.buf = NULL
assert self.exports == 0
cdef _check_exports(self):
if self.exports > 0:
raise BufferError("Existing exports of data: Packer cannot be changed")
@cython.critical_section
def __init__(self, *, default=None, def __init__(self, *, default=None,
bint use_single_float=False, bint autoreset=True, bint use_bin_type=True, bint use_single_float=False, bint autoreset=True, bint use_bin_type=True,
bint strict_types=False, bint datetime=False, unicode_errors=None): bint strict_types=False, bint datetime=False, unicode_errors=None,
buf_size=256*1024):
self.use_float = use_single_float self.use_float = use_single_float
self.strict_types = strict_types self.strict_types = strict_types
self.autoreset = autoreset self.autoreset = autoreset
@ -140,139 +148,97 @@ cdef class Packer(object):
else: else:
self.unicode_errors = self._berrors self.unicode_errors = self._berrors
def __dealloc__(self): # returns -2 when default should(o) be called
PyMem_Free(self.pk.buf) cdef int _pack_inner(self, object o, bint will_default, int nest_limit) except -1:
self.pk.buf = NULL
cdef int _pack(self, object o, int nest_limit=DEFAULT_RECURSE_LIMIT) except -1:
cdef long long llval cdef long long llval
cdef unsigned long long ullval cdef unsigned long long ullval
cdef unsigned long ulval cdef unsigned long ulval
cdef long longval cdef const char* rawval
cdef float fval
cdef double dval
cdef char* rawval
cdef int ret
cdef dict d
cdef Py_ssize_t L cdef Py_ssize_t L
cdef int default_used = 0
cdef bint strict_types = self.strict_types
cdef Py_buffer view cdef Py_buffer view
cdef bint strict = self.strict_types
if nest_limit < 0:
raise ValueError("recursion limit exceeded.")
while True:
if o is None: if o is None:
ret = msgpack_pack_nil(&self.pk) msgpack_pack_nil(&self.pk)
elif o is True: elif o is True:
ret = msgpack_pack_true(&self.pk) msgpack_pack_true(&self.pk)
elif o is False: elif o is False:
ret = msgpack_pack_false(&self.pk) msgpack_pack_false(&self.pk)
elif PyLong_CheckExact(o) if strict_types else PyLong_Check(o): elif PyLong_CheckExact(o) if strict else PyLong_Check(o):
# PyInt_Check(long) is True for Python 3.
# So we should test long before int.
try: try:
if o > 0: if o > 0:
ullval = o ullval = o
ret = msgpack_pack_unsigned_long_long(&self.pk, ullval) msgpack_pack_unsigned_long_long(&self.pk, ullval)
else: else:
llval = o llval = o
ret = msgpack_pack_long_long(&self.pk, llval) msgpack_pack_long_long(&self.pk, llval)
except OverflowError as oe: except OverflowError as oe:
if not default_used and self._default is not None: if will_default:
o = self._default(o) return -2
default_used = True
continue
else: else:
raise OverflowError("Integer value out of range") raise OverflowError("Integer value out of range")
elif PyInt_CheckExact(o) if strict_types else PyInt_Check(o): elif PyFloat_CheckExact(o) if strict else PyFloat_Check(o):
longval = o
ret = msgpack_pack_long(&self.pk, longval)
elif PyFloat_CheckExact(o) if strict_types else PyFloat_Check(o):
if self.use_float: if self.use_float:
fval = o msgpack_pack_float(&self.pk, <float>o)
ret = msgpack_pack_float(&self.pk, fval)
else: else:
dval = o msgpack_pack_double(&self.pk, <double>o)
ret = msgpack_pack_double(&self.pk, dval) elif PyBytesLike_CheckExact(o) if strict else PyBytesLike_Check(o):
elif PyBytesLike_CheckExact(o) if strict_types else PyBytesLike_Check(o):
L = Py_SIZE(o) L = Py_SIZE(o)
if L > ITEM_LIMIT: if L > ITEM_LIMIT:
PyErr_Format(ValueError, b"%.200s object is too large", Py_TYPE(o).tp_name) PyErr_Format(ValueError, b"%.200s object is too large", Py_TYPE(o).tp_name)
rawval = o rawval = o
ret = msgpack_pack_bin(&self.pk, L) msgpack_pack_bin(&self.pk, L)
if ret == 0: msgpack_pack_raw_body(&self.pk, rawval, L)
ret = msgpack_pack_raw_body(&self.pk, rawval, L) elif PyUnicode_CheckExact(o) if strict else PyUnicode_Check(o):
elif PyUnicode_CheckExact(o) if strict_types else PyUnicode_Check(o):
if self.unicode_errors == NULL: if self.unicode_errors == NULL:
ret = msgpack_pack_unicode(&self.pk, o, ITEM_LIMIT); rawval = PyUnicode_AsUTF8AndSize(o, &L)
if ret == -2: if L >ITEM_LIMIT:
raise ValueError("unicode string is too large") raise ValueError("unicode string is too large")
else: else:
o = PyUnicode_AsEncodedString(o, NULL, self.unicode_errors) o = PyUnicode_AsEncodedString(o, NULL, self.unicode_errors)
L = Py_SIZE(o) L = Py_SIZE(o)
if L > ITEM_LIMIT: if L > ITEM_LIMIT:
raise ValueError("unicode string is too large") raise ValueError("unicode string is too large")
ret = msgpack_pack_raw(&self.pk, L)
if ret == 0:
rawval = o rawval = o
ret = msgpack_pack_raw_body(&self.pk, rawval, L) msgpack_pack_raw(&self.pk, L)
elif PyDict_CheckExact(o): msgpack_pack_raw_body(&self.pk, rawval, L)
d = <dict>o elif PyDict_CheckExact(o) if strict else PyDict_Check(o):
L = len(d)
if L > ITEM_LIMIT:
raise ValueError("dict is too large")
ret = msgpack_pack_map(&self.pk, L)
if ret == 0:
for k, v in d.items():
ret = self._pack(k, nest_limit-1)
if ret != 0: break
ret = self._pack(v, nest_limit-1)
if ret != 0: break
elif not strict_types and PyDict_Check(o):
L = len(o) L = len(o)
if L > ITEM_LIMIT: if L > ITEM_LIMIT:
raise ValueError("dict is too large") raise ValueError("dict is too large")
ret = msgpack_pack_map(&self.pk, L) msgpack_pack_map(&self.pk, L)
if ret == 0:
for k, v in o.items(): for k, v in o.items():
ret = self._pack(k, nest_limit-1) self._pack(k, nest_limit)
if ret != 0: break self._pack(v, nest_limit)
ret = self._pack(v, nest_limit-1) elif type(o) is ExtType if strict else isinstance(o, ExtType):
if ret != 0: break
elif type(o) is ExtType if strict_types else isinstance(o, ExtType):
# This should be before Tuple because ExtType is namedtuple. # This should be before Tuple because ExtType is namedtuple.
longval = o.code
rawval = o.data rawval = o.data
L = len(o.data) L = len(o.data)
if L > ITEM_LIMIT: if L > ITEM_LIMIT:
raise ValueError("EXT data is too large") raise ValueError("EXT data is too large")
ret = msgpack_pack_ext(&self.pk, longval, L) msgpack_pack_ext(&self.pk, <long>o.code, L)
ret = msgpack_pack_raw_body(&self.pk, rawval, L) msgpack_pack_raw_body(&self.pk, rawval, L)
elif type(o) is Timestamp: elif type(o) is Timestamp:
llval = o.seconds llval = o.seconds
ulval = o.nanoseconds ulval = o.nanoseconds
ret = msgpack_pack_timestamp(&self.pk, llval, ulval) msgpack_pack_timestamp(&self.pk, llval, ulval)
elif PyList_CheckExact(o) if strict_types else (PyTuple_Check(o) or PyList_Check(o)): elif PyList_CheckExact(o) if strict else (PyTuple_Check(o) or PyList_Check(o)):
L = Py_SIZE(o) L = Py_SIZE(o)
if L > ITEM_LIMIT: if L > ITEM_LIMIT:
raise ValueError("list is too large") raise ValueError("list is too large")
ret = msgpack_pack_array(&self.pk, L) msgpack_pack_array(&self.pk, L)
if ret == 0:
for v in o: for v in o:
ret = self._pack(v, nest_limit-1) self._pack(v, nest_limit)
if ret != 0: break
elif PyMemoryView_Check(o): elif PyMemoryView_Check(o):
if PyObject_GetBuffer(o, &view, PyBUF_SIMPLE) != 0: PyObject_GetBuffer(o, &view, PyBUF_SIMPLE)
raise ValueError("could not get buffer for memoryview")
L = view.len L = view.len
if L > ITEM_LIMIT: if L > ITEM_LIMIT:
PyBuffer_Release(&view); PyBuffer_Release(&view);
raise ValueError("memoryview is too large") raise ValueError("memoryview is too large")
ret = msgpack_pack_bin(&self.pk, L) try:
if ret == 0: msgpack_pack_bin(&self.pk, L)
ret = msgpack_pack_raw_body(&self.pk, <char*>view.buf, L) msgpack_pack_raw_body(&self.pk, <char*>view.buf, L)
finally:
PyBuffer_Release(&view); PyBuffer_Release(&view);
elif self.datetime and PyDateTime_CheckExact(o) and datetime_tzinfo(o) is not None: elif self.datetime and PyDateTime_CheckExact(o) and datetime_tzinfo(o) is not None:
delta = o - epoch delta = o - epoch
@ -280,19 +246,32 @@ cdef class Packer(object):
raise ValueError("failed to calculate delta") raise ValueError("failed to calculate delta")
llval = timedelta_days(delta) * <long long>(24*60*60) + timedelta_seconds(delta) llval = timedelta_days(delta) * <long long>(24*60*60) + timedelta_seconds(delta)
ulval = timedelta_microseconds(delta) * 1000 ulval = timedelta_microseconds(delta) * 1000
ret = msgpack_pack_timestamp(&self.pk, llval, ulval) msgpack_pack_timestamp(&self.pk, llval, ulval)
elif not default_used and self._default: elif will_default:
o = self._default(o) return -2
default_used = 1
continue
elif self.datetime and PyDateTime_CheckExact(o): elif self.datetime and PyDateTime_CheckExact(o):
# this should be later than will_default
PyErr_Format(ValueError, b"can not serialize '%.200s' object where tzinfo=None", Py_TYPE(o).tp_name) PyErr_Format(ValueError, b"can not serialize '%.200s' object where tzinfo=None", Py_TYPE(o).tp_name)
else: else:
PyErr_Format(TypeError, b"can not serialize '%.200s' object", Py_TYPE(o).tp_name) PyErr_Format(TypeError, b"can not serialize '%.200s' object", Py_TYPE(o).tp_name)
return ret
cpdef pack(self, object obj): cdef int _pack(self, object o, int nest_limit=DEFAULT_RECURSE_LIMIT) except -1:
cdef int ret cdef int ret
if nest_limit < 0:
raise ValueError("recursion limit exceeded.")
nest_limit -= 1
if self._default is not None:
ret = self._pack_inner(o, 1, nest_limit)
if ret == -2:
o = self._default(o)
else:
return ret
return self._pack_inner(o, 0, nest_limit)
@cython.critical_section
def pack(self, object obj):
cdef int ret
self._check_exports()
try: try:
ret = self._pack(obj, DEFAULT_RECURSE_LIMIT) ret = self._pack(obj, DEFAULT_RECURSE_LIMIT)
except: except:
@ -305,36 +284,37 @@ cdef class Packer(object):
self.pk.length = 0 self.pk.length = 0
return buf return buf
@cython.critical_section
def pack_ext_type(self, typecode, data): def pack_ext_type(self, typecode, data):
self._check_exports()
if len(data) > ITEM_LIMIT:
raise ValueError("ext data too large")
msgpack_pack_ext(&self.pk, typecode, len(data)) msgpack_pack_ext(&self.pk, typecode, len(data))
msgpack_pack_raw_body(&self.pk, data, len(data)) msgpack_pack_raw_body(&self.pk, data, len(data))
@cython.critical_section
def pack_array_header(self, long long size): def pack_array_header(self, long long size):
self._check_exports()
if size > ITEM_LIMIT: if size > ITEM_LIMIT:
raise ValueError raise ValueError("array too large")
cdef int ret = msgpack_pack_array(&self.pk, size) msgpack_pack_array(&self.pk, size)
if ret == -1:
raise MemoryError
elif ret: # should not happen
raise TypeError
if self.autoreset: if self.autoreset:
buf = PyBytes_FromStringAndSize(self.pk.buf, self.pk.length) buf = PyBytes_FromStringAndSize(self.pk.buf, self.pk.length)
self.pk.length = 0 self.pk.length = 0
return buf return buf
@cython.critical_section
def pack_map_header(self, long long size): def pack_map_header(self, long long size):
self._check_exports()
if size > ITEM_LIMIT: if size > ITEM_LIMIT:
raise ValueError raise ValueError("map too learge")
cdef int ret = msgpack_pack_map(&self.pk, size) msgpack_pack_map(&self.pk, size)
if ret == -1:
raise MemoryError
elif ret: # should not happen
raise TypeError
if self.autoreset: if self.autoreset:
buf = PyBytes_FromStringAndSize(self.pk.buf, self.pk.length) buf = PyBytes_FromStringAndSize(self.pk.buf, self.pk.length)
self.pk.length = 0 self.pk.length = 0
return buf return buf
@cython.critical_section
def pack_map_pairs(self, object pairs): def pack_map_pairs(self, object pairs):
""" """
Pack *pairs* as msgpack map type. Pack *pairs* as msgpack map type.
@ -342,33 +322,43 @@ cdef class Packer(object):
*pairs* should be a sequence of pairs. *pairs* should be a sequence of pairs.
(`len(pairs)` and `for k, v in pairs:` should be supported.) (`len(pairs)` and `for k, v in pairs:` should be supported.)
""" """
cdef int ret = msgpack_pack_map(&self.pk, len(pairs)) self._check_exports()
if ret == 0: size = len(pairs)
if size > ITEM_LIMIT:
raise ValueError("map too large")
msgpack_pack_map(&self.pk, size)
for k, v in pairs: for k, v in pairs:
ret = self._pack(k) self._pack(k)
if ret != 0: break self._pack(v)
ret = self._pack(v)
if ret != 0: break
if ret == -1:
raise MemoryError
elif ret: # should not happen
raise TypeError
if self.autoreset: if self.autoreset:
buf = PyBytes_FromStringAndSize(self.pk.buf, self.pk.length) buf = PyBytes_FromStringAndSize(self.pk.buf, self.pk.length)
self.pk.length = 0 self.pk.length = 0
return buf return buf
@cython.critical_section
def reset(self): def reset(self):
"""Reset internal buffer. """Reset internal buffer.
This method is useful only when autoreset=False. This method is useful only when autoreset=False.
""" """
self._check_exports()
self.pk.length = 0 self.pk.length = 0
@cython.critical_section
def bytes(self): def bytes(self):
"""Return internal buffer contents as bytes object""" """Return internal buffer contents as bytes object"""
return PyBytes_FromStringAndSize(self.pk.buf, self.pk.length) return PyBytes_FromStringAndSize(self.pk.buf, self.pk.length)
def getbuffer(self): def getbuffer(self):
"""Return view of internal buffer.""" """Return memoryview of internal buffer.
return buff_to_buff(self.pk.buf, self.pk.length)
Note: Packer now supports buffer protocol. You can use memoryview(packer).
"""
return memoryview(self)
def __getbuffer__(self, Py_buffer *buffer, int flags):
PyBuffer_FillInfo(buffer, self, self.pk.buf, self.pk.length, 1, flags)
self.exports += 1
def __releasebuffer__(self, Py_buffer *buffer):
self.exports -= 1


@ -1,5 +1,3 @@
# coding: utf-8
from cpython cimport * from cpython cimport *
cdef extern from "Python.h": cdef extern from "Python.h":
ctypedef struct PyObject ctypedef struct PyObject
@ -35,7 +33,7 @@ cdef extern from "unpack.h":
PyObject* timestamp_t PyObject* timestamp_t
PyObject *giga; PyObject *giga;
PyObject *utc; PyObject *utc;
char *unicode_errors const char *unicode_errors
Py_ssize_t max_str_len Py_ssize_t max_str_len
Py_ssize_t max_bin_len Py_ssize_t max_bin_len
Py_ssize_t max_array_len Py_ssize_t max_array_len
@ -210,7 +208,7 @@ def unpackb(object packed, *, object object_hook=None, object list_hook=None,
raise ValueError("Unpack failed: error = %d" % (ret,)) raise ValueError("Unpack failed: error = %d" % (ret,))
cdef class Unpacker(object): cdef class Unpacker:
"""Streaming unpacker. """Streaming unpacker.
Arguments: Arguments:
@ -324,6 +322,7 @@ cdef class Unpacker(object):
PyMem_Free(self.buf) PyMem_Free(self.buf)
self.buf = NULL self.buf = NULL
@cython.critical_section
def __init__(self, file_like=None, *, Py_ssize_t read_size=0, def __init__(self, file_like=None, *, Py_ssize_t read_size=0,
bint use_list=True, bint raw=False, int timestamp=0, bint strict_map_key=True, bint use_list=True, bint raw=False, int timestamp=0, bint strict_map_key=True,
object object_hook=None, object object_pairs_hook=None, object list_hook=None, object object_hook=None, object object_pairs_hook=None, object list_hook=None,
@ -384,6 +383,7 @@ cdef class Unpacker(object):
max_str_len, max_bin_len, max_array_len, max_str_len, max_bin_len, max_array_len,
max_map_len, max_ext_len) max_map_len, max_ext_len)
@cython.critical_section
def feed(self, object next_bytes): def feed(self, object next_bytes):
"""Append `next_bytes` to internal buffer.""" """Append `next_bytes` to internal buffer."""
cdef Py_buffer pybuff cdef Py_buffer pybuff
@ -484,6 +484,7 @@ cdef class Unpacker(object):
else: else:
raise ValueError("Unpack failed: error = %d" % (ret,)) raise ValueError("Unpack failed: error = %d" % (ret,))
@cython.critical_section
def read_bytes(self, Py_ssize_t nbytes): def read_bytes(self, Py_ssize_t nbytes):
"""Read a specified number of raw bytes from the stream""" """Read a specified number of raw bytes from the stream"""
cdef Py_ssize_t nread cdef Py_ssize_t nread
@ -496,6 +497,7 @@ cdef class Unpacker(object):
self.stream_offset += nread self.stream_offset += nread
return ret return ret
@cython.critical_section
def unpack(self): def unpack(self):
"""Unpack one object """Unpack one object
@ -503,6 +505,7 @@ cdef class Unpacker(object):
""" """
return self._unpack(unpack_construct) return self._unpack(unpack_construct)
@cython.critical_section
def skip(self): def skip(self):
"""Read and ignore one object, returning None """Read and ignore one object, returning None
@ -510,6 +513,7 @@ cdef class Unpacker(object):
""" """
return self._unpack(unpack_skip) return self._unpack(unpack_skip)
@cython.critical_section
def read_array_header(self): def read_array_header(self):
"""assuming the next object is an array, return its size n, such that """assuming the next object is an array, return its size n, such that
the next n unpack() calls will iterate over its contents. the next n unpack() calls will iterate over its contents.
@ -518,6 +522,7 @@ cdef class Unpacker(object):
""" """
return self._unpack(read_array_header) return self._unpack(read_array_header)
@cython.critical_section
def read_map_header(self): def read_map_header(self):
"""assuming the next object is a map, return its size n, such that the """assuming the next object is a map, return its size n, such that the
next n * 2 unpack() calls will iterate over its key-value pairs. next n * 2 unpack() calls will iterate over its key-value pairs.
@ -526,6 +531,7 @@ cdef class Unpacker(object):
""" """
return self._unpack(read_map_header) return self._unpack(read_map_header)
@cython.critical_section
def tell(self): def tell(self):
"""Returns the current position of the Unpacker in bytes, i.e., the """Returns the current position of the Unpacker in bytes, i.e., the
number of bytes that were read from the input, also the starting number of bytes that were read from the input, also the starting
@ -536,6 +542,7 @@ cdef class Unpacker(object):
def __iter__(self): def __iter__(self):
return self return self
@cython.critical_section
def __next__(self): def __next__(self):
return self._unpack(unpack_construct, 1) return self._unpack(unpack_construct, 1)


@ -1,8 +0,0 @@
#include "Python.h"
/* cython does not support this preprocessor check => write it in raw C */
static PyObject *
buff_to_buff(char *buff, Py_ssize_t size)
{
return PyMemoryView_FromMemory(buff, size, PyBUF_READ);
}


@ -1,6 +1,6 @@
from collections import namedtuple
import datetime import datetime
import struct import struct
from collections import namedtuple
class ExtType(namedtuple("ExtType", "code data")): class ExtType(namedtuple("ExtType", "code data")):
@ -157,7 +157,9 @@ class Timestamp:
:rtype: `datetime.datetime` :rtype: `datetime.datetime`
""" """
utc = datetime.timezone.utc utc = datetime.timezone.utc
return datetime.datetime.fromtimestamp(0, utc) + datetime.timedelta(seconds=self.to_unix()) return datetime.datetime.fromtimestamp(0, utc) + datetime.timedelta(
seconds=self.seconds, microseconds=self.nanoseconds // 1000
)
@staticmethod @staticmethod
def from_datetime(dt): def from_datetime(dt):
@ -165,4 +167,4 @@ class Timestamp:
:rtype: Timestamp :rtype: Timestamp
""" """
return Timestamp.from_unix(dt.timestamp()) return Timestamp(seconds=int(dt.timestamp()), nanoseconds=dt.microsecond * 1000)


@ -1,27 +1,22 @@
"""Fallback pure Python implementation of msgpack""" """Fallback pure Python implementation of msgpack"""
from datetime import datetime as _DateTime
import sys
import struct
import struct
import sys
from datetime import datetime as _DateTime
if hasattr(sys, "pypy_version_info"): if hasattr(sys, "pypy_version_info"):
# StringIO is slow on PyPy, StringIO is faster. However: PyPy's own
# StringBuilder is fastest.
from __pypy__ import newlist_hint from __pypy__ import newlist_hint
from __pypy__.builders import BytesBuilder
try: _USING_STRINGBUILDER = True
from __pypy__.builders import BytesBuilder as StringBuilder
except ImportError:
from __pypy__.builders import StringBuilder
USING_STRINGBUILDER = True
class StringIO: class BytesIO:
def __init__(self, s=b""): def __init__(self, s=b""):
if s: if s:
self.builder = StringBuilder(len(s)) self.builder = BytesBuilder(len(s))
self.builder.append(s) self.builder.append(s)
else: else:
self.builder = StringBuilder() self.builder = BytesBuilder()
def write(self, s): def write(self, s):
if isinstance(s, memoryview): if isinstance(s, memoryview):
@ -34,17 +29,17 @@ if hasattr(sys, "pypy_version_info"):
return self.builder.build() return self.builder.build()
else: else:
USING_STRINGBUILDER = False from io import BytesIO
from io import BytesIO as StringIO
newlist_hint = lambda size: [] _USING_STRINGBUILDER = False
def newlist_hint(size):
return []
from .exceptions import BufferFull, OutOfData, ExtraData, FormatError, StackError from .exceptions import BufferFull, ExtraData, FormatError, OutOfData, StackError
from .ext import ExtType, Timestamp from .ext import ExtType, Timestamp
EX_SKIP = 0 EX_SKIP = 0
EX_CONSTRUCT = 1 EX_CONSTRUCT = 1
EX_READ_ARRAY_HEADER = 2 EX_READ_ARRAY_HEADER = 2
@ -231,6 +226,7 @@ class Unpacker:
def __init__( def __init__(
self, self,
file_like=None, file_like=None,
*,
read_size=0, read_size=0,
use_list=True, use_list=True,
raw=False, raw=False,
@ -333,6 +329,7 @@ class Unpacker:
# Use extend here: INPLACE_ADD += doesn't reliably typecast memoryview in jython # Use extend here: INPLACE_ADD += doesn't reliably typecast memoryview in jython
self._buffer.extend(view) self._buffer.extend(view)
view.release()
def _consume(self): def _consume(self):
"""Gets rid of the used parts of the buffer.""" """Gets rid of the used parts of the buffer."""
@ -649,32 +646,13 @@ class Packer:
The error handler for encoding unicode. (default: 'strict') The error handler for encoding unicode. (default: 'strict')
DO NOT USE THIS!! This option is kept for very specific usage. DO NOT USE THIS!! This option is kept for very specific usage.
Example of streaming deserialize from file-like object:: :param int buf_size:
Internal buffer size. This option is used only for C implementation.
unpacker = Unpacker(file_like)
for o in unpacker:
process(o)
Example of streaming deserialize from socket::
unpacker = Unpacker()
while True:
buf = sock.recv(1024**2)
if not buf:
break
unpacker.feed(buf)
for o in unpacker:
process(o)
Raises ``ExtraData`` when *packed* contains extra bytes.
Raises ``OutOfData`` when *packed* is incomplete.
Raises ``FormatError`` when *packed* is not valid msgpack.
Raises ``StackError`` when *packed* contains too nested.
Other exceptions can be raised during unpacking.
""" """
def __init__( def __init__(
self, self,
*,
default=None, default=None,
use_single_float=False, use_single_float=False,
autoreset=True, autoreset=True,
@ -682,16 +660,16 @@ class Packer:
strict_types=False, strict_types=False,
datetime=False, datetime=False,
unicode_errors=None, unicode_errors=None,
buf_size=None,
): ):
self._strict_types = strict_types self._strict_types = strict_types
self._use_float = use_single_float self._use_float = use_single_float
self._autoreset = autoreset self._autoreset = autoreset
self._use_bin_type = use_bin_type self._use_bin_type = use_bin_type
self._buffer = StringIO() self._buffer = BytesIO()
self._datetime = bool(datetime) self._datetime = bool(datetime)
self._unicode_errors = unicode_errors or "strict" self._unicode_errors = unicode_errors or "strict"
if default is not None: if default is not None and not callable(default):
if not callable(default):
raise TypeError("default must be callable") raise TypeError("default must be callable")
self._default = default self._default = default
@ -823,18 +801,18 @@ class Packer:
try: try:
self._pack(obj) self._pack(obj)
except: except:
self._buffer = StringIO() # force reset self._buffer = BytesIO() # force reset
raise raise
if self._autoreset: if self._autoreset:
ret = self._buffer.getvalue() ret = self._buffer.getvalue()
self._buffer = StringIO() self._buffer = BytesIO()
return ret return ret
def pack_map_pairs(self, pairs): def pack_map_pairs(self, pairs):
self._pack_map_pairs(len(pairs), pairs) self._pack_map_pairs(len(pairs), pairs)
if self._autoreset: if self._autoreset:
ret = self._buffer.getvalue() ret = self._buffer.getvalue()
self._buffer = StringIO() self._buffer = BytesIO()
return ret return ret
def pack_array_header(self, n): def pack_array_header(self, n):
@ -843,7 +821,7 @@ class Packer:
self._pack_array_header(n) self._pack_array_header(n)
if self._autoreset: if self._autoreset:
ret = self._buffer.getvalue() ret = self._buffer.getvalue()
self._buffer = StringIO() self._buffer = BytesIO()
return ret return ret
def pack_map_header(self, n): def pack_map_header(self, n):
@ -852,7 +830,7 @@ class Packer:
self._pack_map_header(n) self._pack_map_header(n)
if self._autoreset: if self._autoreset:
ret = self._buffer.getvalue() ret = self._buffer.getvalue()
self._buffer = StringIO() self._buffer = BytesIO()
return ret return ret
def pack_ext_type(self, typecode, data): def pack_ext_type(self, typecode, data):
@ -941,11 +919,11 @@ class Packer:
This method is useful only when autoreset=False. This method is useful only when autoreset=False.
""" """
self._buffer = StringIO() self._buffer = BytesIO()
def getbuffer(self): def getbuffer(self):
"""Return view of internal buffer.""" """Return view of internal buffer."""
if USING_STRINGBUILDER: if _USING_STRINGBUILDER:
return memoryview(self.bytes()) return memoryview(self.bytes())
else: else:
return self._buffer.getbuffer() return self._buffer.getbuffer()
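The fallback changes above make Packer's options keyword-only, add the buf_size option (honored only by the C implementation), and replace the StringIO shim with BytesIO. A minimal sketch of the resulting API, with an arbitrary buffer size chosen for illustration:

```
from msgpack import Packer

# All options are keyword-only now; buf_size is accepted by the pure-Python
# fallback but only affects the C/Cython Packer.
packer = Packer(use_bin_type=True, buf_size=256 * 1024)
data = packer.pack({"compact": True})

# With autoreset=False the internal buffer accumulates across pack() calls
# and can be inspected via getbuffer() without copying.
streaming = Packer(autoreset=False)
streaming.pack(1)
streaming.pack(2)
assert bytes(streaming.getbuffer()) == b"\x01\x02"
```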

@@ -21,15 +12,12 @@
 #include "sysdep.h"
 #include <limits.h>
 #include <string.h>
+#include <stdbool.h>
 
 #ifdef __cplusplus
 extern "C" {
 #endif
 
-#ifdef _MSC_VER
-#define inline __inline
-#endif
-
 typedef struct msgpack_packer {
     char *buf;
     size_t length;
@@ -67,27 +64,6 @@ static inline int msgpack_pack_write(msgpack_packer* pk, const char *data, size_
 #include "pack_template.h"
 
-// return -2 when o is too long
-static inline int
-msgpack_pack_unicode(msgpack_packer *pk, PyObject *o, long long limit)
-{
-    assert(PyUnicode_Check(o));
-
-    Py_ssize_t len;
-    const char* buf = PyUnicode_AsUTF8AndSize(o, &len);
-    if (buf == NULL)
-        return -1;
-
-    if (len > limit) {
-        return -2;
-    }
-
-    int ret = msgpack_pack_raw(pk, len);
-    if (ret) return ret;
-    return msgpack_pack_raw_body(pk, buf, len);
-}
-
 #ifdef __cplusplus
 }
 #endif

@@ -37,18 +37,6 @@
  * Integer
  */
 
-#define msgpack_pack_real_uint8(x, d) \
-do { \
-    if(d < (1<<7)) { \
-        /* fixnum */ \
-        msgpack_pack_append_buffer(x, &TAKE8_8(d), 1); \
-    } else { \
-        /* unsigned 8 */ \
-        unsigned char buf[2] = {0xcc, TAKE8_8(d)}; \
-        msgpack_pack_append_buffer(x, buf, 2); \
-    } \
-} while(0)
-
 #define msgpack_pack_real_uint16(x, d) \
 do { \
     if(d < (1<<7)) { \
@@ -123,18 +111,6 @@ do { \
     } \
 } while(0)
 
-#define msgpack_pack_real_int8(x, d) \
-do { \
-    if(d < -(1<<5)) { \
-        /* signed 8 */ \
-        unsigned char buf[2] = {0xd0, TAKE8_8(d)}; \
-        msgpack_pack_append_buffer(x, buf, 2); \
-    } else { \
-        /* fixnum */ \
-        msgpack_pack_append_buffer(x, &TAKE8_8(d), 1); \
-    } \
-} while(0)
-
 #define msgpack_pack_real_int16(x, d) \
 do { \
     if(d < -(1<<5)) { \
@@ -264,49 +240,6 @@ do { \
 } while(0)
 
-static inline int msgpack_pack_uint8(msgpack_packer* x, uint8_t d)
-{
-    msgpack_pack_real_uint8(x, d);
-}
-
-static inline int msgpack_pack_uint16(msgpack_packer* x, uint16_t d)
-{
-    msgpack_pack_real_uint16(x, d);
-}
-
-static inline int msgpack_pack_uint32(msgpack_packer* x, uint32_t d)
-{
-    msgpack_pack_real_uint32(x, d);
-}
-
-static inline int msgpack_pack_uint64(msgpack_packer* x, uint64_t d)
-{
-    msgpack_pack_real_uint64(x, d);
-}
-
-static inline int msgpack_pack_int8(msgpack_packer* x, int8_t d)
-{
-    msgpack_pack_real_int8(x, d);
-}
-
-static inline int msgpack_pack_int16(msgpack_packer* x, int16_t d)
-{
-    msgpack_pack_real_int16(x, d);
-}
-
-static inline int msgpack_pack_int32(msgpack_packer* x, int32_t d)
-{
-    msgpack_pack_real_int32(x, d);
-}
-
-static inline int msgpack_pack_int64(msgpack_packer* x, int64_t d)
-{
-    msgpack_pack_real_int64(x, d);
-}
-
-//#ifdef msgpack_pack_inline_func_cint
-
 static inline int msgpack_pack_short(msgpack_packer* x, short d)
 {
 #if defined(SIZEOF_SHORT)
@@ -372,27 +305,21 @@ if(sizeof(int) == 2) {
 static inline int msgpack_pack_long(msgpack_packer* x, long d)
 {
 #if defined(SIZEOF_LONG)
-#if SIZEOF_LONG == 2
-    msgpack_pack_real_int16(x, d);
-#elif SIZEOF_LONG == 4
+#if SIZEOF_LONG == 4
     msgpack_pack_real_int32(x, d);
 #else
     msgpack_pack_real_int64(x, d);
 #endif
 #elif defined(LONG_MAX)
-#if LONG_MAX == 0x7fffL
-    msgpack_pack_real_int16(x, d);
-#elif LONG_MAX == 0x7fffffffL
+#if LONG_MAX == 0x7fffffffL
     msgpack_pack_real_int32(x, d);
 #else
     msgpack_pack_real_int64(x, d);
 #endif
 #else
-if(sizeof(long) == 2) {
-    msgpack_pack_real_int16(x, d);
-} else if(sizeof(long) == 4) {
+if (sizeof(long) == 4) {
     msgpack_pack_real_int32(x, d);
 } else {
     msgpack_pack_real_int64(x, d);
@@ -402,162 +329,13 @@ if(sizeof(long) == 2) {
 static inline int msgpack_pack_long_long(msgpack_packer* x, long long d)
 {
-#if defined(SIZEOF_LONG_LONG)
-#if SIZEOF_LONG_LONG == 2
-    msgpack_pack_real_int16(x, d);
-#elif SIZEOF_LONG_LONG == 4
-    msgpack_pack_real_int32(x, d);
-#else
     msgpack_pack_real_int64(x, d);
-#endif
-
-#elif defined(LLONG_MAX)
-#if LLONG_MAX == 0x7fffL
-    msgpack_pack_real_int16(x, d);
-#elif LLONG_MAX == 0x7fffffffL
-    msgpack_pack_real_int32(x, d);
-#else
-    msgpack_pack_real_int64(x, d);
-#endif
-
-#else
-if(sizeof(long long) == 2) {
-    msgpack_pack_real_int16(x, d);
-} else if(sizeof(long long) == 4) {
-    msgpack_pack_real_int32(x, d);
-} else {
-    msgpack_pack_real_int64(x, d);
-}
-#endif
 }
 
-static inline int msgpack_pack_unsigned_short(msgpack_packer* x, unsigned short d)
-{
-#if defined(SIZEOF_SHORT)
-#if SIZEOF_SHORT == 2
-    msgpack_pack_real_uint16(x, d);
-#elif SIZEOF_SHORT == 4
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#elif defined(USHRT_MAX)
-#if USHRT_MAX == 0xffffU
-    msgpack_pack_real_uint16(x, d);
-#elif USHRT_MAX == 0xffffffffU
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#else
-if(sizeof(unsigned short) == 2) {
-    msgpack_pack_real_uint16(x, d);
-} else if(sizeof(unsigned short) == 4) {
-    msgpack_pack_real_uint32(x, d);
-} else {
-    msgpack_pack_real_uint64(x, d);
-}
-#endif
-}
-
-static inline int msgpack_pack_unsigned_int(msgpack_packer* x, unsigned int d)
-{
-#if defined(SIZEOF_INT)
-#if SIZEOF_INT == 2
-    msgpack_pack_real_uint16(x, d);
-#elif SIZEOF_INT == 4
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#elif defined(UINT_MAX)
-#if UINT_MAX == 0xffffU
-    msgpack_pack_real_uint16(x, d);
-#elif UINT_MAX == 0xffffffffU
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#else
-if(sizeof(unsigned int) == 2) {
-    msgpack_pack_real_uint16(x, d);
-} else if(sizeof(unsigned int) == 4) {
-    msgpack_pack_real_uint32(x, d);
-} else {
-    msgpack_pack_real_uint64(x, d);
-}
-#endif
-}
-
-static inline int msgpack_pack_unsigned_long(msgpack_packer* x, unsigned long d)
-{
-#if defined(SIZEOF_LONG)
-#if SIZEOF_LONG == 2
-    msgpack_pack_real_uint16(x, d);
-#elif SIZEOF_LONG == 4
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#elif defined(ULONG_MAX)
-#if ULONG_MAX == 0xffffUL
-    msgpack_pack_real_uint16(x, d);
-#elif ULONG_MAX == 0xffffffffUL
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#else
-if(sizeof(unsigned long) == 2) {
-    msgpack_pack_real_uint16(x, d);
-} else if(sizeof(unsigned long) == 4) {
-    msgpack_pack_real_uint32(x, d);
-} else {
-    msgpack_pack_real_uint64(x, d);
-}
-#endif
-}
-
 static inline int msgpack_pack_unsigned_long_long(msgpack_packer* x, unsigned long long d)
 {
-#if defined(SIZEOF_LONG_LONG)
-#if SIZEOF_LONG_LONG == 2
-    msgpack_pack_real_uint16(x, d);
-#elif SIZEOF_LONG_LONG == 4
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#elif defined(ULLONG_MAX)
-#if ULLONG_MAX == 0xffffUL
-    msgpack_pack_real_uint16(x, d);
-#elif ULLONG_MAX == 0xffffffffUL
-    msgpack_pack_real_uint32(x, d);
-#else
-    msgpack_pack_real_uint64(x, d);
-#endif
-
-#else
-if(sizeof(unsigned long long) == 2) {
-    msgpack_pack_real_uint16(x, d);
-} else if(sizeof(unsigned long long) == 4) {
-    msgpack_pack_real_uint32(x, d);
-} else {
     msgpack_pack_real_uint64(x, d);
-}
-#endif
 }
 
-//#undef msgpack_pack_inline_func_cint
-//#endif
-
 /*
@@ -810,11 +588,9 @@ static inline int msgpack_pack_timestamp(msgpack_packer* x, int64_t seconds, uin
 #undef TAKE8_32
 #undef TAKE8_64
 
-#undef msgpack_pack_real_uint8
 #undef msgpack_pack_real_uint16
 #undef msgpack_pack_real_uint32
 #undef msgpack_pack_real_uint64
-#undef msgpack_pack_real_int8
 #undef msgpack_pack_real_int16
 #undef msgpack_pack_real_int32
 #undef msgpack_pack_real_int64

@@ -47,7 +47,7 @@ static inline msgpack_unpack_object unpack_callback_root(unpack_user* u)
 static inline int unpack_callback_uint16(unpack_user* u, uint16_t d, msgpack_unpack_object* o)
 {
-    PyObject *p = PyInt_FromLong((long)d);
+    PyObject *p = PyLong_FromLong((long)d);
     if (!p)
         return -1;
     *o = p;
@@ -61,7 +61,7 @@ static inline int unpack_callback_uint8(unpack_user* u, uint8_t d, msgpack_unpac
 static inline int unpack_callback_uint32(unpack_user* u, uint32_t d, msgpack_unpack_object* o)
 {
-    PyObject *p = PyInt_FromSize_t((size_t)d);
+    PyObject *p = PyLong_FromSize_t((size_t)d);
     if (!p)
         return -1;
     *o = p;
@@ -74,7 +74,7 @@ static inline int unpack_callback_uint64(unpack_user* u, uint64_t d, msgpack_unp
     if (d > LONG_MAX) {
         p = PyLong_FromUnsignedLongLong((unsigned PY_LONG_LONG)d);
     } else {
-        p = PyInt_FromLong((long)d);
+        p = PyLong_FromLong((long)d);
     }
     if (!p)
         return -1;
@@ -84,7 +84,7 @@ static inline int unpack_callback_uint64(unpack_user* u, uint64_t d, msgpack_unp
 static inline int unpack_callback_int32(unpack_user* u, int32_t d, msgpack_unpack_object* o)
 {
-    PyObject *p = PyInt_FromLong(d);
+    PyObject *p = PyLong_FromLong(d);
     if (!p)
         return -1;
     *o = p;
@@ -107,7 +107,7 @@ static inline int unpack_callback_int64(unpack_user* u, int64_t d, msgpack_unpac
     if (d > LONG_MAX || d < LONG_MIN) {
         p = PyLong_FromLongLong((PY_LONG_LONG)d);
     } else {
-        p = PyInt_FromLong((long)d);
+        p = PyLong_FromLong((long)d);
     }
     *o = p;
     return 0;

@@ -0,0 +1,51 @@
+static inline int unpack_container_header(unpack_context* ctx, const char* data, Py_ssize_t len, Py_ssize_t* off)
+{
+    assert(len >= *off);
+    uint32_t size;
+    const unsigned char *const p = (unsigned char*)data + *off;
+
+#define inc_offset(inc) \
+    if (len - *off < inc) \
+        return 0; \
+    *off += inc;
+
+    switch (*p) {
+    case var_offset:
+        inc_offset(3);
+        size = _msgpack_load16(uint16_t, p + 1);
+        break;
+    case var_offset + 1:
+        inc_offset(5);
+        size = _msgpack_load32(uint32_t, p + 1);
+        break;
+#ifdef USE_CASE_RANGE
+    case fixed_offset + 0x0 ... fixed_offset + 0xf:
+#else
+    case fixed_offset + 0x0:
+    case fixed_offset + 0x1:
+    case fixed_offset + 0x2:
+    case fixed_offset + 0x3:
+    case fixed_offset + 0x4:
+    case fixed_offset + 0x5:
+    case fixed_offset + 0x6:
+    case fixed_offset + 0x7:
+    case fixed_offset + 0x8:
+    case fixed_offset + 0x9:
+    case fixed_offset + 0xa:
+    case fixed_offset + 0xb:
+    case fixed_offset + 0xc:
+    case fixed_offset + 0xd:
+    case fixed_offset + 0xe:
+    case fixed_offset + 0xf:
+#endif
+        ++*off;
+        size = ((unsigned int)*p) & 0x0f;
+        break;
+    default:
+        PyErr_SetString(PyExc_ValueError, "Unexpected type header on stream");
+        return -1;
+    }
+    unpack_callback_uint32(&ctx->user, size, &ctx->stack[0].obj);
+    return 1;
+}

@@ -75,8 +75,7 @@ static inline void unpack_clear(unpack_context *ctx)
     Py_CLEAR(ctx->stack[0].obj);
 }
 
-template <bool construct>
-static inline int unpack_execute(unpack_context* ctx, const char* data, Py_ssize_t len, Py_ssize_t* off)
+static inline int unpack_execute(bool construct, unpack_context* ctx, const char* data, Py_ssize_t len, Py_ssize_t* off)
 {
     assert(len >= *off);
@@ -386,6 +385,7 @@ _end:
 #undef construct_cb
 }
 
+#undef NEXT_CS
 #undef SWITCH_RANGE_BEGIN
 #undef SWITCH_RANGE
 #undef SWITCH_RANGE_DEFAULT
@@ -397,68 +397,27 @@ _end:
 #undef again_fixed_trail_if_zero
 #undef start_container
 
-template <unsigned int fixed_offset, unsigned int var_offset>
-static inline int unpack_container_header(unpack_context* ctx, const char* data, Py_ssize_t len, Py_ssize_t* off)
-{
-    assert(len >= *off);
-    uint32_t size;
-    const unsigned char *const p = (unsigned char*)data + *off;
-
-#define inc_offset(inc) \
-    if (len - *off < inc) \
-        return 0; \
-    *off += inc;
-
-    switch (*p) {
-    case var_offset:
-        inc_offset(3);
-        size = _msgpack_load16(uint16_t, p + 1);
-        break;
-    case var_offset + 1:
-        inc_offset(5);
-        size = _msgpack_load32(uint32_t, p + 1);
-        break;
-#ifdef USE_CASE_RANGE
-    case fixed_offset + 0x0 ... fixed_offset + 0xf:
-#else
-    case fixed_offset + 0x0:
-    case fixed_offset + 0x1:
-    case fixed_offset + 0x2:
-    case fixed_offset + 0x3:
-    case fixed_offset + 0x4:
-    case fixed_offset + 0x5:
-    case fixed_offset + 0x6:
-    case fixed_offset + 0x7:
-    case fixed_offset + 0x8:
-    case fixed_offset + 0x9:
-    case fixed_offset + 0xa:
-    case fixed_offset + 0xb:
-    case fixed_offset + 0xc:
-    case fixed_offset + 0xd:
-    case fixed_offset + 0xe:
-    case fixed_offset + 0xf:
-#endif
-        ++*off;
-        size = ((unsigned int)*p) & 0x0f;
-        break;
-    default:
-        PyErr_SetString(PyExc_ValueError, "Unexpected type header on stream");
-        return -1;
-    }
-    unpack_callback_uint32(&ctx->user, size, &ctx->stack[0].obj);
-    return 1;
+static int unpack_construct(unpack_context *ctx, const char *data, Py_ssize_t len, Py_ssize_t *off) {
+    return unpack_execute(1, ctx, data, len, off);
 }
 
-#undef SWITCH_RANGE_BEGIN
-#undef SWITCH_RANGE
-#undef SWITCH_RANGE_DEFAULT
-#undef SWITCH_RANGE_END
+static int unpack_skip(unpack_context *ctx, const char *data, Py_ssize_t len, Py_ssize_t *off) {
+    return unpack_execute(0, ctx, data, len, off);
+}
 
-static const execute_fn unpack_construct = &unpack_execute<true>;
-static const execute_fn unpack_skip = &unpack_execute<false>;
-static const execute_fn read_array_header = &unpack_container_header<0x90, 0xdc>;
-static const execute_fn read_map_header = &unpack_container_header<0x80, 0xde>;
+#define unpack_container_header read_array_header
+#define fixed_offset 0x90
+#define var_offset 0xdc
+#include "unpack_container_header.h"
+#undef unpack_container_header
+#undef fixed_offset
+#undef var_offset
 
-#undef NEXT_CS
+#define unpack_container_header read_map_header
+#define fixed_offset 0x80
+#define var_offset 0xde
+#include "unpack_container_header.h"
+#undef unpack_container_header
+#undef fixed_offset
+#undef var_offset
 
 /* vim: set ts=4 sw=4 sts=4 expandtab */
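The refactoring above generates read_array_header and read_map_header from the shared unpack_container_header body instead of C++ templates. For reference, a minimal sketch of how those entry points surface in the Python API (example values only):

```
from msgpack import Unpacker, packb

unpacker = Unpacker()
unpacker.feed(packb([1, 2, 3]))

# read_array_header() consumes just the container header and returns its size;
# the elements can then be unpacked (or skipped) one at a time.
n = unpacker.read_array_header()
assert n == 3
assert [unpacker.unpack() for _ in range(n)] == [1, 2, 3]
```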

@@ -1,35 +1,23 @@
 [build-system]
-requires = [
-    # Also declared in requirements.txt, if updating here please also update
-    # there
-    "Cython~=3.0.0",
-    "setuptools >= 35.0.2",
-]
+requires = ["setuptools >= 78.1.1"]
 build-backend = "setuptools.build_meta"
 
 [project]
 name = "msgpack"
 dynamic = ["version"]
-license = {text="Apache 2.0"}
+license = "Apache-2.0"
 authors = [{name="Inada Naoki", email="songofacandy@gmail.com"}]
 description = "MessagePack serializer"
 readme = "README.md"
 keywords = ["msgpack", "messagepack", "serializer", "serialization", "binary"]
-requires-python = ">=3.8"
+requires-python = ">=3.10"
 classifiers = [
     "Development Status :: 5 - Production/Stable",
     "Operating System :: OS Independent",
-    "Programming Language :: Python",
-    "Programming Language :: Python :: 3",
-    "Programming Language :: Python :: 3.8",
-    "Programming Language :: Python :: 3.9",
-    "Programming Language :: Python :: 3.10",
-    "Programming Language :: Python :: 3.11",
-    "Programming Language :: Python :: 3.12",
+    "Topic :: File Formats",
+    "Intended Audience :: Developers",
     "Programming Language :: Python :: Implementation :: CPython",
     "Programming Language :: Python :: Implementation :: PyPy",
-    "Intended Audience :: Developers",
-    "License :: OSI Approved :: Apache Software License",
 ]
 
 [project.urls]
@@ -39,20 +27,19 @@ Repository = "https://github.com/msgpack/msgpack-python/"
 Tracker = "https://github.com/msgpack/msgpack-python/issues"
 Changelog = "https://github.com/msgpack/msgpack-python/blob/main/ChangeLog.rst"
 
+[tool.setuptools]
+# Do not install C/C++/Cython source files
+include-package-data = false
+
 [tool.setuptools.dynamic]
 version = {attr = "msgpack.__version__"}
 
-[tool.black]
-line-length = 100
-target-version = ["py37"]
-skip_string_normalization = true
-
 [tool.ruff]
 line-length = 100
-target-version = "py38"
-ignore = []
-
-[tool.ruff.per-file-ignores]
-"msgpack/__init__.py" = ["F401", "F403"]
-"msgpack/fallback.py" = ["E731"]
-"test/test_seq.py" = ["E501"]
+target-version = "py310"
+lint.select = [
+    "E",  # pycodestyle
+    "F",  # Pyflakes
+    "I",  # isort
+    #"UP", pyupgrade
+]

@@ -1,7 +1,3 @@
-# Also declared in pyproject.toml, if updating here please also update there.
-Cython~=3.0.0
-
-# Tools required only for development. No need to add it to pyproject.toml file.
-black==23.3.0
-pytest==7.3.1
-pyupgrade==3.3.2
+Cython==3.1.4
+setuptools==78.1.1
+build

setup.py  Executable file → Normal file

@@ -1,81 +1,24 @@
 #!/usr/bin/env python
 import os
 import sys
-from setuptools import setup, Extension
-from setuptools.command.build_ext import build_ext
-from setuptools.command.sdist import sdist
+
+from setuptools import Extension, setup
 
 PYPY = hasattr(sys, "pypy_version_info")
 
-
-class NoCython(Exception):
-    pass
-
-
-try:
-    import Cython.Compiler.Main as cython_compiler
-
-    have_cython = True
-except ImportError:
-    have_cython = False
-
-
-def cythonize(src):
-    sys.stderr.write(f"cythonize: {src!r}\n")
-    cython_compiler.compile([src], cplus=True)
-
-
-def ensure_source(src):
-    pyx = os.path.splitext(src)[0] + ".pyx"
-    if not os.path.exists(src):
-        if not have_cython:
-            raise NoCython
-        cythonize(pyx)
-    elif os.path.exists(pyx) and os.stat(src).st_mtime < os.stat(pyx).st_mtime and have_cython:
-        cythonize(pyx)
-    return src
-
-
-class BuildExt(build_ext):
-    def build_extension(self, ext):
-        try:
-            ext.sources = list(map(ensure_source, ext.sources))
-        except NoCython:
-            print("WARNING")
-            print("Cython is required for building extension from checkout.")
-            print("Install Cython >= 0.16 or install msgpack from PyPI.")
-            print("Falling back to pure Python implementation.")
-            return
-        try:
-            return build_ext.build_extension(self, ext)
-        except Exception as e:
-            print("WARNING: Failed to compile extension modules.")
-            print("msgpack uses fallback pure python implementation.")
-            print(e)
-
-
-# Cython is required for sdist
-class Sdist(sdist):
-    def __init__(self, *args, **kwargs):
-        cythonize("msgpack/_cmsgpack.pyx")
-        sdist.__init__(self, *args, **kwargs)
-
 
 libraries = []
 macros = []
+ext_modules = []
 
 if sys.platform == "win32":
     libraries.append("ws2_32")
     macros = [("__LITTLE_ENDIAN__", "1")]
 
-ext_modules = []
 if not PYPY and not os.environ.get("MSGPACK_PUREPYTHON"):
     ext_modules.append(
         Extension(
             "msgpack._cmsgpack",
-            sources=["msgpack/_cmsgpack.cpp"],
+            sources=["msgpack/_cmsgpack.c"],
             libraries=libraries,
             include_dirs=["."],
             define_macros=macros,
@@ -83,9 +26,7 @@ if not PYPY and not os.environ.get("MSGPACK_PUREPYTHON"):
     )
     del libraries, macros
 
 setup(
-    cmdclass={"build_ext": BuildExt, "sdist": Sdist},
     ext_modules=ext_modules,
     packages=["msgpack"],
 )
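With the Cython build machinery removed, setup.py only compiles the pre-generated C source, and the extension is skipped on PyPy or when MSGPACK_PUREPYTHON is set. A quick, non-authoritative way to check which implementation an installed msgpack ended up with:

```
import msgpack

# "msgpack._cmsgpack" -> the C extension is in use;
# "msgpack.fallback"  -> the pure-Python implementation was selected.
print(msgpack.Packer.__module__)
print(msgpack.Unpacker.__module__)
```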

@@ -1,6 +1,6 @@
-#!/usr/bin/env python
+from pytest import raises
 
-from msgpack import packb, unpackb
+from msgpack import Packer, packb, unpackb
 
 
 def test_unpack_buffer():
@@ -17,7 +17,7 @@ def test_unpack_bytearray():
     obj = unpackb(buf, use_list=1)
     assert [b"foo", b"bar"] == obj
     expected_type = bytes
-    assert all(type(s) == expected_type for s in obj)
+    assert all(type(s) is expected_type for s in obj)
 
 
 def test_unpack_memoryview():
@@ -26,4 +26,24 @@ def test_unpack_memoryview():
     obj = unpackb(view, use_list=1)
     assert [b"foo", b"bar"] == obj
     expected_type = bytes
-    assert all(type(s) == expected_type for s in obj)
+    assert all(type(s) is expected_type for s in obj)
+
+
+def test_packer_getbuffer():
+    packer = Packer(autoreset=False)
+    packer.pack_array_header(2)
+    packer.pack(42)
+    packer.pack("hello")
+    buffer = packer.getbuffer()
+    assert isinstance(buffer, memoryview)
+    assert bytes(buffer) == b"\x92*\xa5hello"
+
+    if Packer.__module__ == "msgpack._cmsgpack":  # only for Cython
+        # cython Packer supports buffer protocol directly
+        assert bytes(packer) == b"\x92*\xa5hello"
+        with raises(BufferError):
+            packer.pack(42)
+        buffer.release()
+        packer.pack(42)
+        assert bytes(packer) == b"\x92*\xa5hello*"

@@ -1,10 +1,11 @@
 #!/usr/bin/env python
-from pytest import raises
-from msgpack import packb, unpackb, Unpacker, FormatError, StackError, OutOfData
-
 import datetime
 
+from pytest import raises
+
+from msgpack import FormatError, OutOfData, StackError, Unpacker, packb, unpackb
+
 
 class DummyException(Exception):
     pass

@@ -1,4 +1,5 @@
 import array
+
 import msgpack
 from msgpack import ExtType

@@ -2,14 +2,14 @@
 import pytest
 
 from msgpack import (
-    packb,
-    unpackb,
-    Packer,
-    Unpacker,
     ExtType,
+    Packer,
     PackOverflowError,
     PackValueError,
+    Unpacker,
     UnpackValueError,
+    packb,
+    unpackb,
 )

@@ -1,6 +1,7 @@
 #!/usr/bin/env python
 from array import array
+
 from msgpack import packb, unpackb
@@ -95,4 +96,4 @@ def test_multidim_memoryview():
     view = memoryview(b"\00" * 6)
     data = view.cast(view.format, (3, 2))
     packed = packb(data)
-    assert packed == b'\xc4\x06\x00\x00\x00\x00\x00\x00'
+    assert packed == b"\xc4\x06\x00\x00\x00\x00\x00\x00"

@@ -1,4 +1,4 @@
-from msgpack import packb, unpackb, ExtType
+from msgpack import ExtType, packb, unpackb
 
 
 def test_str8():

@@ -1,6 +1,7 @@
 #!/usr/bin/env python
 from pytest import raises
+
 from msgpack import packb, unpackb

@@ -1,12 +1,12 @@
 #!/usr/bin/env python
+import struct
 from collections import OrderedDict
 from io import BytesIO
-import struct
 
 import pytest
 
-from msgpack import packb, unpackb, Unpacker, Packer
+from msgpack import Packer, Unpacker, packb, unpackb
 
 
 def check(data, use_list=False):
@@ -89,7 +89,7 @@ def testStrictUnicodeUnpack():
 def testIgnoreErrorsPack():
     re = unpackb(
-        packb("abc\uDC80\uDCFFdef", use_bin_type=True, unicode_errors="ignore"),
+        packb("abc\udc80\udcffdef", use_bin_type=True, unicode_errors="ignore"),
         raw=False,
         use_list=1,
     )

@@ -1,5 +1,6 @@
 """Test Unpacker's read_array_header and read_map_header methods"""
-from msgpack import packb, Unpacker, OutOfData
+
+from msgpack import OutOfData, Unpacker, packb
 
 UnexpectedTypeException = ValueError

@@ -1,8 +1,8 @@
-#!/usr/bin/env python
+# ruff: noqa: E501
+# ignore line length limit for long comments
 import io
-import msgpack
+
+import msgpack
 
 binarydata = bytes(bytearray(range(256)))

@@ -1,10 +1,11 @@
 #!/usr/bin/env python
 import io
 
-from msgpack import Unpacker, BufferFull
-from msgpack import pack, packb
-from msgpack.exceptions import OutOfData
 from pytest import raises
 
+from msgpack import BufferFull, Unpacker, pack, packb
+from msgpack.exceptions import OutOfData
+
 
 def test_partialdata():
     unpacker = Unpacker()

@@ -1,5 +1,6 @@
 from collections import namedtuple
-from msgpack import packb, unpackb, ExtType
+
+from msgpack import ExtType, packb, unpackb
 
 
 def test_namedtuple():

@@ -1,8 +1,9 @@
 #!/usr/bin/env python
-from msgpack import packb
 from collections import namedtuple
 
+from msgpack import packb
+
 
 class MyList(list):
     pass

@@ -1,5 +1,7 @@
-import pytest
 import datetime
 
+import pytest
+
 import msgpack
 from msgpack.ext import Timestamp
@@ -86,6 +88,21 @@ def test_timestamp_datetime():
     utc = datetime.timezone.utc
     assert t.to_datetime() == datetime.datetime(1970, 1, 1, 0, 0, 42, 0, tzinfo=utc)
 
+    ts = datetime.datetime(2024, 4, 16, 8, 43, 9, 420317, tzinfo=utc)
+    ts2 = datetime.datetime(2024, 4, 16, 8, 43, 9, 420318, tzinfo=utc)
+    assert (
+        Timestamp.from_datetime(ts2).nanoseconds - Timestamp.from_datetime(ts).nanoseconds == 1000
+    )
+
+    ts3 = datetime.datetime(2024, 4, 16, 8, 43, 9, 4256)
+    ts4 = datetime.datetime(2024, 4, 16, 8, 43, 9, 4257)
+    assert (
+        Timestamp.from_datetime(ts4).nanoseconds - Timestamp.from_datetime(ts3).nanoseconds == 1000
+    )
+
+    assert Timestamp.from_datetime(ts).to_datetime() == ts
+
 
 def test_unpack_datetime():
     t = Timestamp(42, 14)

@@ -1,12 +1,9 @@
-from io import BytesIO
 import sys
-from msgpack import Unpacker, packb, OutOfData, ExtType
-from pytest import raises, mark
+from io import BytesIO
 
-try:
-    from itertools import izip as zip
-except ImportError:
-    pass
+from pytest import mark, raises
+
+from msgpack import ExtType, OutOfData, Unpacker, packb
 
 
 def test_unpack_array_header_from_file():

tox.ini

@@ -1,38 +0,0 @@
-[tox]
-envlist =
-    {py35,py36,py37,py38}-{c,pure},
-    {pypy,pypy3}-pure,
-    py34-x86,
-    sphinx,
-isolated_build = true
-
-[testenv]
-deps=
-    pytest
-changedir=test
-commands=
-    c,x86: python -c 'from msgpack import _cmsgpack'
-    c,x86: py.test
-    pure: py.test
-setenv=
-    pure: MSGPACK_PUREPYTHON=x
-
-[testenv:py34-x86]
-basepython=python3.4-x86
-deps=
-    pytest
-changedir=test
-commands=
-    python -c 'import sys; print(hex(sys.maxsize))'
-    python -c 'from msgpack import _cmsgpack'
-    py.test
-
-[testenv:sphinx]
-changedir = docs
-deps =
-    sphinx
-commands =
-    sphinx-build -n -v -W --keep-going -b html -d {envtmpdir}/doctrees . {envtmpdir}/html