* [3.8] bpo-38588: Fix possible crashes in dict and list when calling PyObject_RichCompareBool (GH-17734)
Take strong references before calling PyObject_RichCompareBool to protect against the case
where the object dies during the call.
(cherry picked from commit 2d5bf568ea)
Co-authored-by: Dong-hee Na <donghee.na92@gmail.com>
* Update Objects/listobject.c
Apply @methane's suggestion.
Co-authored-by: Inada Naoki <songofacandy@gmail.com>
Hold strong references to list elements while calling PyObject_RichCompareBool().
(cherry picked from commit d9e561d23d)
Co-authored-by: Zackery Spytz <zspytz@gmail.com>
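A minimal sketch of the pattern described above, assuming a membership-test loop like the one in Objects/listobject.c (the function name here is illustrative, not the exact CPython code): the element gets a strong reference before PyObject_RichCompareBool(), so a comparison that mutates the list cannot free the object mid-call.

```c
#include <Python.h>

static int
list_contains_sketch(PyListObject *a, PyObject *el)
{
    int cmp = 0;
    for (Py_ssize_t i = 0; cmp == 0 && i < Py_SIZE(a); i++) {
        PyObject *item = PyList_GET_ITEM(a, i);
        Py_INCREF(item);   /* keep the element alive across the call */
        cmp = PyObject_RichCompareBool(item, el, Py_EQ);
        Py_DECREF(item);   /* drop the temporary strong reference */
    }
    return cmp;            /* 1 if found, 0 if not, -1 on error */
}
```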
…nctions with asserts
The actual overflow can never happen because of the following:
* The size of a list can't be greater than PY_SSIZE_T_MAX / sizeof(PyObject*).
* The size of a pointer on all supported platforms is at least 4 bytes.
* ofs is positive and less than the list size at the beginning of each iteration (see the sketch below).
https://bugs.python.org/issue35091
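A minimal sketch of why an assert suffices, assuming a gallop-style doubling of the offset as in listobject.c's sort (the function name is illustrative): from the three points above, ofs < PY_SSIZE_T_MAX / 4, so ofs*2 + 1 is at most PY_SSIZE_T_MAX / 2 + 1 and cannot overflow Py_ssize_t.

```c
#include <Python.h>
#include <assert.h>

static Py_ssize_t
next_gallop_offset(Py_ssize_t ofs, Py_ssize_t list_size)
{
    /* Loop invariant from the points above: 0 < ofs < list_size, and
     * list_size <= PY_SSIZE_T_MAX / sizeof(PyObject *), with pointers of
     * at least 4 bytes.  The doubling below therefore cannot overflow,
     * so a debug-only assert replaces the old runtime overflow check. */
    assert(0 < ofs && ofs < list_size);
    return (ofs << 1) + 1;
}
```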
Add new trashcan macros to deal with a double deallocation that could occur when the `tp_dealloc` of a subclass calls the `tp_dealloc` of a base class and that base class uses the trashcan mechanism.
Patch by Jeroen Demeyer.
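A hypothetical `tp_dealloc` using the new macros (the type and field names are made up; the macro usage follows the pattern this change introduces): `Py_TRASHCAN_BEGIN` only engages the trashcan when the type's own `tp_dealloc` is the one running, so a base-class `tp_dealloc` reached from a subclass does not enter the mechanism a second time.

```c
#include <Python.h>

typedef struct {
    PyObject_HEAD
    PyObject *payload;   /* hypothetical field */
} mythingobject;

static void
mything_dealloc(mythingobject *self)
{
    PyObject_GC_UnTrack(self);
    Py_TRASHCAN_BEGIN(self, mything_dealloc)
    Py_CLEAR(self->payload);
    Py_TYPE(self)->tp_free((PyObject *)self);
    Py_TRASHCAN_END
}
```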
There is already a `Py_ssize_t i` defined at function scope that is used
for similar loops. Removing the local `int i` declaration makes the loop use
that `i`, which has the appropriate type.
The accu.h header is no longer part of the Python C API: it has been
moved to the "internal" headers which are restricted to Python
itself.
Replace `#include "accu.h"` with `#include "pycore_accu.h"`.
The list() constructor isn't taking full advantage of known input
lengths or length hints. This commit makes the constructor
pre-size and not over-allocate when the input size is known (the
input collection implements __len__). One of the main advantages is
that this saves about 12% of memory: the difference between
over-allocating and allocating exactly the input size.
For efficiency purposes and to avoid a performance regression for small
generators and collections, the size of the input object is calculated using
__len__ and not __length_hint__, as the latter is considerably slower.
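A hypothetical sketch of the pre-sizing idea (the helper name and details are made up, not the exact CPython implementation): if the input answers __len__, the item buffer is allocated to exactly that size up front, so the over-allocating append growth path is never used.

```c
#include <Python.h>

static int
list_presize_from_len(PyListObject *self, PyObject *iterable)
{
    Py_ssize_t n = PyObject_Size(iterable);   /* consults __len__ only */
    if (n < 0) {
        PyErr_Clear();          /* no usable __len__: keep the normal path */
        return 0;
    }
    if (n > 0) {
        PyObject **items = PyMem_New(PyObject *, n);
        if (items == NULL) {
            PyErr_NoMemory();
            return -1;
        }
        self->ob_item = items;
        self->allocated = n;    /* exact capacity, no over-allocation */
    }
    return 0;
}
```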
METH_NOARGS functions need only a single argument, but they are cast to a
PyCFunction, which takes two arguments. This triggers an invalid function
cast warning in GCC 8 due to the argument mismatch.
Fix this by adding a dummy unused argument.
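An illustrative method showing the fix (the method and table names are hypothetical): declaring a second, unused parameter gives the function the (PyObject *, PyObject *) shape that PyCFunction expects, so the cast in the method table no longer triggers GCC 8's -Wcast-function-type warning. For METH_NOARGS the interpreter always passes NULL for that parameter.

```c
#include <Python.h>

static PyObject *
widget_reset(PyObject *self, PyObject *Py_UNUSED(ignored))
{
    /* The second argument is always NULL for METH_NOARGS; declaring it
     * (unused) silences the invalid-cast warning. */
    Py_RETURN_NONE;
}

static PyMethodDef widget_methods[] = {
    {"reset", (PyCFunction)widget_reset, METH_NOARGS, "Reset the widget."},
    {NULL, NULL, 0, NULL}
};
```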