| Commit message | Author | Age | Files | Lines |
PyArray_ISFORTRAN was used to implement array_is_fortran in numpy.i when
what was wanted was PyArray_IS_F_CONTIGUOUS. The difference is that
PyArray_ISFORTRAN returns False if the array is also C-contiguous.
Prior to relaxed stride checking this did not matter, but now arrays
with ndim > 1 may be both C- and Fortran-contiguous, which results in
errors when PyArray_ISFORTRAN is mistakenly used to check for Fortran
contiguity.
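The distinction can be seen at the Python level too: the `flags.fnc` attribute ("Fortran and not C") mirrors PyArray_ISFORTRAN's semantics, while `flags.f_contiguous` mirrors PyArray_IS_F_CONTIGUOUS. A minimal illustration:

```python
import numpy as np

# With relaxed stride checking, a (3, 1) array is both C- and
# Fortran-contiguous.
a = np.ones((3, 1))
print(a.flags.c_contiguous, a.flags.f_contiguous)  # True True

# flags.fnc ("Fortran and not C") mirrors PyArray_ISFORTRAN: it is False
# here even though the array IS Fortran-contiguous -- exactly the mistake
# described above.
print(a.flags.fnc)           # False
print(a.flags.f_contiguous)  # True -- what PyArray_IS_F_CONTIGUOUS checks
```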
All jobs currently run on Travis's legacy infrastructure - which supports sudo.
The newer container-based infrastructure is faster, but doesn't allow sudo.
This patch
- sets sudo=false for all jobs, except the chroot job
- uses Travis's apt addon to install all packages
- installs eatmydata for all jobs to reduce disk IO
- removes the tmpfs workaround for chroot builds
Added the (DATA_TYPE* INPLACE_ARRAY_FLAT, DIM_TYPE DIM_FLAT) typemap signature.
Added unit tests and updated the documentation.
Follow-up of gh-5829, which got merged too early.
TST: re-enable TravisCI testing with Bento.
Disabling was done in gh-5708, due to the Waf download site being down for a
while.
Module tests whether we can run f2py and return correct version.
Skip this test when running in-place (we don't install f2py in that
case).
Use our own virtualenvs in travis-ci to avoid picking up travis' numpy.
Reason: numpy.i is supposed to be copied, not used from within an installed
Numpy version.
Closes gh-5690
[ci skip]
For example, if we had 256 errors, the process would exit with a
successful status code (exit statuses are truncated to 8 bits), which is
incorrect and misleading.
Signed-off-by: Chris Lamb <chris@chris-lamb.co.uk>
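A minimal sketch of the underlying pitfall and the fix (the helper name is illustrative, not the project's actual code): the OS keeps only the low 8 bits of an exit status, so passing a raw error count to exit() can wrap around to 0.

```python
# Hypothetical illustration: exit statuses are taken modulo 256 by the OS.
def exit_status(n_errors):
    # Collapse the count to 0/1 so 256 errors cannot masquerade as success.
    return 1 if n_errors else 0

print(256 % 256)         # 0 -- a raw count of 256 would look "successful"
print(exit_status(256))  # 1
print(exit_status(0))    # 0
```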
numpy.i now includes ready-made typemaps for std::complex<float>
and std::complex<double>. Tests were added to testArray using
a newly defined ArrayZ class.
Allow extensions using numpy.distutils to compile in parallel.
By passing `--jobs=n` or `-j n` to `setup.py build`, the compilation of
extensions is now performed in `n` parallel processes.
Additionally, the environment variable NPY_NUM_BUILD_JOBS is used as
the default value; if it's unset, the default is serial compilation.
The parallelization is limited to the files within a single extension, so
only numpy's multiarray module really profits, but it's still a nice
improvement when you have 2-4 cores.
Unfortunately Cython will not profit at all, as it tends to build one
module per file.
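The default-resolution logic described above can be sketched as follows (the helper name is assumed for illustration, not NumPy's actual code):

```python
import os

def default_build_jobs(environ=os.environ):
    # NPY_NUM_BUILD_JOBS supplies the default job count; when it is unset
    # the build falls back to serial compilation (one job).
    try:
        return int(environ.get("NPY_NUM_BUILD_JOBS", "1"))
    except ValueError:
        return 1

print(default_build_jobs({}))                           # 1 (serial)
print(default_build_jobs({"NPY_NUM_BUILD_JOBS": "4"}))  # 4
```

On the command line this corresponds to e.g. `python setup.py build -j 4`.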
BUG: Make numpy import when run with Python flag '-OO'
MAINT: start 1.10-devel.
When a Cython object contains numpy arrays, the pure-Python allocation
hook can trigger during Cython's __dealloc__ method and trigger a second
deletion of the object currently being deleted.
To minimize the probability that this happens, disable the garbage
collector during the hook.
As this involves Python calls, it is still possible that a double delete
occurs, but the chances are lowered; a proper solution would be a C-only
hook like Python 3.4's tracemalloc module.
Closes gh-4834
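A sketch of the guard this change adds (the hook name and body are illustrative): the collector is disabled for the duration of the pure-Python hook and its previous state restored afterwards.

```python
import gc

def allocation_hook():
    # Disable the GC while the pure-Python hook runs so a collection cannot
    # re-enter __dealloc__ of the object currently being freed; restore the
    # previous collector state afterwards.
    was_enabled = gc.isenabled()
    gc.disable()
    try:
        pass  # pure-Python bookkeeping would happen here
    finally:
        if was_enabled:
            gc.enable()

gc.enable()
allocation_hook()
print(gc.isenabled())  # True -- prior state restored
```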
The typemaps with size parameters after the array pointer were correct,
but the typemaps with size parameters before the array pointer created
arrays with C ordering.
Building with SCons seems to fail with wine 1.6, but one only needs to run
three commands, so it's simpler to just put these into a script instead of
trying to debug SCons.
The argout typemaps of all the ARGOUTVIEWM typemap suites make use of the
free_cap function, defined in the NumPy_Utilities fragment, which was not
listed in the fragment argument of the typemaps.
As a result, the free_cap function is not included in the source code
generated by SWIG, and the code does not compile (unless another typemap
including the missing fragment has been used).
BUG: cythonize sources for wheel build
Add travis build that builds and installs wheel, runs tests from
installed wheel.
travis boxes are again badly out of date
Allows for better static analysis reports with e.g. cpychecker.
Instead generate at build time. The generated sources are still part of
the sdist.
tools/cythonize.py is copied from SciPy with small changes to the
configuration.
Also update MANIFEST.in and documentation to reflect the move. The
discussion of this change is at #2384.
Closes #2384. Closes #4374.
Run a manual apt-get update to pick up the latest py3 security update.
To do that, fix a missing $DIST variable usage in the test script.
During bootstrap, eatmydata is not available in the chroot, so
bootstrapping takes a significant amount of time. Avoid this by placing
the whole chroot in a tmpfs ramdisk.
Required for Windows compilers.
python-dbg adds a couple of extra asserts on reference counts and memory
allocation, and also enables a few numpy internal asserts.
Implemented by setting up a chroot in the Travis VM and moving all the
test logic, including the Bento build, to a single script.
This is still reasonably fast, about twice as slow as the normal tests.
When Travis updates to a newer OS it can be replaced by standard
cross-compiling.
Works with Python 3.x now that bdist_mpkg is ported (thanks @matthew-brett).
Run the 2to3 ws_comma fixer on *.py files. Some lines are now too long
and will need to be broken at some point. OTOH, some lines were already
too long and need to be broken at some point. Now seems as good a time
as any to do this with open PRs at a minimum.
Now is as good a time as any, with open PRs at a low.
These scripts have been broken for years (build.py errors out
immediately if not running from an *svn* checkout), so I'm sure
no-one's using them, and they only attract pointless maintenance fixes
(py3 changes, keeping the pointless extra copy of the top-level
README.txt up to date as in #3288). Let's just remove them.
BUG: Fix some README links to point to www.numpy.org
Fixes #3288
The idioms fixer makes the following replacements.
1) int <- bool
2) comparison or identity of types <- isinstance
3) a.sort() <- sorted(a)
There were two problems that needed to be dealt with after the
application of the fixer. First, the replacement of comparison or
identity of types by isinstance was not always correct. The isinstance
function returns true for subtypes whereas many of the places where the
fixer made a substitution needed to check for exact type equality.
Second, the sorted function was applied to arrays, but because it treats
them as iterables and constructs a sorted list from the result, that is
the wrong thing to do.
Closes #3062.
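Both pitfalls can be seen directly (a minimal stdlib illustration, not NumPy code):

```python
# Pitfall 1: isinstance matches subtypes, while many call sites needed an
# exact type check.
class MyInt(int):
    pass

x = MyInt(3)
print(isinstance(x, int))  # True  -- accepts the subtype
print(type(x) is int)      # False -- exact type equality, as required

# Pitfall 2: sorted() consumes its argument as an iterable and always
# returns a plain list, so applying it to an array discards the array type.
print(sorted((3, 1, 2)))   # [1, 2, 3] -- a list, whatever the input was
```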
The tools/py3tool.py file was responsible for running 2to3. Now that
2to3 is no longer run it is not needed and can be removed.
The unicode fixer strips the u from u'hi' and converts the unicode type
to str. The first won't work for Python 2 and instead we replace the u
prefix with the sixu function borrowed from the six compatibility
package. That function calls the unicode constructor with the
'unicode_escape' encoder so that the many tests using escaped unicode
characters like u'\u0900' will be handled correctly. That makes the
sixu function a bit different from the asunicode function currently in
numpy.compat and also provides a target that can be converted back to
the u prefix when support for Python 3.2 is dropped. Python 3.3
reintroduced the u prefix for compatibility.
The unicode fixer also replaces 'unicode' with 'str' as 'unicode' is no
longer a builtin in Python 3. For code compatibility, 'unicode' is
defined either as 'str' or 'unicode' in numpy.compat so that checks like
if isinstance(x, unicode):
...
will work properly for all python versions.
Closes #3089.
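A minimal sketch of sixu as described above (numpy.compat's actual definition may differ):

```python
import sys

if sys.version_info[0] >= 3:
    def sixu(s):
        # On Python 3, string literals are already unicode.
        return s
else:
    def sixu(s):
        # On Python 2, interpret escapes like '\u0900' via the
        # unicode_escape codec, as the commit message describes.
        return unicode(s, 'unicode_escape')  # noqa: F821

print(sixu('abc'))  # 'abc'
```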
2to3: Apply types fixer.
Python 3 removes the builtin types from the types module. The types
fixer replaces such references with the builtin types where possible
and also takes care of some special cases:
types.NoneType <- type(None)
types.NotImplementedType <- type(NotImplemented)
types.EllipsisType <- type(Ellipsis)
The only two tricky substitutions are
types.StringType <- bytes
types.LongType <- int
These are fixed up to support both Python 3 and Python 2 code by
importing the long and bytes types from numpy.compat.
Closes #3240.
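The builtin equivalents the fixer substitutes can be checked directly:

```python
# Each removed types-module name has a builtin-derived replacement:
print(type(None).__name__)            # NoneType
print(type(NotImplemented).__name__)  # NotImplementedType
print(type(Ellipsis).__name__)        # ellipsis

# The two "tricky" names resolve to bytes/int under Python 3 semantics:
print(type(b'x').__name__)  # bytes
print(type(1).__name__)     # int
```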