Previously, an empty array resulting from split was always 1-D.
In Numpy 1.9 a FutureWarning was raised to notify users that it was
planned to preserve the dimensions of empty arrays in a future numpy
release. This removes the FutureWarning and implements preservation of
dimensions.
Note that there was a bug in numpy 1.9: the dimensions of empty
arrays were already preserved in some cases and no warning was issued.
This PR fixes that inconsistency by preserving the dimensions in all
cases rather than fixing the bug, as the dimension preserving behavior
was already depended on by some users. See the discussion in gh-6575
about this change.
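For instance, with the new behaviour the trailing dimension of an empty
piece is kept (a quick sketch, assuming the post-1.10 semantics):
>>> import numpy as np
>>> a, b, c = np.array_split(np.arange(8).reshape(2, 4), 3)
>>> a.shape, b.shape, c.shape   # the empty piece keeps its second dimension
((1, 4), (1, 4), (0, 4))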
BUG: Make sure warning for array split is always applied
Zero-sized arrays can also occur in any of the partitions sub_arys[i]
produced by array_split, not just in the final partition sub_arys[-1].
Modified by seberg.
Closes gh-5771
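A small illustration of the case being covered (shapes shown for a
release where empty dimensions are preserved):
>>> import numpy as np
>>> parts = np.array_split(np.array([[1, 2]]), 3)
>>> [p.shape for p in parts]    # empty partitions are not only the last one
[(1, 2), (0, 2), (0, 2)]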
ENH: add np.stack
|
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| |
| | |
The motivation here is to present a uniform and N-dimensional interface for
joining arrays along a new axis, similarly to how `concatenate` provides a
uniform and N-dimensional interface for joining arrays along an existing axis.
Background
~~~~~~~~~~
Currently, users can choose between `hstack`, `vstack`, `column_stack` and
`dstack`, but none of these functions handle N-dimensional input. In my
opinion, it's also difficult to keep track of the differences between these
methods and to predict how they will handle input with different
dimensions.
In the past, my preferred approach has been either to construct the result
array explicitly and use indexing for assignment, or to use `np.array` to
stack along the first dimension and then use `transpose` (or a similar method)
to reorder dimensions if necessary. This is pretty awkward.
I brought this proposal up a few weeks ago on the numpy-discussion list:
http://mail.scipy.org/pipermail/numpy-discussion/2015-February/072199.html
I also received positive feedback on Twitter:
https://twitter.com/shoyer/status/565937244599377920
Implementation notes
~~~~~~~~~~~~~~~~~~~~
The one-line summaries for `concatenate` and `stack` have been (re)written to
mirror each other, and to make clear that the distinction between these
functions is whether they join arrays over an existing or a new axis.
In general, I've tweaked the documentation and docstrings with an eye toward
pointing users to `concatenate`/`stack`/`split` as a fundamental set of basic
array manipulation routines, and away from
`array_split`/`{h,v,d}split`/`{h,v,d,column_}stack`.
I put this implementation in `numpy.core.shape_base` alongside `hstack`/`vstack`,
but it appears that there is also a `numpy.lib.shape_base` module that contains
another larger set of functions, including `dstack`. I'm not really sure where
this belongs (or if it even matters).
Finally, it might be a good idea to write a masked array version of `stack`.
But I don't use masked arrays, so I'm not well motivated to do that.
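A minimal sketch of the intended semantics:
>>> import numpy as np
>>> a = np.array([1, 2, 3])
>>> b = np.array([4, 5, 6])
>>> np.stack([a, b]).shape          # joins along a new axis (axis 0 by default)
(2, 3)
>>> np.stack([a, b], axis=1).shape  # the new axis can go anywhere
(3, 2)
>>> np.concatenate([a, b]).shape    # joins along an existing axis
(6,)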
Removed the parentheses and included the return statement.
Tile now copies the input when it is a numpy array and all dimensions are
repeated only once.
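A quick check of the fixed behaviour (sketch):
>>> import numpy as np
>>> a = np.arange(3)
>>> b = np.tile(a, 1)   # all repetition counts are 1
>>> b[0] = 99           # modifying the result must not touch the input
>>> a
array([0, 1, 2])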
This bug was reported in Debian as http://bugs.debian.org/777172.
The rules enforced are the same as those used for scipy.
Some of those problems look like potential coding errors. In those
cases a FIXME comment was added and the offending code, usually an
unused variable, was commented out.
ENH: apply_along_axis accepts named arguments
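For example, keyword arguments are now forwarded to the applied function
(a sketch using np.percentile):
>>> import numpy as np
>>> a = np.arange(6).reshape(2, 3)
>>> np.apply_along_axis(np.percentile, 1, a, q=50)
array([1., 4.])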
Remove the misleading note about equivalence between column_stack and
np.vstack(tup).T.
Fixes #3488
Fixes the result type of empty output arrays.
The FutureWarning warns about changed behaviour. A "kludge" was
introduced into array_split, converting the result of something like:
>>> np.array_split(np.array([[1, 1]]), 2)
[array([[1, 1]]), array([], dtype=int64)]
instead of retaining the (0, 2) shape of the empty second array. A
FutureWarning is now raised when such a replacement occurs.
Closes gh-612
Run the 2to3 ws_comma fixer on *.py files. Some lines are now too long
and will need to be broken at some point. OTOH, some lines were already
too long and need to be broken at some point. Now seems as good a time
as any to do this with open PRs at a minimum.
Also minor changes in the documentation.
Following deprecations would cause problems otherwise.
DOC: Change example to demonstrate function
"a * 0.5" example might as well be new_func(a) directly, it doesn't demonstrate the purpose of apply_along_axis().
The idioms fixer makes the following replacements.
1) int is replaced by bool
2) comparison or identity of types is replaced by isinstance
3) a.sort() is replaced by sorted(a)
There were two problems that needed to be dealt with after the
application of the fixer. First, the replacement of comparison or
identity of types by isinstance was not always correct: isinstance
returns true for subtypes, whereas many of the places where the fixer
made a substitution needed to check for exact type equality. Second,
the sorted function was applied to arrays, but because sorted treats
them as iterators and constructs a sorted list from the result, that
is the wrong thing to do.
Closes #3062.
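Both pitfalls in a short sketch:
>>> import numpy as np
>>> isinstance(True, int)    # isinstance accepts subclasses such as bool
True
>>> type(True) is int        # an exact type check does not
False
>>> a = np.array([3, 1, 2])
>>> type(sorted(a))          # sorted builds a plain list, not an ndarray
<class 'list'>
>>> np.sort(a)               # the array-aware alternative
array([1, 2, 3])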
In Python 3 `map` returns an iterator while in Python 2 it returns a list.
The simple fix applied by the fixer is to enclose all instances of map
in `list(...)`. This is not needed in all cases, and even where it is
appropriate, list comprehensions may be preferred for their clarity.
Consequently, this patch attempts to use list comprehensions where it
makes sense.
When the mapped function takes two arguments another problem can arise:
in Python 3 map stops when the shortest argument list is exhausted,
while in Python 2 it stops when the longest argument list is exhausted.
Consequently the two-argument case might need special care. However, we
have been running Python 3 converted versions of numpy since 1.5 without
problems, so it is probably not something that affects us.
Closes #3068
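The typical conversions look like:
>>> # Python 3: map is lazy, so materialise it when an actual list is needed
>>> list(map(len, ['ab', 'cde']))
[2, 3]
>>> [len(s) for s in ['ab', 'cde']]   # often clearer as a comprehension
[2, 3]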
Add `print_function` to all `from __future__ import ...` statements
and use the Python 3 print function syntax everywhere.
Closes #3078.
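The pattern, in short:
>>> from __future__ import print_function
>>> print("value:", 42)   # function-call syntax works on Python 2 and 3
value: 42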
DOC: Formatting fixes using regex
Also fix other spacing and formatting mistakes.
The new import `absolute_import` is added to the `from __future__ import`
statement and the 2to3 `import` fixer is run to make the imports
compatible. There are several things that need to be dealt with to make
this work.
1) Files meant to be run as scripts run in a different environment than
files imported as part of a package, and so changes to those files need
to be skipped. The affected script files are:
* all setup.py files
* numpy/core/code_generators/generate_umath.py
* numpy/core/code_generators/generate_numpy_api.py
* numpy/core/code_generators/generate_ufunc_api.py
2) Some imported modules are not available as they are created during
the build process and consequently 2to3 is unable to handle them
correctly. Files that import those modules need a bit of extra work.
The affected files are:
* core/__init__.py,
* core/numeric.py,
* core/_internal.py,
* core/arrayprint.py,
* core/fromnumeric.py,
* numpy/__init__.py,
* lib/npyio.py,
* lib/function_base.py,
* fft/fftpack.py,
* random/__init__.py
Closes #3172
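Schematically, inside a package module the change looks like the following
(illustrative only, not an excerpt from the diff):
# before: implicit relative import, Python 2 only
# import numeric
# after: explicit relative import, valid on both Python versions
from __future__ import absolute_import
from . import numeric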
In Python 3 `range` no longer returns a list and `xrange` has been removed.
This has two consequences for code:
1) Where a list is needed, `list(range(...))` must be used.
2) `xrange` must be replaced by `range`.
Both of these changes also work in Python 2, and this patch makes both.
There are three places fixed that do not need it, but I left them in
so that the result would be `xrange` clean.
Closes #3092
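Both forms behave the same under Python 2 and Python 3:
>>> list(range(3))             # materialise when an actual list is needed
[0, 1, 2]
>>> [i * i for i in range(4)]  # plain iteration needs no list()
[0, 1, 4, 9]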
This should be harmless, as we already are division clean. However,
placement of this import takes some care. In the future a script
can be used to append new features without worry, at least until
such time as it exceeds a single line. Having that ability will
make it easier to deal with absolute imports and printing updates.
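The effect of the import, in short:
>>> from __future__ import division
>>> 3 / 2    # true division, on Python 2 as well as Python 3
1.5
>>> 3 // 2   # floor division stays available
1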
The equal and nearly-equal size requirement does not hold when passing a 1-D array of indices.
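For example, explicit split points can give pieces of quite different sizes:
>>> import numpy as np
>>> np.array_split(np.arange(8), [3, 5])   # indices, not a section count
[array([0, 1, 2]), array([3, 4]), array([5, 6, 7])]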
they are still an improvement)
of explicit imports or dependency on the local scope where the doctest is
defined.
of an explicit import.
Remove ">>>" from the bartlett plotting example since it currently requires
matplotlib.