| | | |
|---|---|---|
| author | Mike Bayer <mike_mp@zzzcomputing.com> | 2015-04-30 12:55:29 -0400 |
| committer | Mike Bayer <mike_mp@zzzcomputing.com> | 2015-04-30 12:55:29 -0400 |
| commit | adb66da06f571b19ee10720977f86d2d38c01681 (patch) | |
| tree | 4cf355bc69fcce940947a5c5f9b2d68c85282ba4 | |
| parent | da8e374f00f4f6b692f55aa7fed253832c74f826 (diff) | |
| parent | 20e3df602846bb1d8940b5138f21ef203c99bade (diff) | |
| download | sqlalchemy-adb66da06f571b19ee10720977f86d2d38c01681.tar.gz | |
Merge branch 'master' into ticket_3355
37 files changed, 950 insertions, 197 deletions
diff --git a/doc/build/changelog/changelog_09.rst b/doc/build/changelog/changelog_09.rst index 2f2f59263..778881b90 100644 --- a/doc/build/changelog/changelog_09.rst +++ b/doc/build/changelog/changelog_09.rst @@ -110,6 +110,9 @@ Compared to the existing entry for ``"type"``, it will always be a mapped entity, even if extracted from a column expression, or None if the given expression is a pure core expression. + See also :ticket:`3403` which repaired a regression in this feature + which was unreleased in 0.9.10 but was released in the 1.0 version. + .. changelog:: :version: 0.9.9 diff --git a/doc/build/changelog/changelog_10.rst b/doc/build/changelog/changelog_10.rst index 290ecd320..5fe587681 100644 --- a/doc/build/changelog/changelog_10.rst +++ b/doc/build/changelog/changelog_10.rst @@ -16,6 +16,174 @@ :start-line: 5 .. changelog:: + :version: 1.0.3 + + .. change:: + :tags: bug, orm + :tickets: 3403, 3320 + + Fixed regression from the as-yet unreleased 0.9.10 where the new addition + of ``entity`` to the :attr:`.Query.column_descriptions` accessor + would fail if the target entity was produced from a core selectable + such as a :class:`.Table` or :class:`.CTE` object. + + .. change:: + :tags: feature, sql + + Added a placeholder method :meth:`.TypeEngine.compare_against_backend` + which is now consumed by Alembic migrations as of 0.7.6. User-defined + types can implement this method to assist in the comparison of + a type against one reflected from the database. + + .. change:: + :tags: bug, orm + :tickets: 3402 + + Fixed regression within the flush process when an attribute was + set to a SQL expression for an UPDATE, and the SQL expression, when + compared to the previous value of the attribute, would produce a SQL + comparison other than ``==`` or ``!=``; the exception "Boolean value + of this clause is not defined" would be raised. The fix ensures that + the unit of work will not interpret the SQL expression in this way. + + .. 
change:: + :tags: bug, ext + :tickets: 3397 + + Fixed bug in association proxy where an any()/has() + on a relationship->scalar non-object attribute comparison would fail, + e.g. + ``filter(Parent.some_collection_to_attribute.any(Child.attr == 'foo'))`` + + .. change:: + :tags: bug, sql + :tickets: 3396 + + Fixed bug where the truncation of long labels in SQL could produce + a label that overlapped another label that is not truncated; this + is because the length threshold for truncation was greater than + the portion of the label that remains after truncation. These + two values have now been made the same: label_length - 6. + The effect here is that shorter column labels will be "truncated" + where they would not have been truncated before. + + .. change:: + :tags: bug, orm + :tickets: 3392 + + Fixed regression due to :ticket:`2992` where textual elements placed + into the :meth:`.Query.order_by` clause in conjunction with joined + eager loading would be added to the columns clause of the inner query + in such a way that they were assumed to be table-bound column names, + in the case where the joined eager load needs to wrap the query + in a subquery to accommodate a limit/offset. + + Originally, the behavior here was intentional, in that a query such + as ``query(User).order_by('name').limit(1)`` + would order by ``user.name`` even if the query was modified by + joined eager loading to be within a subquery, as ``'name'`` would + be interpreted as a symbol to be located within the FROM clauses, + in this case ``User.name``, which would then be copied into the + columns clause to ensure it was present for ORDER BY. However, the + feature failed to anticipate the case where ``order_by("name")`` refers + to a specific label name already present in the local columns clause + and not a name bound to a selectable in the FROM clause. 
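The label-truncation fix for :ticket:`3396` above can be sketched in plain Python. This is an illustrative simplification, not the actual code in ``sqlalchemy/sql/compiler.py`` (the real version also tracks per-class counters), but the threshold comparison and the ``_<hex counter>`` suffix follow the compiler hunk shown later in this diff:

```python
def truncate_label(name, label_length, counter):
    """Truncate a generated label so that the result, including the
    "_<hex counter>" disambiguating suffix, fits within label_length.

    Before the fix, truncation only kicked in when
    len(name) > label_length, so a label of, say, label_length - 2
    characters was left alone while a slightly longer label was cut
    down to label_length - 6 characters plus a suffix -- the two could
    then overlap. Comparing against label_length - 6 makes the
    threshold equal to the portion that survives truncation.
    """
    if len(name) > label_length - 6:
        return name[0:max(label_length - 6, 0)] + "_" + hex(counter)[2:]
    return name
```

With ``label_length=30``, a 40-character label is cut to 24 characters plus a suffix, and a 26-character label, which previously escaped truncation and could collide, is now truncated as well.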
+ + Beyond that, the feature also fails for deprecated cases such as + ``order_by("name desc")``, which, while it emits a + warning that :func:`.text` should be used here (note that the issue + does not impact cases where :func:`.text` is used explicitly), + still produces a different query than previously where the "name desc" + expression is copied into the columns clause inappropriately. The + resolution is such that the "joined eager loading" aspect of the + feature will skip over these so-called "label reference" expressions + when augmenting the inner columns clause, as though they were + :func:`.text` constructs already. + + .. change:: + :tags: bug, sql + :tickets: 3391 + + Fixed regression due to :ticket:`3282` where the ``tables`` collection + passed as a keyword argument to the :meth:`.DDLEvents.before_create`, + :meth:`.DDLEvents.after_create`, :meth:`.DDLEvents.before_drop`, and + :meth:`.DDLEvents.after_drop` events would no longer be a list + of tables, but instead a list of tuples which contained a second + entry with foreign keys to be added or dropped. As the ``tables`` + collection, while documented as not necessarily stable, has come + to be relied upon, this change is considered a regression. + Additionally, in some cases for "drop", this collection would + be an iterator that would cause the operation to fail if + prematurely iterated. The collection is now a list of table + objects in all cases and test coverage for the format of this + collection is now added. + + + .. change:: + :tags: bug, orm + :tickets: 3388 + + Fixed a regression regarding the :meth:`.MapperEvents.instrument_class` + event where its invocation was moved to be after the class manager's + instrumentation of the class, which is the opposite of what the + documentation for the event explicitly states. 
The rationale for the + switch was due to Declarative taking the step of setting up + the full "instrumentation manager" for a class before it was mapped + for the purpose of the new ``@declared_attr`` features + described in :ref:`feature_3150`, but the change was also made + against the classical use of :func:`.mapper` for consistency. + However, SQLSoup relies upon the instrumentation event happening + before any instrumentation under classical mapping. + The behavior is reverted in the case of both classical and declarative + mapping, the latter implemented by using a simple memoization + without using the class manager. + + .. change:: + :tags: bug, orm + :tickets: 3387 + + Fixed issue in the new :meth:`.QueryEvents.before_compile` event where + changes made to the :class:`.Query` object's collection of entities + to load within the event would render in the SQL, but would not + be reflected during the loading process. + +.. changelog:: + :version: 1.0.2 + :released: April 24, 2015 + + .. change:: + :tags: bug, sql + :tickets: 3338, 3385 + + Fixed a regression that was incorrectly fixed in 1.0.0b4 + (hence becoming two regressions); reports that + SELECT statements would GROUP BY a label name and fail were misconstrued + to mean that certain backends such as SQL Server should not be emitting + ORDER BY or GROUP BY on a simple label name at all; in fact, + we had forgotten that 0.9 was already emitting ORDER BY on a simple + label name for all backends, as described in :ref:`migration_1068`, + even though 1.0 includes a rewrite of this logic as part of + :ticket:`2992`. As for + emitting GROUP BY against a simple label, even Postgresql has + cases where it will raise an error even though the label to group + on should be apparent, so it is clear that GROUP BY should never + be rendered in this way automatically. 
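The 1.0.2 rule for :ticket:`3338` can be summed up in a small sketch (a hypothetical helper, not SQLAlchemy API): ORDER BY may render just the label name when the ``Label`` construct is also present in the columns clause, while GROUP BY always re-renders the underlying expression:

```python
def render_sort_or_group(kind, label, in_columns_clause):
    """Illustrative sketch of the 1.0.2 resolution (not the actual
    compiler logic). `label` is a (label_name, expression_sql) pair.

    ORDER BY on a simple label name is safe and widely supported when
    the labeled expression appears in the columns clause; GROUP BY on
    a bare label name can raise errors even on Postgresql, so the
    full expression is always rendered instead.
    """
    name, expr_sql = label
    if kind == "ORDER BY" and in_columns_clause:
        return "%s %s" % (kind, name)
    return "%s %s" % (kind, expr_sql)
```

So a ``Label('lower_name', lower(name))`` that is SELECTed yields ``ORDER BY lower_name`` but ``GROUP BY lower(name)``.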
+ + In 1.0.2, SQL Server, Firebird and others will again emit ORDER BY on + a simple label name when passed a + :class:`.Label` construct that is also present in the columns clause. + Additionally, no backend will emit GROUP BY against the simple label + name only when passed a :class:`.Label` construct. + + .. change:: + :tags: bug, orm, declarative + :tickets: 3383 + + Fixed regression regarding the declarative ``__declare_first__`` + and ``__declare_last__`` accessors where these would no longer be + called on the superclass of the declarative base. + +.. changelog:: :version: 1.0.1 :released: April 23, 2015 @@ -340,6 +508,9 @@ GROUP BY expressions. The flag is also turned off defensively for the Firebird and Sybase dialects. + .. note:: this resolution was incorrect, please see version 1.0.2 + for a rework of this resolution. + .. change:: :tags: feature, schema :tickets: 3341 @@ -400,7 +571,7 @@ courtesy Thomas Grainger. .. change:: - :tags: change, ext, declarative + :tags: change, orm, declarative :tickets: 3331 Loosened some restrictions that were added to ``@declared_attr`` @@ -452,7 +623,7 @@ on compatibility concerns, see :doc:`/changelog/migration_10`. .. change:: - :tags: feature, extensions + :tags: feature, ext :tickets: 3054 Added a new extension suite :mod:`sqlalchemy.ext.baked`. This @@ -527,7 +698,7 @@ continued after the error raise occurred. .. change:: - :tags: bug, ext + :tags: bug, orm, declarative :tickets: 3219, 3240 Fixed bug where using an ``__abstract__`` mixin in the middle @@ -1280,7 +1451,7 @@ all transactional status and operations. .. change:: - :tags: bug, declarative + :tags: bug, orm, declarative :tickets: 2670 A relationship set up with :class:`.declared_attr` on @@ -1293,7 +1464,7 @@ :ref:`feature_3150` .. 
change:: - :tags: feature, declarative + :tags: feature, orm, declarative :tickets: 3150 The :class:`.declared_attr` construct has newly improved diff --git a/doc/build/changelog/migration_10.rst b/doc/build/changelog/migration_10.rst index 5707e7975..e8b2161d0 100644 --- a/doc/build/changelog/migration_10.rst +++ b/doc/build/changelog/migration_10.rst @@ -1531,7 +1531,7 @@ join into a subquery as a join target on SQLite. :ticket:`3008` -.. _change_3429: +.. _change_3249: Subqueries no longer applied to uselist=False joined eager loads ---------------------------------------------------------------- diff --git a/doc/build/conf.py b/doc/build/conf.py index 03a6c4d4e..46fc94386 100644 --- a/doc/build/conf.py +++ b/doc/build/conf.py @@ -106,9 +106,9 @@ copyright = u'2007-2015, the SQLAlchemy authors and contributors' # The short X.Y version. version = "1.0" # The full version, including alpha/beta/rc tags. -release = "1.0.1" +release = "1.0.2" -release_date = "April 23, 2015" +release_date = "April 24, 2015" site_base = os.environ.get("RTD_SITE_BASE", "http://www.sqlalchemy.org") site_adapter_template = "docs_adapter.mako" diff --git a/doc/build/requirements.txt b/doc/build/requirements.txt index 3f87e68ea..41a2f28cd 100644 --- a/doc/build/requirements.txt +++ b/doc/build/requirements.txt @@ -1,3 +1,3 @@ changelog>=0.3.4 sphinx-paramlinks>=0.2.2 -git+https://bitbucket.org/zzzeek/zzzeeksphinx.git +git+https://bitbucket.org/zzzeek/zzzeeksphinx.git@rel_20150425#egg=zzzeeksphinx diff --git a/lib/sqlalchemy/__init__.py b/lib/sqlalchemy/__init__.py index 3ef97ef90..153b10a2b 100644 --- a/lib/sqlalchemy/__init__.py +++ b/lib/sqlalchemy/__init__.py @@ -120,7 +120,7 @@ from .schema import ( from .inspection import inspect from .engine import create_engine, engine_from_config -__version__ = '1.0.1' +__version__ = '1.0.3' def __go(lcls): diff --git a/lib/sqlalchemy/dialects/firebird/base.py b/lib/sqlalchemy/dialects/firebird/base.py index e407e2f25..c34829cd3 100644 --- 
a/lib/sqlalchemy/dialects/firebird/base.py +++ b/lib/sqlalchemy/dialects/firebird/base.py @@ -394,8 +394,6 @@ class FBDialect(default.DefaultDialect): requires_name_normalize = True supports_empty_insert = False - supports_simple_order_by_label = False - statement_compiler = FBCompiler ddl_compiler = FBDDLCompiler preparer = FBIdentifierPreparer diff --git a/lib/sqlalchemy/dialects/mssql/base.py b/lib/sqlalchemy/dialects/mssql/base.py index 522e59b00..b073af6af 100644 --- a/lib/sqlalchemy/dialects/mssql/base.py +++ b/lib/sqlalchemy/dialects/mssql/base.py @@ -1417,7 +1417,6 @@ class MSDialect(default.DefaultDialect): use_scope_identity = True max_identifier_length = 128 schema_name = "dbo" - supports_simple_order_by_label = False colspecs = { sqltypes.DateTime: _MSDateTime, diff --git a/lib/sqlalchemy/dialects/sybase/base.py b/lib/sqlalchemy/dialects/sybase/base.py index 1baab6db4..ae0473a3e 100644 --- a/lib/sqlalchemy/dialects/sybase/base.py +++ b/lib/sqlalchemy/dialects/sybase/base.py @@ -435,7 +435,6 @@ class SybaseDialect(default.DefaultDialect): supports_native_boolean = False supports_unicode_binds = False postfetch_lastrowid = True - supports_simple_order_by_label = False colspecs = {} ischema_names = ischema_names diff --git a/lib/sqlalchemy/ext/associationproxy.py b/lib/sqlalchemy/ext/associationproxy.py index a74141973..d837aab52 100644 --- a/lib/sqlalchemy/ext/associationproxy.py +++ b/lib/sqlalchemy/ext/associationproxy.py @@ -365,13 +365,17 @@ class AssociationProxy(interfaces.InspectionAttrInfo): operators of the underlying proxied attributes. 
""" - - if self._value_is_scalar: - value_expr = getattr( - self.target_class, self.value_attr).has(criterion, **kwargs) + if self._target_is_object: + if self._value_is_scalar: + value_expr = getattr( + self.target_class, self.value_attr).has( + criterion, **kwargs) + else: + value_expr = getattr( + self.target_class, self.value_attr).any( + criterion, **kwargs) else: - value_expr = getattr( - self.target_class, self.value_attr).any(criterion, **kwargs) + value_expr = criterion # check _value_is_scalar here, otherwise # we're scalar->scalar - call .any() so that diff --git a/lib/sqlalchemy/ext/automap.py b/lib/sqlalchemy/ext/automap.py index ca550ded6..448d8492e 100644 --- a/lib/sqlalchemy/ext/automap.py +++ b/lib/sqlalchemy/ext/automap.py @@ -67,7 +67,7 @@ asking it to reflect the schema and produce mappings:: Above, calling :meth:`.AutomapBase.prepare` while passing along the :paramref:`.AutomapBase.prepare.reflect` parameter indicates that the :meth:`.MetaData.reflect` method will be called on this declarative base -classes' :class:`.MetaData` collection; then, each viable +classes' :class:`.MetaData` collection; then, each **viable** :class:`.Table` within the :class:`.MetaData` will get a new mapped class generated automatically. The :class:`.ForeignKeyConstraint` objects which link the various tables together will be used to produce new, bidirectional @@ -76,6 +76,12 @@ follow along a default naming scheme that we can customize. At this point, our basic mapping consisting of related ``User`` and ``Address`` classes is ready to use in the traditional way. +.. note:: By **viable**, we mean that for a table to be mapped, it must + specify a primary key. Additionally, if the table is detected as being + a pure association table between two other tables, it will not be directly + mapped and will instead be configured as a many-to-many table between + the mappings for the two referring tables. 
+ Generating Mappings from an Existing MetaData ============================================= diff --git a/lib/sqlalchemy/ext/declarative/api.py b/lib/sqlalchemy/ext/declarative/api.py index 713ea0aba..3d46bd4cb 100644 --- a/lib/sqlalchemy/ext/declarative/api.py +++ b/lib/sqlalchemy/ext/declarative/api.py @@ -163,21 +163,16 @@ class declared_attr(interfaces._MappedAttribute, property): self._cascading = cascading def __get__(desc, self, cls): - # use the ClassManager for memoization of values. This is better than - # adding yet another attribute onto the class, or using weakrefs - # here which are slow and take up memory. It also allows us to - # warn for non-mapped use of declared_attr. - - manager = attributes.manager_of_class(cls) - if manager is None: - util.warn( - "Unmanaged access of declarative attribute %s from " - "non-mapped class %s" % - (desc.fget.__name__, cls.__name__)) + reg = cls.__dict__.get('_sa_declared_attr_reg', None) + if reg is None: + manager = attributes.manager_of_class(cls) + if manager is None: + util.warn( + "Unmanaged access of declarative attribute %s from " + "non-mapped class %s" % + (desc.fget.__name__, cls.__name__)) return desc.fget(cls) - reg = manager.info.get('declared_attr_reg', None) - if reg is None: return desc.fget(cls) elif desc in reg: diff --git a/lib/sqlalchemy/ext/declarative/base.py b/lib/sqlalchemy/ext/declarative/base.py index 7d4020b24..57eb54f63 100644 --- a/lib/sqlalchemy/ext/declarative/base.py +++ b/lib/sqlalchemy/ext/declarative/base.py @@ -39,7 +39,7 @@ def _resolve_for_abstract(cls): if cls is object: return None - if _get_immediate_cls_attr(cls, '__abstract__'): + if _get_immediate_cls_attr(cls, '__abstract__', strict=True): for sup in cls.__bases__: sup = _resolve_for_abstract(sup) if sup is not None: @@ -50,7 +50,7 @@ def _resolve_for_abstract(cls): return cls -def _get_immediate_cls_attr(cls, attrname): +def _get_immediate_cls_attr(cls, attrname, strict=False): """return an attribute of the class that 
is either present directly on the class, e.g. not on a superclass, or is from a superclass but this superclass is a mixin, that is, not a descendant of @@ -66,11 +66,12 @@ def _get_immediate_cls_attr(cls, attrname): for base in cls.__mro__: _is_declarative_inherits = hasattr(base, '_decl_class_registry') - if attrname in base.__dict__: - value = getattr(base, attrname) - if (base is cls or - (base in cls.__bases__ and not _is_declarative_inherits)): - return value + if attrname in base.__dict__ and ( + base is cls or + ((base in cls.__bases__ if strict else True) + and not _is_declarative_inherits) + ): + return getattr(base, attrname) else: return None @@ -81,7 +82,7 @@ def _as_declarative(cls, classname, dict_): from .api import declared_attr declarative_props = (declared_attr, util.classproperty) - if _get_immediate_cls_attr(cls, '__abstract__'): + if _get_immediate_cls_attr(cls, '__abstract__', strict=True): return _MapperConfig.setup_mapping(cls, classname, dict_) @@ -92,7 +93,7 @@ class _MapperConfig(object): @classmethod def setup_mapping(cls, cls_, classname, dict_): defer_map = _get_immediate_cls_attr( - cls_, '_sa_decl_prepare_nocascade') or \ + cls_, '_sa_decl_prepare_nocascade', strict=True) or \ hasattr(cls_, '_sa_decl_prepare') if defer_map: @@ -114,10 +115,10 @@ class _MapperConfig(object): self.column_copies = {} self._setup_declared_events() - # register up front, so that @declared_attr can memoize - # function evaluations in .info - manager = instrumentation.register_class(self.cls) - manager.info['declared_attr_reg'] = {} + # temporary registry. While early 1.0 versions + # set up the ClassManager here, by API contract + # we can't do that until there's a mapper. 
+ self.cls._sa_declared_attr_reg = {} self._scan_attributes() @@ -158,7 +159,8 @@ class _MapperConfig(object): for base in cls.__mro__: class_mapped = base is not cls and \ _declared_mapping_info(base) is not None and \ - not _get_immediate_cls_attr(base, '_sa_decl_prepare_nocascade') + not _get_immediate_cls_attr( + base, '_sa_decl_prepare_nocascade', strict=True) if not class_mapped and base is not cls: self._produce_column_copies(base) @@ -412,7 +414,7 @@ class _MapperConfig(object): continue if _declared_mapping_info(c) is not None and \ not _get_immediate_cls_attr( - c, '_sa_decl_prepare_nocascade'): + c, '_sa_decl_prepare_nocascade', strict=True): self.inherits = c break else: @@ -527,7 +529,7 @@ class _MapperConfig(object): self.local_table, **self.mapper_args ) - del mp_.class_manager.info['declared_attr_reg'] + del self.cls._sa_declared_attr_reg return mp_ diff --git a/lib/sqlalchemy/orm/mapper.py b/lib/sqlalchemy/orm/mapper.py index 924130874..468846d40 100644 --- a/lib/sqlalchemy/orm/mapper.py +++ b/lib/sqlalchemy/orm/mapper.py @@ -1096,8 +1096,6 @@ class Mapper(InspectionAttr): """ - # when using declarative as of 1.0, the register_class has - # already happened from within declarative. manager = attributes.manager_of_class(self.class_) if self.non_primary: @@ -1120,14 +1118,20 @@ class Mapper(InspectionAttr): "create a non primary Mapper. clear_mappers() will " "remove *all* current mappers from all classes." % self.class_) - - if manager is None: - manager = instrumentation.register_class(self.class_) + # else: + # a ClassManager may already exist as + # ClassManager.instrument_attribute() creates + # new managers for each subclass if they don't yet exist. 
_mapper_registry[self] = True + # note: this *must be called before instrumentation.register_class* + # to maintain the documented behavior of instrument_class self.dispatch.instrument_class(self, self.class_) + if manager is None: + manager = instrumentation.register_class(self.class_) + self.class_manager = manager manager.mapper = self diff --git a/lib/sqlalchemy/orm/persistence.py b/lib/sqlalchemy/orm/persistence.py index 34c37dab6..c3974ed6d 100644 --- a/lib/sqlalchemy/orm/persistence.py +++ b/lib/sqlalchemy/orm/persistence.py @@ -452,12 +452,11 @@ def _collect_update_commands( value = state_dict[propkey] col = propkey_to_col[propkey] - if not state.manager[propkey].impl.is_equal( + if isinstance(value, sql.ClauseElement): + value_params[col] = value + elif not state.manager[propkey].impl.is_equal( value, state.committed_state[propkey]): - if isinstance(value, sql.ClauseElement): - value_params[col] = value - else: - params[col.key] = value + params[col.key] = value if update_version_id is not None and \ mapper.version_id_col in mapper._cols_by_table[table]: diff --git a/lib/sqlalchemy/orm/query.py b/lib/sqlalchemy/orm/query.py index 2bd68899b..f4b04b078 100644 --- a/lib/sqlalchemy/orm/query.py +++ b/lib/sqlalchemy/orm/query.py @@ -2528,7 +2528,7 @@ class Query(object): close_with_result=True) result = conn.execute(querycontext.statement, self._params) - return loading.instances(self, result, querycontext) + return loading.instances(querycontext.query, result, querycontext) @property def column_descriptions(self): @@ -2577,6 +2577,7 @@ class Query(object): 'expr': ent.expr, 'entity': ent.entity_zero.entity if ent.entity_zero is not None + and not inspect(ent.entity_zero).is_selectable else None } for ent in self._entities @@ -3620,7 +3621,6 @@ class _ColumnEntity(_QueryEntity): if 'parententity' in elem._annotations and actual_froms.intersection(elem._from_objects) ]) - if self.entities: self.entity_zero = self.entities[0] elif self.namespace is not None: diff 
--git a/lib/sqlalchemy/orm/relationships.py b/lib/sqlalchemy/orm/relationships.py index d1c5e5e51..da0730f46 100644 --- a/lib/sqlalchemy/orm/relationships.py +++ b/lib/sqlalchemy/orm/relationships.py @@ -195,7 +195,7 @@ class RelationshipProperty(StrategizedProperty): The :paramref:`~.relationship.secondary` keyword argument is typically applied in the case where the intermediary :class:`.Table` - is not otherwise exprssed in any direct class mapping. If the + is not otherwise expressed in any direct class mapping. If the "secondary" table is also explicitly mapped elsewhere (e.g. as in :ref:`association_pattern`), one should consider applying the :paramref:`~.relationship.viewonly` flag so that this diff --git a/lib/sqlalchemy/sql/compiler.py b/lib/sqlalchemy/sql/compiler.py index 91b677a0e..c9c7fd2a1 100644 --- a/lib/sqlalchemy/sql/compiler.py +++ b/lib/sqlalchemy/sql/compiler.py @@ -1133,7 +1133,7 @@ class SQLCompiler(Compiled): anonname = name.apply_map(self.anon_map) - if len(anonname) > self.label_length: + if len(anonname) > self.label_length - 6: counter = self.truncated_names.get(ident_class, 1) truncname = anonname[0:max(self.label_length - 6, 0)] + \ "_" + hex(counter)[2:] diff --git a/lib/sqlalchemy/sql/ddl.py b/lib/sqlalchemy/sql/ddl.py index a0841b13c..71018f132 100644 --- a/lib/sqlalchemy/sql/ddl.py +++ b/lib/sqlalchemy/sql/ddl.py @@ -711,8 +711,11 @@ class SchemaGenerator(DDLBase): seq_coll = [s for s in metadata._sequences.values() if s.column is None and self._can_create_sequence(s)] + event_collection = [ + t for (t, fks) in collection if t is not None + ] metadata.dispatch.before_create(metadata, self.connection, - tables=collection, + tables=event_collection, checkfirst=self.checkfirst, _ddl_runner=self) @@ -730,7 +733,7 @@ class SchemaGenerator(DDLBase): self.traverse_single(fkc) metadata.dispatch.after_create(metadata, self.connection, - tables=collection, + tables=event_collection, checkfirst=self.checkfirst, _ddl_runner=self) @@ -804,7 
+807,7 @@ class SchemaDropper(DDLBase): try: unsorted_tables = [t for t in tables if self._can_drop_table(t)] - collection = reversed( + collection = list(reversed( sort_tables_and_constraints( unsorted_tables, filter_fn=lambda constraint: False @@ -812,7 +815,7 @@ class SchemaDropper(DDLBase): or constraint.name is None else None ) - ) + )) except exc.CircularDependencyError as err2: if not self.dialect.supports_alter: util.warn( @@ -854,8 +857,12 @@ class SchemaDropper(DDLBase): if s.column is None and self._can_drop_sequence(s) ] + event_collection = [ + t for (t, fks) in collection if t is not None + ] + metadata.dispatch.before_drop( - metadata, self.connection, tables=collection, + metadata, self.connection, tables=event_collection, checkfirst=self.checkfirst, _ddl_runner=self) for table, fkcs in collection: @@ -870,7 +877,7 @@ class SchemaDropper(DDLBase): self.traverse_single(seq, drop_ok=True) metadata.dispatch.after_drop( - metadata, self.connection, tables=collection, + metadata, self.connection, tables=event_collection, checkfirst=self.checkfirst, _ddl_runner=self) def _can_drop_table(self, table): diff --git a/lib/sqlalchemy/sql/elements.py b/lib/sqlalchemy/sql/elements.py index 6ee4053a7..a178ed99a 100644 --- a/lib/sqlalchemy/sql/elements.py +++ b/lib/sqlalchemy/sql/elements.py @@ -3724,6 +3724,16 @@ def _literal_as_label_reference(element): elif hasattr(element, '__clause_element__'): element = element.__clause_element__() + return _literal_as_text(element) + + +def _literal_and_labels_as_label_reference(element): + if isinstance(element, util.string_types): + return _textual_label_reference(element) + + elif hasattr(element, '__clause_element__'): + element = element.__clause_element__() + if isinstance(element, ColumnElement) and \ element._order_by_label_element is not None: return _label_reference(element) diff --git a/lib/sqlalchemy/sql/selectable.py b/lib/sqlalchemy/sql/selectable.py index 7d8c885ae..245c54817 100644 --- 
a/lib/sqlalchemy/sql/selectable.py +++ b/lib/sqlalchemy/sql/selectable.py @@ -16,7 +16,7 @@ from .elements import _clone, \ _literal_as_text, _interpret_as_column_or_from, _expand_cloned,\ _select_iterables, _anonymous_label, _clause_element_as_expr,\ _cloned_intersection, _cloned_difference, True_, \ - _literal_as_label_reference + _literal_as_label_reference, _literal_and_labels_as_label_reference from .base import Immutable, Executable, _generative, \ ColumnCollection, ColumnSet, _from_objects, Generative from . import type_api @@ -1723,7 +1723,7 @@ class GenerativeSelect(SelectBase): if order_by is not None: self._order_by_clause = ClauseList( *util.to_list(order_by), - _literal_as_text=_literal_as_label_reference) + _literal_as_text=_literal_and_labels_as_label_reference) if group_by is not None: self._group_by_clause = ClauseList( *util.to_list(group_by), @@ -1912,7 +1912,8 @@ class GenerativeSelect(SelectBase): if getattr(self, '_order_by_clause', None) is not None: clauses = list(self._order_by_clause) + list(clauses) self._order_by_clause = ClauseList( - *clauses, _literal_as_text=_literal_as_label_reference) + *clauses, + _literal_as_text=_literal_and_labels_as_label_reference) def append_group_by(self, *clauses): """Append the given GROUP BY criterion applied to this selectable. diff --git a/lib/sqlalchemy/sql/type_api.py b/lib/sqlalchemy/sql/type_api.py index 4660850bd..a55eed981 100644 --- a/lib/sqlalchemy/sql/type_api.py +++ b/lib/sqlalchemy/sql/type_api.py @@ -128,6 +128,33 @@ class TypeEngine(Visitable): """ + def compare_against_backend(self, dialect, conn_type): + """Compare this type against the given backend type. + + This function is currently not implemented for SQLAlchemy + types, and for all built in types will return ``None``. However, + it can be implemented by a user-defined type + where it can be consumed by schema comparison tools such as + Alembic autogenerate. 
+ + A future release of SQLAlchemy will potentially implement this method + for builtin types as well. + + The function should return True if this type is equivalent to the + given type; the type is typically reflected from the database + and so should be database-specific. The dialect in use is also + passed. It can also return False to assert that the type is + not equivalent. + + :param dialect: a :class:`.Dialect` that is involved in the comparison. + + :param conn_type: the type object reflected from the backend. + + .. versionadded:: 1.0.3 + + """ + return None + def copy_value(self, value): return value diff --git a/lib/sqlalchemy/sql/util.py b/lib/sqlalchemy/sql/util.py index bec5b5824..8f502fc86 100644 --- a/lib/sqlalchemy/sql/util.py +++ b/lib/sqlalchemy/sql/util.py @@ -16,7 +16,8 @@ from itertools import chain from collections import deque from .elements import BindParameter, ColumnClause, ColumnElement, \ - Null, UnaryExpression, literal_column, Label, _label_reference + Null, UnaryExpression, literal_column, Label, _label_reference, \ + _textual_label_reference from .selectable import ScalarSelect, Join, FromClause, FromGrouping from .schema import Column @@ -163,6 +164,8 @@ def unwrap_order_by(clause): ): if isinstance(t, _label_reference): t = t.element + if isinstance(t, (_textual_label_reference)): + continue cols.add(t) else: for c in t.get_children(): diff --git a/lib/sqlalchemy/testing/__init__.py b/lib/sqlalchemy/testing/__init__.py index bf83e9673..adfbe85e3 100644 --- a/lib/sqlalchemy/testing/__init__.py +++ b/lib/sqlalchemy/testing/__init__.py @@ -24,7 +24,8 @@ from .assertions import emits_warning, emits_warning_on, uses_deprecated, \ AssertsExecutionResults, expect_deprecated, expect_warnings from .util import run_as_contextmanager, rowset, fail, \ - provide_metadata, adict, force_drop_names + provide_metadata, adict, force_drop_names, \ + teardown_events crashes = skip diff --git a/lib/sqlalchemy/testing/mock.py 
b/lib/sqlalchemy/testing/mock.py index be83693cc..c836bb407 100644 --- a/lib/sqlalchemy/testing/mock.py +++ b/lib/sqlalchemy/testing/mock.py @@ -11,10 +11,10 @@ from __future__ import absolute_import from ..util import py33 if py33: - from unittest.mock import MagicMock, Mock, call, patch + from unittest.mock import MagicMock, Mock, call, patch, ANY else: try: - from mock import MagicMock, Mock, call, patch + from mock import MagicMock, Mock, call, patch, ANY except ImportError: raise ImportError( "SQLAlchemy's test suite requires the " diff --git a/lib/sqlalchemy/testing/util.py b/lib/sqlalchemy/testing/util.py index 1842e58a5..e9437948a 100644 --- a/lib/sqlalchemy/testing/util.py +++ b/lib/sqlalchemy/testing/util.py @@ -267,3 +267,14 @@ def drop_all_tables(engine, inspector, schema=None, include_names=None): ForeignKeyConstraint( [tb.c.x], [tb.c.y], name=fkc) )) + + +def teardown_events(event_cls): + @decorator + def decorate(fn, *arg, **kw): + try: + return fn(*arg, **kw) + finally: + event_cls._clear() + return decorate + diff --git a/lib/sqlalchemy/util/_collections.py b/lib/sqlalchemy/util/_collections.py index 4fb12d71b..db2c21949 100644 --- a/lib/sqlalchemy/util/_collections.py +++ b/lib/sqlalchemy/util/_collections.py @@ -743,15 +743,16 @@ _property_getters = PopulateDict( def unique_list(seq, hashfunc=None): - seen = {} + seen = set() + seen_add = seen.add if not hashfunc: return [x for x in seq if x not in seen - and not seen.__setitem__(x, True)] + and not seen_add(x)] else: return [x for x in seq if hashfunc(x) not in seen - and not seen.__setitem__(hashfunc(x), True)] + and not seen_add(hashfunc(x))] class UniqueAppender(object): diff --git a/test/engine/test_ddlevents.py b/test/engine/test_ddlevents.py index 18300179c..8beb255eb 100644 --- a/test/engine/test_ddlevents.py +++ b/test/engine/test_ddlevents.py @@ -11,38 +11,10 @@ from sqlalchemy import testing from sqlalchemy.testing import engines from sqlalchemy.testing import AssertsCompiledSQL, eq_ 
from sqlalchemy.testing import fixtures +from sqlalchemy.testing import mock class DDLEventTest(fixtures.TestBase): - class Canary(object): - def __init__(self, schema_item, bind): - self.state = None - self.schema_item = schema_item - self.bind = bind - - def before_create(self, schema_item, bind, **kw): - assert self.state is None - assert schema_item is self.schema_item - assert bind is self.bind - self.state = 'before-create' - - def after_create(self, schema_item, bind, **kw): - assert self.state in ('before-create', 'skipped') - assert schema_item is self.schema_item - assert bind is self.bind - self.state = 'after-create' - - def before_drop(self, schema_item, bind, **kw): - assert self.state is None - assert schema_item is self.schema_item - assert bind is self.bind - self.state = 'before-drop' - - def after_drop(self, schema_item, bind, **kw): - assert self.state in ('before-drop', 'skipped') - assert schema_item is self.schema_item - assert bind is self.bind - self.state = 'after-drop' def setup(self): self.bind = engines.mock_engine() @@ -51,128 +23,276 @@ class DDLEventTest(fixtures.TestBase): def test_table_create_before(self): table, bind = self.table, self.bind - canary = self.Canary(table, bind) + canary = mock.Mock() event.listen(table, 'before_create', canary.before_create) table.create(bind) - assert canary.state == 'before-create' table.drop(bind) - assert canary.state == 'before-create' + eq_( + canary.mock_calls, + [ + mock.call.before_create( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY) + ] + ) def test_table_create_after(self): table, bind = self.table, self.bind - canary = self.Canary(table, bind) + canary = mock.Mock() event.listen(table, 'after_create', canary.after_create) - canary.state = 'skipped' table.create(bind) - assert canary.state == 'after-create' table.drop(bind) - assert canary.state == 'after-create' + eq_( + canary.mock_calls, + [ + mock.call.after_create( + table, 
self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY) + ] + ) def test_table_create_both(self): table, bind = self.table, self.bind - canary = self.Canary(table, bind) + canary = mock.Mock() event.listen(table, 'before_create', canary.before_create) event.listen(table, 'after_create', canary.after_create) table.create(bind) - assert canary.state == 'after-create' table.drop(bind) - assert canary.state == 'after-create' + eq_( + canary.mock_calls, + [ + mock.call.before_create( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY), + mock.call.after_create( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY) + ] + ) def test_table_drop_before(self): table, bind = self.table, self.bind - canary = self.Canary(table, bind) + canary = mock.Mock() event.listen(table, 'before_drop', canary.before_drop) table.create(bind) - assert canary.state is None table.drop(bind) - assert canary.state == 'before-drop' + eq_( + canary.mock_calls, + [ + mock.call.before_drop( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY), + ] + ) def test_table_drop_after(self): table, bind = self.table, self.bind - canary = self.Canary(table, bind) + canary = mock.Mock() event.listen(table, 'after_drop', canary.after_drop) table.create(bind) - assert canary.state is None canary.state = 'skipped' table.drop(bind) - assert canary.state == 'after-drop' + eq_( + canary.mock_calls, + [ + mock.call.after_drop( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY), + ] + ) def test_table_drop_both(self): table, bind = self.table, self.bind - canary = self.Canary(table, bind) + canary = mock.Mock() event.listen(table, 'before_drop', canary.before_drop) event.listen(table, 'after_drop', canary.after_drop) table.create(bind) - assert canary.state is None table.drop(bind) - assert canary.state == 'after-drop' + 
eq_( + canary.mock_calls, + [ + mock.call.before_drop( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY), + mock.call.after_drop( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY), + ] + ) def test_table_all(self): table, bind = self.table, self.bind - canary = self.Canary(table, bind) + canary = mock.Mock() event.listen(table, 'before_create', canary.before_create) event.listen(table, 'after_create', canary.after_create) event.listen(table, 'before_drop', canary.before_drop) event.listen(table, 'after_drop', canary.after_drop) - assert canary.state is None table.create(bind) - assert canary.state == 'after-create' - canary.state = None table.drop(bind) - assert canary.state == 'after-drop' + eq_( + canary.mock_calls, + [ + mock.call.before_create( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY), + mock.call.after_create( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY), + mock.call.before_drop( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY), + mock.call.after_drop( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY), + ] + ) - def test_table_create_before(self): + def test_metadata_create_before(self): metadata, bind = self.metadata, self.bind - canary = self.Canary(metadata, bind) + canary = mock.Mock() event.listen(metadata, 'before_create', canary.before_create) metadata.create_all(bind) - assert canary.state == 'before-create' metadata.drop_all(bind) - assert canary.state == 'before-create' + eq_( + canary.mock_calls, + [ + mock.call.before_create( + # checkfirst is False because of the MockConnection + # used in the current testing strategy. 
+ metadata, self.bind, checkfirst=False, + tables=list(metadata.tables.values()), + _ddl_runner=mock.ANY), + ] + ) def test_metadata_create_after(self): metadata, bind = self.metadata, self.bind - canary = self.Canary(metadata, bind) + canary = mock.Mock() event.listen(metadata, 'after_create', canary.after_create) - canary.state = 'skipped' metadata.create_all(bind) - assert canary.state == 'after-create' metadata.drop_all(bind) - assert canary.state == 'after-create' + eq_( + canary.mock_calls, + [ + mock.call.after_create( + metadata, self.bind, checkfirst=False, + tables=list(metadata.tables.values()), + _ddl_runner=mock.ANY), + ] + ) def test_metadata_create_both(self): metadata, bind = self.metadata, self.bind - canary = self.Canary(metadata, bind) + canary = mock.Mock() event.listen(metadata, 'before_create', canary.before_create) event.listen(metadata, 'after_create', canary.after_create) metadata.create_all(bind) - assert canary.state == 'after-create' metadata.drop_all(bind) - assert canary.state == 'after-create' + eq_( + canary.mock_calls, + [ + mock.call.before_create( + metadata, self.bind, checkfirst=False, + tables=list(metadata.tables.values()), + _ddl_runner=mock.ANY), + mock.call.after_create( + metadata, self.bind, checkfirst=False, + tables=list(metadata.tables.values()), + _ddl_runner=mock.ANY), + ] + ) + + def test_metadata_drop_before(self): + metadata, bind = self.metadata, self.bind + canary = mock.Mock() + event.listen(metadata, 'before_drop', canary.before_drop) + + metadata.create_all(bind) + metadata.drop_all(bind) + eq_( + canary.mock_calls, + [ + mock.call.before_drop( + metadata, self.bind, checkfirst=False, + tables=list(metadata.tables.values()), + _ddl_runner=mock.ANY), + ] + ) + + def test_metadata_drop_after(self): + metadata, bind = self.metadata, self.bind + canary = mock.Mock() + event.listen(metadata, 'after_drop', canary.after_drop) + + metadata.create_all(bind) + metadata.drop_all(bind) + eq_( + canary.mock_calls, + [ + 
mock.call.after_drop( + metadata, self.bind, checkfirst=False, + tables=list(metadata.tables.values()), + _ddl_runner=mock.ANY), + ] + ) + + def test_metadata_drop_both(self): + metadata, bind = self.metadata, self.bind + canary = mock.Mock() + + event.listen(metadata, 'before_drop', canary.before_drop) + event.listen(metadata, 'after_drop', canary.after_drop) + + metadata.create_all(bind) + metadata.drop_all(bind) + eq_( + canary.mock_calls, + [ + mock.call.before_drop( + metadata, self.bind, checkfirst=False, + tables=list(metadata.tables.values()), + _ddl_runner=mock.ANY), + mock.call.after_drop( + metadata, self.bind, checkfirst=False, + tables=list(metadata.tables.values()), + _ddl_runner=mock.ANY), + ] + ) def test_metadata_table_isolation(self): - metadata, table, bind = self.metadata, self.table, self.bind - table_canary = self.Canary(table, bind) + metadata, table = self.metadata, self.table + table_canary = mock.Mock() + metadata_canary = mock.Mock() event.listen(table, 'before_create', table_canary.before_create) - metadata_canary = self.Canary(metadata, bind) event.listen(metadata, 'before_create', metadata_canary.before_create) self.table.create(self.bind) - assert metadata_canary.state == None + eq_( + table_canary.mock_calls, + [ + mock.call.before_create( + table, self.bind, checkfirst=False, + _ddl_runner=mock.ANY, _is_metadata_operation=mock.ANY), + ] + ) + eq_( + metadata_canary.mock_calls, + [] + ) def test_append_listener(self): metadata, table, bind = self.metadata, self.table, self.bind diff --git a/test/ext/declarative/test_basic.py b/test/ext/declarative/test_basic.py index 3fac39cac..ab0de801c 100644 --- a/test/ext/declarative/test_basic.py +++ b/test/ext/declarative/test_basic.py @@ -13,7 +13,10 @@ from sqlalchemy.orm import relationship, create_session, class_mapper, \ column_property, composite, Session, properties from sqlalchemy.util import with_metaclass from sqlalchemy.ext.declarative import declared_attr, synonym_for -from 
sqlalchemy.testing import fixtures +from sqlalchemy.testing import fixtures, mock +from sqlalchemy.orm.events import MapperEvents +from sqlalchemy.orm import mapper +from sqlalchemy import event Base = None @@ -1671,6 +1674,32 @@ class DeclarativeTest(DeclarativeTestBase): )) ) + @testing.teardown_events(MapperEvents) + def test_instrument_class_before_instrumentation(self): + # test #3388 + + canary = mock.Mock() + + @event.listens_for(mapper, "instrument_class") + def instrument_class(mp, cls): + canary.instrument_class(mp, cls) + + @event.listens_for(object, "class_instrument") + def class_instrument(cls): + canary.class_instrument(cls) + + class Test(Base): + __tablename__ = 'test' + id = Column(Integer, primary_key=True) + # MARKMARK + eq_( + canary.mock_calls, + [ + mock.call.instrument_class(Test.__mapper__, Test), + mock.call.class_instrument(Test) + ] + ) + def _produce_test(inline, stringbased): diff --git a/test/ext/declarative/test_inheritance.py b/test/ext/declarative/test_inheritance.py index 2ecee99fd..3e6980190 100644 --- a/test/ext/declarative/test_inheritance.py +++ b/test/ext/declarative/test_inheritance.py @@ -1451,4 +1451,5 @@ class ConcreteExtensionConfigTest( "actual_documents.send_method AS send_method, " "actual_documents.id AS id, 'actual' AS type " "FROM actual_documents) AS pjoin" - )
\ No newline at end of file + ) + diff --git a/test/ext/declarative/test_mixin.py b/test/ext/declarative/test_mixin.py index 45b881671..b9e40421c 100644 --- a/test/ext/declarative/test_mixin.py +++ b/test/ext/declarative/test_mixin.py @@ -9,7 +9,8 @@ from sqlalchemy.orm import relationship, create_session, class_mapper, \ configure_mappers, clear_mappers, \ deferred, column_property, Session, base as orm_base from sqlalchemy.util import classproperty -from sqlalchemy.ext.declarative import declared_attr +from sqlalchemy.ext.declarative import declared_attr, declarative_base +from sqlalchemy.orm import events as orm_events from sqlalchemy.testing import fixtures, mock from sqlalchemy.testing.util import gc_collect @@ -438,6 +439,90 @@ class DeclarativeMixinTest(DeclarativeTestBase): eq_(MyModel.__table__.kwargs, {'mysql_engine': 'InnoDB'}) + @testing.teardown_events(orm_events.MapperEvents) + def test_declare_first_mixin(self): + canary = mock.Mock() + + class MyMixin(object): + @classmethod + def __declare_first__(cls): + canary.declare_first__(cls) + + @classmethod + def __declare_last__(cls): + canary.declare_last__(cls) + + class MyModel(Base, MyMixin): + __tablename__ = 'test' + id = Column(Integer, primary_key=True) + + configure_mappers() + + eq_( + canary.mock_calls, + [ + mock.call.declare_first__(MyModel), + mock.call.declare_last__(MyModel), + ] + ) + + @testing.teardown_events(orm_events.MapperEvents) + def test_declare_first_base(self): + canary = mock.Mock() + + class MyMixin(object): + @classmethod + def __declare_first__(cls): + canary.declare_first__(cls) + + @classmethod + def __declare_last__(cls): + canary.declare_last__(cls) + + class Base(MyMixin): + pass + Base = declarative_base(cls=Base) + + class MyModel(Base): + __tablename__ = 'test' + id = Column(Integer, primary_key=True) + + configure_mappers() + + eq_( + canary.mock_calls, + [ + mock.call.declare_first__(MyModel), + mock.call.declare_last__(MyModel), + ] + ) + + 
@testing.teardown_events(orm_events.MapperEvents) + def test_declare_first_direct(self): + canary = mock.Mock() + + class MyOtherModel(Base): + __tablename__ = 'test2' + id = Column(Integer, primary_key=True) + + @classmethod + def __declare_first__(cls): + canary.declare_first__(cls) + + @classmethod + def __declare_last__(cls): + canary.declare_last__(cls) + + configure_mappers() + + eq_( + canary.mock_calls, + [ + mock.call.declare_first__(MyOtherModel), + mock.call.declare_last__(MyOtherModel) + ] + ) + def test_mapper_args_declared_attr(self): class ComputedMapperArgs: diff --git a/test/ext/test_associationproxy.py b/test/ext/test_associationproxy.py index 34c1a8322..8fb335b06 100644 --- a/test/ext/test_associationproxy.py +++ b/test/ext/test_associationproxy.py @@ -1089,7 +1089,8 @@ class ComparatorTest(fixtures.MappedTest, AssertsCompiledSQL): def define_tables(cls, metadata): Table('userkeywords', metadata, Column('keyword_id', Integer, ForeignKey('keywords.id'), primary_key=True), - Column('user_id', Integer, ForeignKey('users.id')) + Column('user_id', Integer, ForeignKey('users.id')), + Column('value', String(50)) ) Table('users', metadata, Column('id', Integer, @@ -1128,6 +1129,9 @@ class ComparatorTest(fixtures.MappedTest, AssertsCompiledSQL): # nonuselist singular_value = association_proxy('singular', 'value') + # o2m -> scalar + singular_collection = association_proxy('user_keywords', 'value') + class Keyword(cls.Comparable): def __init__(self, keyword): self.keyword = keyword @@ -1195,8 +1199,9 @@ class ComparatorTest(fixtures.MappedTest, AssertsCompiledSQL): for jj in words[(ii % len(words)):((ii + 3) % len(words))]: k = Keyword(jj) user.keywords.append(k) - if ii % 3 == None: + if ii % 2 == 0: user.singular.keywords.append(k) + user.user_keywords[-1].value = "singular%d" % ii orphan = Keyword('orphan') orphan.user_keyword = UserKeyword(keyword=orphan, user=None) @@ -1213,6 +1218,27 @@ class ComparatorTest(fixtures.MappedTest, AssertsCompiledSQL): 
def _equivalent(self, q_proxy, q_direct): eq_(q_proxy.all(), q_direct.all()) + def test_filter_any_criterion_ul_scalar(self): + UserKeyword, User = self.classes.UserKeyword, self.classes.User + + q1 = self.session.query(User).filter( + User.singular_collection.any(UserKeyword.value == 'singular8')) + self.assert_compile( + q1, + "SELECT users.id AS users_id, users.name AS users_name, " + "users.singular_id AS users_singular_id " + "FROM users " + "WHERE EXISTS (SELECT 1 " + "FROM userkeywords " + "WHERE users.id = userkeywords.user_id AND " + "userkeywords.value = :value_1)", + checkparams={'value_1': 'singular8'} + ) + + q2 = self.session.query(User).filter( + User.user_keywords.any(UserKeyword.value == 'singular8')) + self._equivalent(q1, q2) + def test_filter_any_kwarg_ul_nul(self): UserKeyword, User = self.classes.UserKeyword, self.classes.User diff --git a/test/orm/test_events.py b/test/orm/test_events.py index 179f914fc..51a0bb3a2 100644 --- a/test/orm/test_events.py +++ b/test/orm/test_events.py @@ -189,13 +189,13 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): ] ) - def test_merge(self): users, User = self.tables.users, self.classes.User mapper(User, users) canary = [] + def load(obj, ctx): canary.append('load') event.listen(mapper, 'load', load) @@ -212,6 +212,7 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): s.query(User).order_by(User.id).first() eq_(canary, ['load', 'load', 'load']) + def test_inheritance(self): users, addresses, User = (self.tables.users, self.tables.addresses, @@ -385,6 +386,43 @@ class MapperEventsTest(_RemoveListeners, _fixtures.FixtureTest): mapper(Address, addresses) eq_(canary, [User, Address]) + def test_instrument_class_precedes_class_instrumentation(self): + users = self.tables.users + + class MyClass(object): + pass + + canary = Mock() + + def my_init(self): + canary.init() + + # mapper level event + @event.listens_for(mapper, "instrument_class") + def instrument_class(mp, class_): + 
canary.instrument_class(class_) + class_.__init__ = my_init + + # instrumentationmanager event + @event.listens_for(object, "class_instrument") + def class_instrument(class_): + canary.class_instrument(class_) + + mapper(MyClass, users) + + m1 = MyClass() + assert attributes.instance_state(m1) + + eq_( + [ + call.instrument_class(MyClass), + call.class_instrument(MyClass), + call.init() + ], + canary.mock_calls + ) + + class DeclarativeEventListenTest(_RemoveListeners, fixtures.DeclarativeMappedTest): run_setup_classes = "each" run_deletes = None @@ -1886,7 +1924,6 @@ class SessionExtensionTest(_fixtures.FixtureTest): class QueryEventsTest( _RemoveListeners, _fixtures.FixtureTest, AssertsCompiledSQL): - run_inserts = None __dialect__ = 'default' @classmethod @@ -1917,3 +1954,25 @@ class QueryEventsTest( checkparams={'id_2': 10, 'id_1': 7} ) + def test_alters_entities(self): + User = self.classes.User + + @event.listens_for(query.Query, "before_compile", retval=True) + def fn(query): + return query.add_columns(User.name) + + s = Session() + + q = s.query(User.id, ).filter_by(id=7) + self.assert_compile( + q, + "SELECT users.id AS users_id, users.name AS users_name " + "FROM users " + "WHERE users.id = :id_1", + checkparams={'id_1': 7} + ) + eq_( + q.all(), + [(7, 'jack')] + ) + diff --git a/test/orm/test_query.py b/test/orm/test_query.py index 4021cdfbb..cb428469e 100644 --- a/test/orm/test_query.py +++ b/test/orm/test_query.py @@ -73,6 +73,7 @@ class RowTupleTest(QueryTest): fn = func.count(User.id) name_label = User.name.label('uname') bundle = Bundle('b1', User.id, User.name) + cte = sess.query(User.id).cte() for q, asserted in [ ( sess.query(User), @@ -123,6 +124,26 @@ class RowTupleTest(QueryTest): ] ), ( + sess.query(cte), + [ + { + 'aliased': False, + 'expr': cte.c.id, 'type': cte.c.id.type, + 'name': 'id', 'entity': None + }] + ), + ( + sess.query(users), + [ + {'aliased': False, + 'expr': users.c.id, 'type': users.c.id.type, + 'name': 'id', 'entity': None}, 
+ {'aliased': False, + 'expr': users.c.name, 'type': users.c.name.type, + 'name': 'name', 'entity': None} + ] + ), + ( sess.query(users.c.name), [{ "name": "name", "type": users.c.name.type, @@ -2860,44 +2881,143 @@ class TextTest(QueryTest, AssertsCompiledSQL): [User(id=7), User(id=8), User(id=9), User(id=10)] ) - def test_order_by_w_eager(self): + def test_order_by_w_eager_one(self): + User = self.classes.User + s = create_session() + + # from 1.0.0 thru 1.0.2, the "name" symbol here was considered + # to be part of the things we need to ORDER BY and it was being + # placed into the inner query's columns clause, as part of + # query._compound_eager_statement where we add unwrap_order_by() + # to the columns clause. However, as #3392 illustrates, unlocatable + # string expressions like "name desc" will only fail in this scenario, + # so in general the changing of the query structure with string labels + # is dangerous. + # + # the queries here are again "invalid" from a SQL perspective, as the + # "name" field isn't matched up to anything. + # + with expect_warnings("Can't resolve label reference 'name';"): + self.assert_compile( + s.query(User).options(joinedload("addresses")). + order_by(desc("name")).limit(1), + "SELECT anon_1.users_id AS anon_1_users_id, " + "anon_1.users_name AS anon_1_users_name, " + "addresses_1.id AS addresses_1_id, " + "addresses_1.user_id AS addresses_1_user_id, " + "addresses_1.email_address AS addresses_1_email_address " + "FROM (SELECT users.id AS users_id, users.name AS users_name " + "FROM users ORDER BY users.name " + "DESC LIMIT :param_1) AS anon_1 " + "LEFT OUTER JOIN addresses AS addresses_1 " + "ON anon_1.users_id = addresses_1.user_id " + "ORDER BY name DESC, addresses_1.id" + ) + + def test_order_by_w_eager_two(self): + User = self.classes.User + s = create_session() + + with expect_warnings("Can't resolve label reference 'name';"): + self.assert_compile( + s.query(User).options(joinedload("addresses")). 
+ order_by("name").limit(1), + "SELECT anon_1.users_id AS anon_1_users_id, " + "anon_1.users_name AS anon_1_users_name, " + "addresses_1.id AS addresses_1_id, " + "addresses_1.user_id AS addresses_1_user_id, " + "addresses_1.email_address AS addresses_1_email_address " + "FROM (SELECT users.id AS users_id, users.name AS users_name " + "FROM users ORDER BY users.name " + "LIMIT :param_1) AS anon_1 " + "LEFT OUTER JOIN addresses AS addresses_1 " + "ON anon_1.users_id = addresses_1.user_id " + "ORDER BY name, addresses_1.id" + ) + + def test_order_by_w_eager_three(self): + User = self.classes.User + s = create_session() + + self.assert_compile( + s.query(User).options(joinedload("addresses")). + order_by("users_name").limit(1), + "SELECT anon_1.users_id AS anon_1_users_id, " + "anon_1.users_name AS anon_1_users_name, " + "addresses_1.id AS addresses_1_id, " + "addresses_1.user_id AS addresses_1_user_id, " + "addresses_1.email_address AS addresses_1_email_address " + "FROM (SELECT users.id AS users_id, users.name AS users_name " + "FROM users ORDER BY users.name " + "LIMIT :param_1) AS anon_1 " + "LEFT OUTER JOIN addresses AS addresses_1 " + "ON anon_1.users_id = addresses_1.user_id " + "ORDER BY anon_1.users_name, addresses_1.id" + ) + + # however! this works (again?) + eq_( + s.query(User).options(joinedload("addresses")). + order_by("users_name").first(), + User(name='chuck', addresses=[]) + ) + + def test_order_by_w_eager_four(self): User = self.classes.User Address = self.classes.Address s = create_session() - # here, we are seeing how Query has to take the order by expressions - # of the query and then add them to the columns list, so that the - # outer subquery can order by that same label. With the anonymous - # label, our column gets sucked up and restated again in the - # inner columns list! 
- # we could try to play games with making this "smarter" but it - # would add permanent overhead to Select._columns_plus_names, - # since that's where references would need to be resolved. - # so as it is, this query takes the _label_reference and makes a - # full blown proxy and all the rest of it. self.assert_compile( s.query(User).options(joinedload("addresses")). - order_by(desc("name")).limit(1), + order_by(desc("users_name")).limit(1), "SELECT anon_1.users_id AS anon_1_users_id, " "anon_1.users_name AS anon_1_users_name, " - "anon_1.anon_2 AS anon_1_anon_2, " "addresses_1.id AS addresses_1_id, " "addresses_1.user_id AS addresses_1_user_id, " "addresses_1.email_address AS addresses_1_email_address " - "FROM (SELECT users.id AS users_id, users.name AS users_name, " - "users.name AS anon_2 FROM users ORDER BY users.name " - "DESC LIMIT :param_1) AS anon_1 " + "FROM (SELECT users.id AS users_id, users.name AS users_name " + "FROM users ORDER BY users.name DESC " + "LIMIT :param_1) AS anon_1 " "LEFT OUTER JOIN addresses AS addresses_1 " "ON anon_1.users_id = addresses_1.user_id " - "ORDER BY anon_1.anon_2 DESC, addresses_1.id" + "ORDER BY anon_1.users_name DESC, addresses_1.id" ) + # however! this works (again?) eq_( s.query(User).options(joinedload("addresses")). - order_by(desc("name")).first(), + order_by(desc("users_name")).first(), User(name='jack', addresses=[Address()]) ) + def test_order_by_w_eager_five(self): + """essentially the same as test_eager_relations -> test_limit_3, + but test for textual label elements that are freeform. 
+ this is again #3392.""" + + User = self.classes.User + Address = self.classes.Address + Order = self.classes.Order + + sess = create_session() + + q = sess.query(User, Address.email_address.label('email_address')) + + l = q.join('addresses').options(joinedload(User.orders)).\ + order_by( + "email_address desc").limit(1).offset(0) + with expect_warnings( + "Can't resolve label reference 'email_address desc'"): + eq_( + [ + (User( + id=7, + orders=[Order(id=1), Order(id=3), Order(id=5)], + addresses=[Address(id=1)] + ), 'jack@bean.com') + ], + l.all()) + class TextWarningTest(QueryTest, AssertsCompiledSQL): def _test(self, fn, arg, offending_clause, expected): diff --git a/test/orm/test_unitofwork.py b/test/orm/test_unitofwork.py index ae5a8ef60..5a47903f0 100644 --- a/test/orm/test_unitofwork.py +++ b/test/orm/test_unitofwork.py @@ -7,7 +7,8 @@ from sqlalchemy.orm import mapper as orm_mapper import sqlalchemy as sa from sqlalchemy.util import u, ue, b -from sqlalchemy import Integer, String, ForeignKey, literal_column, event +from sqlalchemy import Integer, String, ForeignKey, \ + literal_column, event, Boolean from sqlalchemy.testing import engines from sqlalchemy import testing from sqlalchemy.testing.schema import Table @@ -18,6 +19,7 @@ from sqlalchemy.testing import fixtures from test.orm import _fixtures from sqlalchemy.testing.assertsql import AllOf, CompiledSQL + class UnitOfWorkTest(object): pass @@ -383,16 +385,26 @@ class ClauseAttributesTest(fixtures.MappedTest): Column('name', String(30)), Column('counter', Integer, default=1)) + Table('boolean_t', metadata, + Column('id', Integer, primary_key=True, + test_needs_autoincrement=True), + Column('value', Boolean), + ) + @classmethod def setup_classes(cls): class User(cls.Comparable): pass + class HasBoolean(cls.Comparable): + pass + @classmethod def setup_mappers(cls): User, users_t = cls.classes.User, cls.tables.users_t - + HasBoolean, boolean_t = cls.classes.HasBoolean, cls.tables.boolean_t mapper(User, 
users_t) + mapper(HasBoolean, boolean_t) def test_update(self): User = self.classes.User @@ -446,6 +458,30 @@ class ClauseAttributesTest(fixtures.MappedTest): assert (u.counter == 5) is True + def test_update_special_comparator(self): + HasBoolean = self.classes.HasBoolean + + # make sure the comparison we're shooting + # for is invalid, otherwise we need to + # test something else here + assert_raises_message( + TypeError, + "Boolean value of this clause is not defined", + bool, None == sa.false() + ) + s = create_session() + hb = HasBoolean(value=None) + s.add(hb) + s.flush() + + hb.value = sa.false() + + s.flush() + + # needs to be refreshed + assert 'value' not in hb.__dict__ + eq_(hb.value, False) + class PassiveDeletesTest(fixtures.MappedTest): __requires__ = ('foreign_keys',) diff --git a/test/sql/test_compiler.py b/test/sql/test_compiler.py index 03646d78d..04e3171a9 100644 --- a/test/sql/test_compiler.py +++ b/test/sql/test_compiler.py @@ -961,6 +961,19 @@ class SelectTest(fixtures.TestBase, AssertsCompiledSQL): dialect=dialect ) + def test_no_group_by_labels(self): + lab1 = (table1.c.myid + 12).label('foo') + lab2 = func.somefunc(table1.c.name).label('bar') + dialect = default.DefaultDialect() + + self.assert_compile( + select([lab1, lab2]).group_by(lab1, lab2), + "SELECT mytable.myid + :myid_1 AS foo, somefunc(mytable.name) " + "AS bar FROM mytable GROUP BY mytable.myid + :myid_1, " + "somefunc(mytable.name)", + dialect=dialect + ) + def test_conjunctions(self): a, b, c = text('a'), text('b'), text('c') x = and_(a, b, c) diff --git a/test/sql/test_labels.py b/test/sql/test_labels.py index 1792a42d8..7f548eb49 100644 --- a/test/sql/test_labels.py +++ b/test/sql/test_labels.py @@ -138,8 +138,9 @@ class MaxIdentTest(fixtures.TestBase, AssertsCompiledSQL): issuperset(['this_is_the_data_column', s.c.this_is_the_data_column]) assert \ - set(compiled._create_result_map()['this_is_the_primarykey_column'][1]).\ + 
set(compiled._create_result_map()['this_is_the_primarykey__1'][1]).\ issuperset(['this_is_the_primarykey_column', + 'this_is_the_primarykey__1', s.c.this_is_the_primarykey_column]) def test_result_map_anon_alias(self): @@ -150,29 +151,28 @@ class MaxIdentTest(fixtures.TestBase, AssertsCompiledSQL): s = select([q]).apply_labels() self.assert_compile( - s, 'SELECT ' - 'anon_1.this_is_the_primarykey_column ' - 'AS anon_1_this_is_the_prim_1, ' - 'anon_1.this_is_the_data_column ' - 'AS anon_1_this_is_the_data_2 ' - 'FROM (' - 'SELECT ' - 'some_large_named_table.' - 'this_is_the_primarykey_column ' - 'AS this_is_the_primarykey_column, ' - 'some_large_named_table.this_is_the_data_column ' - 'AS this_is_the_data_column ' - 'FROM ' - 'some_large_named_table ' - 'WHERE ' - 'some_large_named_table.this_is_the_primarykey_column ' - '= :this_is_the_primarykey__1' - ') ' - 'AS anon_1', dialect=dialect) + s, + "SELECT " + "anon_1.this_is_the_primarykey__2 AS anon_1_this_is_the_prim_1, " + "anon_1.this_is_the_data_column AS anon_1_this_is_the_data_3 " + "FROM (" + "SELECT " + "some_large_named_table." + "this_is_the_primarykey_column AS this_is_the_primarykey__2, " + "some_large_named_table." 
+ "this_is_the_data_column AS this_is_the_data_column " + "FROM " + "some_large_named_table " + "WHERE " + "some_large_named_table.this_is_the_primarykey_column " + "= :this_is_the_primarykey__1" + ") " + "AS anon_1", dialect=dialect) + compiled = s.compile(dialect=dialect) - assert set(compiled._create_result_map()['anon_1_this_is_the_data_2'][1]).\ + assert set(compiled._create_result_map()['anon_1_this_is_the_data_3'][1]).\ issuperset([ - 'anon_1_this_is_the_data_2', + 'anon_1_this_is_the_data_3', q.corresponding_column( table1.c.this_is_the_data_column) ]) @@ -542,3 +542,26 @@ class LabelLengthTest(fixtures.TestBase, AssertsCompiledSQL): compiled = s.compile(dialect=dialect) assert set(compiled._create_result_map()['_1'][1]).issuperset([ 'asdf_abcde', a1.c.abcde, '_1']) + + def test_label_overlap_unlabeled(self): + """test that an anon col can't overlap with a fixed name, #3396""" + + table1 = table( + "tablename", column('columnname_one'), column('columnn_1')) + + stmt = select([table1]).apply_labels() + + dialect = default.DefaultDialect(label_length=23) + self.assert_compile( + stmt, + "SELECT tablename.columnname_one AS tablename_columnn_1, " + "tablename.columnn_1 AS tablename_columnn_2 FROM tablename", + dialect=dialect + ) + compiled = stmt.compile(dialect=dialect) + eq_( + set(compiled._create_result_map()), + set(['tablename_columnn_1', 'tablename_columnn_2']) + ) + + |
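``test_label_overlap_unlabeled`` above exercises #3396: when ``label_length`` forces truncation, a truncated anonymized label (``tablename_columnn_1``) must not collide with a label derived from a genuinely short column name. The sketch below shows one simple disambiguation strategy under that constraint — an illustrative algorithm only, not SQLAlchemy's actual truncation logic, and the helper name is made up for this example.

```python
def disambiguate(names, max_len):
    # Assign each name a label no longer than max_len. Whenever a name
    # is too long, or a candidate label collides with one already
    # assigned, append a numeric suffix and re-truncate to fit.
    # Illustrative only; not SQLAlchemy's exact algorithm.
    used = set()
    counter = 0
    labels = []
    for name in names:
        label = name
        while len(label) > max_len or label in used:
            counter += 1
            suffix = "_%d" % counter
            label = name[:max_len - len(suffix)] + suffix
        used.add(label)
        labels.append(label)
    return labels
```

With ``max_len=23``, a long label such as ``tablename_columnname_one`` is truncated and suffixed, while the short label ``tablename_columnn_1`` passes through untouched, and the two never collide.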