API Reference

This section contains the complete API reference for YDB SQLAlchemy.

Core Module

Experimental: work in progress, breaking changes are possible.

class ydb_sqlalchemy.sqlalchemy.ParametrizedFunction(name, params, *args, **kwargs)[source]

Bases: Function

Construct a Function.

The func construct is normally used to construct new Function instances.

ydb_sqlalchemy.sqlalchemy.upsert(table)[source]
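
A minimal usage sketch, assuming upsert() mirrors the standard SQLAlchemy insert() interface; the table, columns, and endpoint below are illustrative:

import sqlalchemy as sa
from ydb_sqlalchemy.sqlalchemy import upsert

engine = sa.create_engine("yql+ydb://localhost:2136/local")  # placeholder endpoint

metadata = sa.MetaData()
series = sa.Table(
    "series",
    metadata,
    sa.Column("series_id", sa.Integer, primary_key=True),
    sa.Column("title", sa.Unicode),
)

# UPSERT writes the row whether or not a row with the same key already exists.
stmt = upsert(series).values(series_id=1, title="IT Crowd")
with engine.begin() as conn:
    conn.execute(stmt)
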
class ydb_sqlalchemy.sqlalchemy.YdbRequestSettingsCharacteristic[source]

Bases: ConnectionCharacteristic

reset_characteristic(dialect: YqlDialect, dbapi_connection: Connection) None[source]

Reset the characteristic on the DBAPI connection to its default value.

Parameters:
  • dialect (YqlDialect)

  • dbapi_connection (Connection)

Return type:

None

set_characteristic(dialect: YqlDialect, dbapi_connection: Connection, value: BaseRequestSettings) None[source]

Set the characteristic on the DBAPI connection to a given value.

Parameters:
  • dialect (YqlDialect)

  • dbapi_connection (Connection)

  • value (BaseRequestSettings)

Return type:

None

get_characteristic(dialect: YqlDialect, dbapi_connection: Connection) BaseRequestSettings[source]

Given a DBAPI connection, get the current value of the characteristic.

Parameters:
  • dialect (YqlDialect)

  • dbapi_connection (Connection)

Return type:

BaseRequestSettings
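
Because the dialect wires this characteristic to the ydb_request_settings execution option (see connection_characteristics below), per-connection request settings are normally supplied through SQLAlchemy's execution_options() rather than by calling these methods directly. A hedged sketch; the endpoint and timeout value are illustrative:

import sqlalchemy as sa
import ydb

engine = sa.create_engine("yql+ydb://localhost:2136/local")  # placeholder endpoint

# Request settings from the YDB SDK; with_timeout() takes seconds (illustrative value).
settings = ydb.BaseRequestSettings().with_timeout(5)

with engine.connect().execution_options(ydb_request_settings=settings) as conn:
    conn.execute(sa.text("SELECT 1"))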

class ydb_sqlalchemy.sqlalchemy.YqlDialect(json_serializer=None, json_deserializer=None, _add_declare_for_yql_stmt_vars=False, **kwargs)[source]

Bases: StrCompileDialect

name: str = 'yql'

identifying name for the dialect from a DBAPI-neutral point of view (i.e. ‘sqlite’)

driver: str = 'ydb'

identifying name for the dialect’s DBAPI

supports_alter: bool = False

True if the database supports ALTER TABLE - used only for generating foreign key constraints in certain circumstances

max_identifier_length: int = 63

The maximum length of identifier names.

supports_sane_rowcount: bool = False

Indicate whether the dialect properly implements rowcount for UPDATE and DELETE statements.

supports_statement_cache: bool = True

indicates if this dialect supports caching.

All dialects that are compatible with statement caching should set this flag to True directly on each dialect class and subclass that supports it. SQLAlchemy tests that this flag is locally present on each dialect subclass before it will use statement caching. This is to provide safety for legacy or new dialects that are not yet fully tested to be compliant with SQL statement caching.

Added in version 1.4.5.

supports_native_enum: bool = False

Indicates if the dialect supports a native ENUM construct. This will prevent _types.Enum from generating a CHECK constraint when that type is used in “native” mode.

supports_native_boolean: bool = True

Indicates if the dialect supports a native boolean construct. This will prevent _types.Boolean from generating a CHECK constraint when that type is used.

supports_native_decimal: bool = True

indicates if Decimal objects are handled and returned for precision numeric types, or if floats are returned

supports_smallserial = False
supports_schemas = False
supports_constraint_comments: bool = False

Indicates if the dialect supports comment DDL on constraints.

Added in version 2.0.

supports_json_type = True
insert_returning: bool = False

if the dialect supports RETURNING with INSERT

Added in version 2.0.

update_returning: bool = False

if the dialect supports RETURNING with UPDATE

Added in version 2.0.

delete_returning: bool = False

if the dialect supports RETURNING with DELETE

Added in version 2.0.

supports_sequences: bool = False

Indicates if the dialect supports CREATE SEQUENCE or similar.

sequences_optional: bool = False

If True, indicates if the Sequence.optional parameter on the Sequence construct should signal to not generate a CREATE SEQUENCE. Applies only to dialects that support sequences. Currently used only to allow PostgreSQL SERIAL to be used on a column that specifies Sequence() for usage on other backends.

preexecute_autoincrement_sequences: bool = True

True if ‘implicit’ primary key functions must be executed separately in order to get their value, if RETURNING is not used.

This is currently oriented towards PostgreSQL when the implicit_returning=False parameter is used on a Table object.

postfetch_lastrowid = False
supports_default_values: bool = False

dialect supports INSERT… DEFAULT VALUES syntax

supports_empty_insert: bool = False

dialect supports INSERT () VALUES ()

supports_multivalues_insert: bool = True

Target database supports INSERT…VALUES with multiple value sets, i.e. INSERT INTO table (cols) VALUES (…), (…), (…), …

default_paramstyle = 'qmark'
isolation_level: str | None = None
preparer

alias of YqlIdentifierPreparer

statement_compiler

alias of YqlCompiler

ddl_compiler

alias of YqlDDLCompiler

type_compiler

alias of YqlTypeCompiler

colspecs: MutableMapping[Type[TypeEngine[Any]], Type[TypeEngine[Any]]] = {
    sqlalchemy.sql.sqltypes.DATETIME: ydb_sqlalchemy.sqlalchemy.datetime_types.YqlDateTime,
    sqlalchemy.sql.sqltypes.DECIMAL: ydb_sqlalchemy.sqlalchemy.types.Decimal,
    sqlalchemy.sql.sqltypes.Date: ydb_sqlalchemy.sqlalchemy.datetime_types.YqlDate,
    sqlalchemy.sql.sqltypes.DateTime: ydb_sqlalchemy.sqlalchemy.datetime_types.YqlTimestamp,
    sqlalchemy.sql.sqltypes.JSON: ydb_sqlalchemy.sqlalchemy.json.YqlJSON,
    sqlalchemy.sql.sqltypes.JSON.JSONPathType: ydb_sqlalchemy.sqlalchemy.json.YqlJSON.YqlJSONPathType,
    sqlalchemy.sql.sqltypes.TIMESTAMP: ydb_sqlalchemy.sqlalchemy.datetime_types.YqlTimestamp,
}

A dictionary of TypeEngine classes from sqlalchemy.types mapped to subclasses that are specific to the dialect class. This dictionary is class-level only and is not accessed from the dialect instance itself.

connection_characteristics = {
    'isolation_level': <sqlalchemy.engine.characteristics.IsolationLevelCharacteristic object>,
    'ydb_request_settings': <ydb_sqlalchemy.sqlalchemy.YdbRequestSettingsCharacteristic object>,
}
construct_arguments: List[Tuple[Type[SchemaItem | ClauseElement], Mapping[str, Any]]] | None = [
    (sqlalchemy.sql.schema.Table, {
        'auto_partitioning_by_load': None,
        'auto_partitioning_by_size': None,
        'auto_partitioning_max_partitions_count': None,
        'auto_partitioning_min_partitions_count': None,
        'auto_partitioning_partition_size_mb': None,
        'partition_at_keys': None,
        'uniform_partitions': None,
    }),
    (sqlalchemy.sql.schema.Index, {'async': False, 'cover': []}),
]

Optional set of argument specifiers for various SQLAlchemy constructs, typically schema items.

To implement, establish as a series of tuples, as in:

construct_arguments = [
    (schema.Index, {"using": False, "where": None, "ops": None}),
]

If the above construct is established on the PostgreSQL dialect, the Index construct will now accept the keyword arguments postgresql_using, postgresql_where, and postgresql_ops. Any other argument specified to the constructor of Index which is prefixed with postgresql_ will raise ArgumentError.

A dialect which does not include a construct_arguments member will not participate in the argument validation system. For such a dialect, any argument name is accepted by all participating constructs, within the namespace of arguments prefixed with that dialect name. The rationale here is so that third-party dialects that haven’t yet implemented this feature continue to function in the old way.

See also

DialectKWArgs - implementing base class which consumes DefaultDialect.construct_arguments
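
Given these argument specifiers, YDB-specific table and index options can be passed as dialect keyword arguments. A sketch under the assumption that the keyword prefix follows the dialect name ('yql'), as is the SQLAlchemy convention; the partitioning values are illustrative:

import sqlalchemy as sa

metadata = sa.MetaData()
events = sa.Table(
    "events",
    metadata,
    sa.Column("event_id", sa.Integer, primary_key=True),
    sa.Column("payload", sa.Unicode),
    # Table arguments listed in construct_arguments; the 'yql_' prefix is assumed.
    yql_auto_partitioning_by_size=True,
    yql_auto_partitioning_max_partitions_count=64,
)

# Index arguments 'async' and 'cover', with the same assumed prefix.
sa.Index("ix_events_payload", events.c.payload, yql_async=True, yql_cover=["event_id"])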

classmethod import_dbapi()[source]

Import the DBAPI module that is used by this dialect.

The Python module object returned here will be assigned as an instance variable to a constructed dialect under the name .dbapi.

Changed in version 2.0: The Dialect.import_dbapi() class method is renamed from the previous method Dialect.dbapi(), which would be replaced at dialect instantiation time by the DBAPI module itself, thus using the same name in two different ways. If a Dialect.dbapi() classmethod is present on a third-party dialect, it will be used and a deprecation warning will be emitted.

classmethod dbapi()[source]
get_view_names(connection, schema=None, **kw: Any)[source]

Return a list of all non-materialized view names available in the database.

This is an internal dialect method. Applications should use _engine.Inspector.get_view_names().

Parameters:
  • schema – schema name to query, if not the default schema.

  • kw (Any)

get_columns(connection, table_name, schema=None, **kw)[source]

Return information about columns in table_name.

Given a _engine.Connection, a string table_name, and an optional string schema, return column information as a list of dictionaries corresponding to the ReflectedColumn dictionary.

This is an internal dialect method. Applications should use Inspector.get_columns().

get_table_names(connection, schema=None, **kw)[source]

Return a list of table names for schema.

This is an internal dialect method. Applications should use _engine.Inspector.get_table_names().

has_table(connection, table_name, schema=None, **kwargs)[source]

For internal dialect use, check the existence of a particular table or view in the database.

Given a _engine.Connection object, a string table_name and optional schema name, return True if the given table exists in the database, False otherwise.

This method serves as the underlying implementation of the public facing Inspector.has_table() method, and is also used internally to implement the “checkfirst” behavior for methods like _schema.Table.create() and _schema.MetaData.create_all().

Note

This method is used internally by SQLAlchemy, and is published so that third-party dialects may provide an implementation. It is not the public API for checking for table presence. Please use the Inspector.has_table() method.

Changed in version 2.0: Dialect.has_table() now formally supports checking for additional table-like objects:

  • any type of views (plain or materialized)

  • temporary tables of any kind

Previously, these two checks were not formally specified and different dialects would vary in their behavior. The dialect testing suite now includes tests for all of these object types, and dialects should seek to support locating these objects, to the degree that the backing database supports views or temporary tables, for full compliance.

get_pk_constraint(connection, table_name, schema=None, **kwargs)[source]

Return information about the primary key constraint on table_name.

Given a _engine.Connection, a string table_name, and an optional string schema, return primary key information as a dictionary corresponding to the ReflectedPrimaryKeyConstraint dictionary.

This is an internal dialect method. Applications should use Inspector.get_pk_constraint().

get_foreign_keys(connection, table_name, schema=None, **kwargs)[source]

Return information about foreign_keys in table_name.

Given a _engine.Connection, a string table_name, and an optional string schema, return foreign key information as a list of dicts corresponding to the ReflectedForeignKeyConstraint dictionary.

This is an internal dialect method. Applications should use _engine.Inspector.get_foreign_keys().

get_indexes(connection, table_name, schema=None, **kwargs)[source]

Return information about indexes in table_name.

Given a _engine.Connection, a string table_name and an optional string schema, return index information as a list of dictionaries corresponding to the ReflectedIndex dictionary.

This is an internal dialect method. Applications should use Inspector.get_indexes().

set_isolation_level(dbapi_connection: Connection, level: str) None[source]

Given a DBAPI connection, set its isolation level.

Note that this is a dialect-level method which is used as part of the implementation of the _engine.Connection and _engine.Engine isolation level facilities; these APIs should be preferred for most typical use cases.

If the dialect also implements the Dialect.get_isolation_level_values() method, then the given level is guaranteed to be one of the string names within that sequence, and the method will not need to anticipate a lookup failure.

See also

Connection.get_isolation_level() - view current level

Connection.default_isolation_level - view default level

Connection.execution_options.isolation_level - set per-Connection isolation level

create_engine.isolation_level - set per-Engine isolation level

Parameters:
  • dbapi_connection (Connection)

  • level (str)

Return type:

None

get_default_isolation_level(dbapi_conn: Connection) str[source]

Given a DBAPI connection, return its isolation level, or a default isolation level if one cannot be retrieved.

May be overridden by subclasses in order to provide a “fallback” isolation level for databases that cannot reliably retrieve the actual isolation level.

By default, calls the _engine.Interfaces.get_isolation_level() method, propagating any exceptions raised.

Added in version 1.3.22.

Parameters:

dbapi_conn (Connection)

Return type:

str

get_isolation_level(dbapi_connection: Connection) str[source]

Given a DBAPI connection, return its isolation level.

When working with a _engine.Connection object, the corresponding DBAPI connection may be procured using the _engine.Connection.connection accessor.

Note that this is a dialect-level method which is used as part of the implementation of the _engine.Connection and _engine.Engine isolation level facilities; these APIs should be preferred for most typical use cases.

See also

Connection.get_isolation_level() - view current level

Connection.default_isolation_level - view default level

Connection.execution_options.isolation_level - set per-Connection isolation level

create_engine.isolation_level - set per-Engine isolation level

Parameters:

dbapi_connection (Connection)

Return type:

str
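
Isolation levels are normally configured through the standard SQLAlchemy facilities rather than by calling these dialect-level methods directly. A sketch, assuming "SERIALIZABLE" is among the level names the dialect accepts; the endpoint is a placeholder:

import sqlalchemy as sa

engine = sa.create_engine("yql+ydb://localhost:2136/local")

# Per-connection isolation level via execution options.
with engine.connect().execution_options(isolation_level="SERIALIZABLE") as conn:
    conn.execute(sa.text("SELECT 1"))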

set_ydb_request_settings(dbapi_connection: Connection, value: BaseRequestSettings) None[source]
Parameters:
  • dbapi_connection (Connection)

  • value (BaseRequestSettings)

Return type:

None

reset_ydb_request_settings(dbapi_connection: Connection)[source]
Parameters:

dbapi_connection (Connection)

get_ydb_request_settings(dbapi_connection: Connection) BaseRequestSettings[source]
Parameters:

dbapi_connection (Connection)

Return type:

BaseRequestSettings

create_connect_args(url)[source]

Build DB-API compatible connection arguments.

Given a URL object, returns a tuple consisting of a (*args, **kwargs) suitable to send directly to the dbapi’s connect function. The arguments are sent to the Dialect.connect() method which then runs the DBAPI-level connect() function.

The method typically makes use of the URL.translate_connect_args() method in order to generate a dictionary of options.

The default implementation is:

def create_connect_args(self, url):
    opts = url.translate_connect_args()
    opts.update(url.query)
    return ([], opts)
Parameters:

url – a URL object

Returns:

a tuple of (*args, **kwargs) which will be passed to the Dialect.connect() method.

See also

URL.translate_connect_args()
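
In application code the connect arguments are produced from the engine URL rather than built by hand. A hedged example combining the dialect name ("yql") and driver ("ydb") documented above; host, port, and database path are placeholders:

import sqlalchemy as sa

# "yql" is the dialect name and "ydb" the driver; adjust the endpoint and database to your setup.
engine = sa.create_engine("yql+ydb://localhost:2136/local")

with engine.connect() as conn:
    print(conn.execute(sa.text("SELECT 1")).scalar())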

connect(*cargs, **cparams)[source]

Establish a connection using this dialect’s DBAPI.

The default implementation of this method is:

def connect(self, *cargs, **cparams):
    return self.dbapi.connect(*cargs, **cparams)

The *cargs, **cparams parameters are generated directly from this dialect’s Dialect.create_connect_args() method.

This method may be used for dialects that need to perform programmatic per-connection steps when a new connection is procured from the DBAPI.

Parameters:
  • *cargs – positional parameters returned from the Dialect.create_connect_args() method

  • **cparams – keyword parameters returned from the Dialect.create_connect_args() method.

Returns:

a DBAPI connection, typically from the PEP 249 module level .connect() function.

See also

Dialect.create_connect_args()

Dialect.on_connect()

do_begin(dbapi_connection: Connection) None[source]

Provide an implementation of connection.begin(), given a DB-API connection.

The DBAPI has no dedicated “begin” method and it is expected that transactions are implicit. This hook is provided for those DBAPIs that might need additional help in this area.

Parameters:

dbapi_connection (Connection) – a DBAPI connection, typically proxied within a ConnectionFairy.

Return type:

None

do_rollback(dbapi_connection: Connection) None[source]

Provide an implementation of connection.rollback(), given a DB-API connection.

Parameters:

dbapi_connection (Connection) – a DBAPI connection, typically proxied within a ConnectionFairy.

Return type:

None

do_commit(dbapi_connection: Connection) None[source]

Provide an implementation of connection.commit(), given a DB-API connection.

Parameters:

dbapi_connection (Connection) – a DBAPI connection, typically proxied within a ConnectionFairy.

Return type:

None

do_ping(dbapi_connection: Connection) bool[source]

ping the DBAPI connection and return True if the connection is usable.

Parameters:

dbapi_connection (Connection)

Return type:

bool

do_executemany(cursor: Cursor, statement: str, parameters: Sequence[Mapping[str, Any]] | None, context: DefaultExecutionContext | None = None) None[source]

Provide an implementation of cursor.executemany(statement, parameters).

Parameters:
  • cursor (Cursor)

  • statement (str)

  • parameters (Sequence[Mapping[str, Any]] | None)

  • context (DefaultExecutionContext | None)

Return type:

None

do_execute(cursor: Cursor, statement: str, parameters: Mapping[str, Any] | None = None, context: DefaultExecutionContext | None = None) None[source]

Provide an implementation of cursor.execute(statement, parameters).

Parameters:
  • cursor (Cursor)

  • statement (str)

  • parameters (Mapping[str, Any] | None)

  • context (DefaultExecutionContext | None)

Return type:

None

class ydb_sqlalchemy.sqlalchemy.AsyncYqlDialect(json_serializer=None, json_deserializer=None, _add_declare_for_yql_stmt_vars=False, **kwargs)[source]

Bases: YqlDialect

driver: str = 'ydb_async'

identifying name for the dialect’s DBAPI

is_async: bool = True

Whether or not this dialect is intended for asyncio use.

supports_statement_cache: bool = True

indicates if this dialect supports caching.

All dialects that are compatible with statement caching should set this flag to True directly on each dialect class and subclass that supports it. SQLAlchemy tests that this flag is locally present on each dialect subclass before it will use statement caching. This is to provide safety for legacy or new dialects that are not yet fully tested to be compliant with SQL statement caching.

Added in version 1.4.5.

connect(*cargs, **cparams)[source]

Establish a connection using this dialect’s DBAPI.

The default implementation of this method is:

def connect(self, *cargs, **cparams):
    return self.dbapi.connect(*cargs, **cparams)

The *cargs, **cparams parameters are generated directly from this dialect’s Dialect.create_connect_args() method.

This method may be used for dialects that need to perform programmatic per-connection steps when a new connection is procured from the DBAPI.

Parameters:
  • *cargs – positional parameters returned from the Dialect.create_connect_args() method

  • **cparams – keyword parameters returned from the Dialect.create_connect_args() method.

Returns:

a DBAPI connection, typically from the PEP 249 module level .connect() function.

See also

Dialect.create_connect_args()

Dialect.on_connect()
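
The async dialect is intended to be used through SQLAlchemy's asyncio extension. A sketch, assuming the "yql+ydb_async" URL scheme implied by the name and driver attributes above; the endpoint is a placeholder:

import asyncio

import sqlalchemy as sa
from sqlalchemy.ext.asyncio import create_async_engine


async def main() -> None:
    engine = create_async_engine("yql+ydb_async://localhost:2136/local")
    async with engine.connect() as conn:
        result = await conn.execute(sa.text("SELECT 1"))
        print(result.scalar())
    await engine.dispose()


asyncio.run(main())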

Types Module

class ydb_sqlalchemy.sqlalchemy.types.UInt64[source]

Bases: Integer

class ydb_sqlalchemy.sqlalchemy.types.UInt32[source]

Bases: Integer

class ydb_sqlalchemy.sqlalchemy.types.UInt16[source]

Bases: Integer

class ydb_sqlalchemy.sqlalchemy.types.UInt8[source]

Bases: Integer

class ydb_sqlalchemy.sqlalchemy.types.Int64[source]

Bases: Integer

class ydb_sqlalchemy.sqlalchemy.types.Int32[source]

Bases: Integer

class ydb_sqlalchemy.sqlalchemy.types.Int16[source]

Bases: Integer

class ydb_sqlalchemy.sqlalchemy.types.Int8[source]

Bases: Integer
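
These classes let table definitions request YDB's fixed-width integer columns instead of the generic Integer. A brief sketch; the table and column names are illustrative:

import sqlalchemy as sa
from ydb_sqlalchemy.sqlalchemy import types

metadata = sa.MetaData()
counters = sa.Table(
    "counters",
    metadata,
    sa.Column("id", types.UInt64, primary_key=True),  # unsigned 64-bit key
    sa.Column("delta", types.Int16),                  # signed 16-bit value
)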

class ydb_sqlalchemy.sqlalchemy.types.Decimal(precision=None, scale=None, asdecimal=True)[source]

Bases: DECIMAL

Construct a Numeric.

Parameters:
  • precision – the numeric precision for use in DDL CREATE TABLE.

  • scale – the numeric scale for use in DDL CREATE TABLE.

  • asdecimal – default True. Return whether or not values should be sent as Python Decimal objects, or as floats. Different DBAPIs send one or the other based on datatypes - the Numeric type will ensure that return values are one or the other across DBAPIs consistently.

  • decimal_return_scale – Default scale to use when converting from floats to Python decimals. Floating point values will typically be much longer due to decimal inaccuracy, and most floating point database types don’t have a notion of “scale”, so by default the float type looks for the first ten decimal places when converting. Specifying this value will override that length. Types which do include an explicit “.scale” value, such as the base Numeric as well as the MySQL float types, will use the value of “.scale” as the default for decimal_return_scale, if not otherwise specified.

When using the Numeric type, care should be taken to ensure that the asdecimal setting is appropriate for the DBAPI in use - when Numeric applies a conversion from Decimal -> float or float -> Decimal, this conversion incurs an additional performance overhead for all result columns received.

DBAPIs that return Decimal natively (e.g. psycopg2) will have better accuracy and higher performance with a setting of True, as the native translation to Decimal reduces the amount of floating-point issues at play, and the Numeric type itself doesn't need to apply any further conversions. However, another DBAPI which returns floats natively will incur an additional conversion overhead, and is still subject to floating point data loss - in which case asdecimal=False will at least remove the extra conversion overhead.
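
A short sketch of declaring a Decimal column with this type; the precision and scale shown (22, 9) are illustrative values chosen for the example, and values come back as decimal.Decimal when asdecimal is left at its default:

import sqlalchemy as sa
from ydb_sqlalchemy.sqlalchemy import types

metadata = sa.MetaData()
payments = sa.Table(
    "payments",
    metadata,
    sa.Column("payment_id", sa.Integer, primary_key=True),
    sa.Column("amount", types.Decimal(22, 9)),  # precision=22, scale=9 (illustrative)
)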

bind_processor(dialect)[source]

Return a conversion function for processing bind values.

Returns a callable which will receive a bind parameter value as the sole positional argument and will return a value to send to the DB-API.

If processing is not necessary, the method should return None.

Tip

This method is only called relative to a dialect specific type object, which is often private to a dialect in use and is not the same type object as the public facing one, which means it’s not feasible to subclass a types.TypeEngine class in order to provide an alternate _types.TypeEngine.bind_processor() method, unless subclassing the _types.UserDefinedType class explicitly.

To provide alternate behavior for _types.TypeEngine.bind_processor(), implement a _types.TypeDecorator class and provide an implementation of _types.TypeDecorator.process_bind_param().

Parameters:

dialect – Dialect instance in use.

result_processor(dialect, coltype)[source]

Return a conversion function for processing result row values.

Returns a callable which will receive a result row column value as the sole positional argument and will return a value to return to the user.

If processing is not necessary, the method should return None.

Tip

This method is only called relative to a dialect specific type object, which is often private to a dialect in use and is not the same type object as the public facing one, which means it’s not feasible to subclass a types.TypeEngine class in order to provide an alternate _types.TypeEngine.result_processor() method, unless subclassing the _types.UserDefinedType class explicitly.

To provide alternate behavior for _types.TypeEngine.result_processor(), implement a _types.TypeDecorator class and provide an implementation of _types.TypeDecorator.process_result_value().

Parameters:
  • dialect – Dialect instance in use.

  • coltype – DBAPI coltype argument received in cursor.description.

literal_processor(dialect)[source]

Return a conversion function for processing literal values that are to be rendered directly without using binds.

This function is used when the compiler makes use of the “literal_binds” flag, typically used in DDL generation as well as in certain scenarios where backends don’t accept bound parameters.

Returns a callable which will receive a literal Python value as the sole positional argument and will return a string representation to be rendered in a SQL statement.

Tip

This method is only called relative to a dialect specific type object, which is often private to a dialect in use and is not the same type object as the public facing one, which means it’s not feasible to subclass a types.TypeEngine class in order to provide an alternate _types.TypeEngine.literal_processor() method, unless subclassing the _types.UserDefinedType class explicitly.

To provide alternate behavior for _types.TypeEngine.literal_processor(), implement a _types.TypeDecorator class and provide an implementation of _types.TypeDecorator.process_literal_param().

class ydb_sqlalchemy.sqlalchemy.types.ListType(item_type: _TypeEngineArgument[_T], as_tuple: bool = False, dimensions: int | None = None, zero_indexes: bool = False)[source]

Bases: ARRAY

Construct an _types.ARRAY.

E.g.:

Column("myarray", ARRAY(Integer))

Arguments are:

Parameters:
  • item_type (_TypeEngineArgument[_T]) – The data type of items of this array. Note that dimensionality is irrelevant here, so multi-dimensional arrays like INTEGER[][], are constructed as ARRAY(Integer), not as ARRAY(ARRAY(Integer)) or such.

  • as_tuple=False – Specify whether return results should be converted to tuples from lists. This parameter is not generally needed as a Python list corresponds well to a SQL array.

  • dimensions (Optional[int]) – if non-None, the ARRAY will assume a fixed number of dimensions. This impacts how the array is declared on the database, how it goes about interpreting Python and result values, as well as how expression behavior in conjunction with the “getitem” operator works. See the description at _types.ARRAY for additional detail.

  • zero_indexes=False – when True, index values will be converted between Python zero-based and SQL one-based indexes, e.g. a value of one will be added to all index values before passing to the database.

  • as_tuple (bool)

  • zero_indexes (bool)
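
A sketch of a list-typed column, assuming ListType is used like the ARRAY type it subclasses; the table and column names are illustrative:

import sqlalchemy as sa
from ydb_sqlalchemy.sqlalchemy import types

metadata = sa.MetaData()
documents = sa.Table(
    "documents",
    metadata,
    sa.Column("doc_id", sa.Integer, primary_key=True),
    # A list of 32-bit unsigned integers; item_type semantics follow ARRAY as described above.
    sa.Column("tag_ids", types.ListType(types.UInt32)),
)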

class ydb_sqlalchemy.sqlalchemy.types.HashableDict[source]

Bases: dict

class ydb_sqlalchemy.sqlalchemy.types.StructType(fields_types: Mapping[str, Type[TypeEngine] | Type[TypeDecorator]])[source]

Bases: TypeEngine

Parameters:

fields_types (Mapping[str, Type[TypeEngine] | Type[TypeDecorator]])

property python_type

Return the Python type object expected to be returned by instances of this type, if known.

Basically, for those types which enforce a return type, or are known across the board to do such for all common DBAPIs (like int for example), will return that type.

If a return type is not defined, raises NotImplementedError.

Note that any type also accommodates NULL in SQL which means you can also get back None from any type in practice.

compare_values(x, y)[source]

Compare two values for equality.
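
StructType is constructed from a mapping of field names to type classes, per the fields_types parameter above. A minimal sketch; the field layout and the bind-parameter usage are illustrative:

import sqlalchemy as sa
from ydb_sqlalchemy.sqlalchemy import types

# Note that type classes (not instances) are passed, matching the signature above.
person_struct = types.StructType({"name": sa.Unicode, "age": types.UInt32})

# Illustrative use as the type of a bind parameter.
person_param = sa.bindparam("person", type_=person_struct)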

class ydb_sqlalchemy.sqlalchemy.types.Lambda(func)[source]

Bases: ColumnElement

DateTime Types

class ydb_sqlalchemy.sqlalchemy.datetime_types.YqlDate[source]

Bases: Date

literal_processor(dialect)[source]

Return a conversion function for processing literal values that are to be rendered directly without using binds.

This function is used when the compiler makes use of the “literal_binds” flag, typically used in DDL generation as well as in certain scenarios where backends don’t accept bound parameters.

Returns a callable which will receive a literal Python value as the sole positional argument and will return a string representation to be rendered in a SQL statement.

Tip

This method is only called relative to a dialect specific type object, which is often private to a dialect in use and is not the same type object as the public facing one, which means it’s not feasible to subclass a types.TypeEngine class in order to provide an alternate _types.TypeEngine.literal_processor() method, unless subclassing the _types.UserDefinedType class explicitly.

To provide alternate behavior for _types.TypeEngine.literal_processor(), implement a _types.TypeDecorator class and provide an implementation of _types.TypeDecorator.process_literal_param().

class ydb_sqlalchemy.sqlalchemy.datetime_types.YqlTimestamp(timezone: bool = False)[source]

Bases: TIMESTAMP

Construct a new _types.TIMESTAMP.

Parameters:

timezone (bool) – boolean. Indicates that the TIMESTAMP type should enable timezone support, if available on the target database. On a per-dialect basis is similar to “TIMESTAMP WITH TIMEZONE”. If the target database does not support timezones, this flag is ignored.

result_processor(dialect, coltype)[source]

Return a conversion function for processing result row values.

Returns a callable which will receive a result row column value as the sole positional argument and will return a value to return to the user.

If processing is not necessary, the method should return None.

Tip

This method is only called relative to a dialect specific type object, which is often private to a dialect in use and is not the same type object as the public facing one, which means it’s not feasible to subclass a types.TypeEngine class in order to provide an alternate _types.TypeEngine.result_processor() method, unless subclassing the _types.UserDefinedType class explicitly.

To provide alternate behavior for _types.TypeEngine.result_processor(), implement a _types.TypeDecorator class and provide an implementation of _types.TypeDecorator.process_result_value().

Parameters:
  • dialect – Dialect instance in use.

  • coltype – DBAPI coltype argument received in cursor.description.

class ydb_sqlalchemy.sqlalchemy.datetime_types.YqlDateTime(timezone: bool = False)[source]

Bases: YqlTimestamp, DATETIME

Construct a new _types.TIMESTAMP.

Parameters:

timezone (bool) – boolean. Indicates that the TIMESTAMP type should enable timezone support, if available on the target database. On a per-dialect basis is similar to “TIMESTAMP WITH TIMEZONE”. If the target database does not support timezones, this flag is ignored.

bind_processor(dialect)[source]

Return a conversion function for processing bind values.

Returns a callable which will receive a bind parameter value as the sole positional argument and will return a value to send to the DB-API.

If processing is not necessary, the method should return None.

Tip

This method is only called relative to a dialect specific type object, which is often private to a dialect in use and is not the same type object as the public facing one, which means it’s not feasible to subclass a types.TypeEngine class in order to provide an alternate _types.TypeEngine.bind_processor() method, unless subclassing the _types.UserDefinedType class explicitly.

To provide alternate behavior for _types.TypeEngine.bind_processor(), implement a _types.TypeDecorator class and provide an implementation of _types.TypeDecorator.process_bind_param().

Parameters:

dialect – Dialect instance in use.

JSON Types

class ydb_sqlalchemy.sqlalchemy.json.YqlJSON(none_as_null: bool = False)[source]

Bases: JSON

Construct a _types.JSON type.

Parameters:
  • none_as_null=False

    if True, persist the value None as a SQL NULL value, not the JSON encoding of null. Note that when this flag is False, the null() construct can still be used to persist a NULL value, which may be passed directly as a parameter value that is specially interpreted by the _types.JSON type as SQL NULL:

    from sqlalchemy import null
    
    conn.execute(table.insert(), {"data": null()})
    

    Note

    JSON.none_as_null does not apply to the values passed to Column.default and Column.server_default; a value of None passed for these parameters means “no default present”.

    Additionally, when used in SQL comparison expressions, the Python value None continues to refer to SQL null, and not JSON NULL. The JSON.none_as_null flag refers explicitly to the persistence of the value within an INSERT or UPDATE statement. The JSON.NULL value should be used for SQL expressions that wish to compare to JSON null.

    See also

    types.JSON.NULL

  • none_as_null (bool)

class YqlJSONPathType[source]

Bases: JSONPathType

bind_processor(dialect)[source]

Return a conversion function for processing bind values.

Returns a callable which will receive a bind parameter value as the sole positional argument and will return a value to send to the DB-API.

If processing is not necessary, the method should return None.

Tip

This method is only called relative to a dialect specific type object, which is often private to a dialect in use and is not the same type object as the public facing one, which means it’s not feasible to subclass a types.TypeEngine class in order to provide an alternate _types.TypeEngine.bind_processor() method, unless subclassing the _types.UserDefinedType class explicitly.

To provide alternate behavior for _types.TypeEngine.bind_processor(), implement a _types.TypeDecorator class and provide an implementation of _types.TypeDecorator.process_bind_param().

Parameters:

dialect – Dialect instance in use.

literal_processor(dialect)[source]

Return a conversion function for processing literal values that are to be rendered directly without using binds.

This function is used when the compiler makes use of the “literal_binds” flag, typically used in DDL generation as well as in certain scenarios where backends don’t accept bound parameters.

Returns a callable which will receive a literal Python value as the sole positional argument and will return a string representation to be rendered in a SQL statement.

Tip

This method is only called relative to a dialect specific type object, which is often private to a dialect in use and is not the same type object as the public facing one, which means it’s not feasible to subclass a types.TypeEngine class in order to provide an alternate _types.TypeEngine.literal_processor() method, unless subclassing the _types.UserDefinedType class explicitly.

To provide alternate behavior for _types.TypeEngine.literal_processor(), implement a _types.TypeDecorator class and provide an implementation of _types.TypeDecorator.process_literal_param().

Compiler Module

DML Operations

class ydb_sqlalchemy.sqlalchemy.dml.Upsert(table: _DMLTableArgument)[source]

Bases: Insert

Parameters:

table (_DMLTableArgument)

stringify_dialect = 'yql'
inherit_cache: bool | None = False

Indicate if this HasCacheKey instance should make use of the cache key generation scheme used by its immediate superclass.

The attribute defaults to None, which indicates that a construct has not yet taken into account whether or not it's appropriate for it to participate in caching; this is functionally equivalent to setting the value to False, except that a warning is also emitted.

This flag can be set to True on a particular class, if the SQL that corresponds to the object does not change based on attributes which are local to this class, and not its superclass.

See also

Enabling Caching Support for Custom Constructs - General guidelines for setting the HasCacheKey.inherit_cache attribute for third-party or user defined SQL constructs.

class ydb_sqlalchemy.sqlalchemy.dml.UpsertDMLState(statement: Insert, compiler: SQLCompiler, disable_implicit_returning: bool = False, **kw: Any)[source]

Bases: InsertDMLState

Parameters:
  • statement (UpdateBase)

  • compiler (SQLCompiler)

  • disable_implicit_returning (bool)

  • kw (Any)

statement: UpdateBase