.. _pysqlite_uri_connections:

URI Connections
----------------

Modern versions of SQLite support an alternative system of connecting using a
`driver level URI <https://www.sqlite.org/uri.html>`_, which has the advantage
that additional driver-level arguments can be passed including options such as
"read only".  The Python sqlite3 driver supports this mode under modern Python
3 versions.  The SQLAlchemy pysqlite driver supports this mode of use by
specifying "uri=true" in the URL query string.  The SQLite-level "URI" is kept
as the "database" portion of the SQLAlchemy url (that is, following a slash)::

    e = create_engine("sqlite:///file:path/to/database?mode=ro&uri=true")

.. note::  The "uri=true" parameter must appear in the **query string**
   of the URL.  It will not currently work as expected if it is only present
   in the :paramref:`_sa.create_engine.connect_args` parameter dictionary.

The logic reconciles the simultaneous presence of SQLAlchemy's query string and
SQLite's query string by separating out the parameters that belong to the
Python sqlite3 driver vs. those that belong to the SQLite URI.  This is
achieved through the use of a fixed list of parameters known to be accepted by
the Python side of the driver.  For example, to include a URL that indicates
the Python sqlite3 "timeout" and "check_same_thread" parameters, along with the
SQLite "mode" and "nolock" parameters, they can all be passed together on the
query string::

    e = create_engine(
        "sqlite:///file:path/to/database?"
        "check_same_thread=true&timeout=10&mode=ro&nolock=1&uri=true"
    )

Above, the pysqlite / sqlite3 DBAPI would be passed arguments as::

    sqlite3.connect(
        "file:path/to/database?mode=ro&nolock=1",
        check_same_thread=True,
        timeout=10,
        uri=True,
    )

Regarding future parameters added to either the Python or native drivers: new
parameter names added to the SQLite URI scheme should be automatically
accommodated by this scheme.  New parameter names added to the Python driver
side can be accommodated by specifying them in the
:paramref:`_sa.create_engine.connect_args` dictionary, until dialect support is
added by SQLAlchemy.  For the less likely case that the native SQLite driver
adds a new parameter name that overlaps with one of the existing, known Python
driver parameters (such as "timeout" perhaps), SQLAlchemy's dialect would
require adjustment for the URL scheme to continue to support this.

As is always the case for all SQLAlchemy dialects, the entire "URL" process can
be bypassed in :func:`_sa.create_engine` through the use of the
:paramref:`_sa.create_engine.creator` parameter, which allows for a custom
callable that creates a Python sqlite3 driver level connection directly.

.. versionadded:: 1.3.9

.. seealso::

    `Uniform Resource Identifiers <https://www.sqlite.org/uri.html>`_ - in the
    SQLite documentation

.. _pysqlite_regexp:

Regular Expression Support
---------------------------

.. versionadded:: 1.4

Support for the :meth:`_sql.ColumnOperators.regexp_match` operator is provided
using Python's re.search_ function.  SQLite itself does not include a working
regular expression operator; instead, it includes a non-implemented placeholder
operator ``REGEXP`` that calls a user-defined function that must be provided.

SQLAlchemy's implementation makes use of the pysqlite create_function_ hook
as follows::


    def regexp(a, b):
        return re.search(a, b) is not None


    sqlite_connection.create_function(
        "regexp",
        2,
        regexp,
    )

There is currently no support for regular expression flags as a separate
argument, as these are not supported by SQLite's REGEXP operator, however
these may be included inline within the regular expression string.  See
`Python regular expressions`_ for details.
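For example, an inline flag group such as ``(?i)`` may be embedded at the
start of the pattern to request a case-insensitive match.  The following is a
minimal sketch of that usage; the ``names`` table exists only for
illustration::

    from sqlalchemy import Column, MetaData, String, Table, create_engine
    from sqlalchemy import insert, select

    metadata = MetaData()
    names = Table("names", metadata, Column("name", String))

    engine = create_engine("sqlite://")
    metadata.create_all(engine)

    with engine.begin() as conn:
        conn.execute(insert(names), [{"name": "Alice"}, {"name": "bob"}])

        # the (?i) group embeds the case-insensitive flag within the pattern
        # itself, since no separate flags argument is available
        stmt = select(names.c.name).where(names.c.name.regexp_match(r"(?i)^al"))
        print(conn.execute(stmt).scalars().all())  # ['Alice']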
.. seealso::

    `Python regular expressions`_: Documentation for Python's regular
    expression syntax.

.. _create_function: https://docs.python.org/3/library/sqlite3.html#sqlite3.Connection.create_function

.. _re.search: https://docs.python.org/3/library/re.html#re.search

.. _Python regular expressions: https://docs.python.org/3/library/re.html#re.search


Compatibility with sqlite3 "native" date and datetime types
-------------------------------------------------------------

The pysqlite driver includes the sqlite3.PARSE_DECLTYPES and
sqlite3.PARSE_COLNAMES options, which have the effect that any column or
expression explicitly cast as "date" or "timestamp" will be converted to a
Python date or datetime object.  The date and datetime types provided with the
pysqlite dialect are not currently compatible with these options, since they
render the ISO date/datetime including microseconds, which pysqlite's driver
does not.  Additionally, SQLAlchemy does not at this time automatically render
the "cast" syntax required for the freestanding functions "current_timestamp"
and "current_date" to return datetime/date types natively.  Unfortunately,
pysqlite does not provide the standard DBAPI types in ``cursor.description``,
leaving SQLAlchemy with no way to detect these types on the fly without
expensive per-row type checks.

Keeping in mind that pysqlite's parsing option is not recommended, nor should
be necessary, for use with SQLAlchemy, usage of PARSE_DECLTYPES can be forced
if one configures "native_datetime=True" on create_engine()::

    engine = create_engine(
        "sqlite://",
        connect_args={
            "detect_types": sqlite3.PARSE_DECLTYPES | sqlite3.PARSE_COLNAMES
        },
        native_datetime=True,
    )

With this flag enabled, the DATE and TIMESTAMP types (but note - not the
DATETIME or TIME types...confused yet?) will not perform any bind parameter or
result processing.  Execution of "func.current_date()" will return a string.
"func.current_timestamp()" is registered as returning a DATETIME type in
SQLAlchemy, so this function still receives SQLAlchemy-level result processing.

.. _pysqlite_threading_pooling:

Threading/Pooling Behavior
---------------------------

The ``sqlite3`` DBAPI by default prohibits the use of a particular connection
in a thread which is not the one in which it was created.  As SQLite has
matured, its behavior under multiple threads has improved, and even includes
options for memory only databases to be used in multiple threads.

The thread prohibition is known as "check same thread" and may be controlled
using the ``sqlite3`` parameter ``check_same_thread``, which will disable or
enable this check.  SQLAlchemy's default behavior here is to set
``check_same_thread`` to ``False`` automatically whenever a file-based database
is in use, to establish compatibility with the default pool class
:class:`.QueuePool`.

The SQLAlchemy ``pysqlite`` DBAPI establishes the connection pool differently
based on the kind of SQLite database that's requested:

* When a ``:memory:`` SQLite database is specified, the dialect by default
  will use :class:`.SingletonThreadPool`.  This pool maintains a single
  connection per thread, so that all access to the engine within the current
  thread uses the same ``:memory:`` database - other threads would access a
  different ``:memory:`` database.  The ``check_same_thread`` parameter
  defaults to ``True``.
* When a file-based database is specified, the dialect will use
  :class:`.QueuePool` as the source of connections; at the same time, the
  ``check_same_thread`` flag is set to ``False`` by default unless overridden.
  Both defaults are illustrated in the sketch that follows this list.
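The following is a minimal sketch of these defaults, assuming SQLAlchemy 2.0
or greater; the file name used is arbitrary::

    from sqlalchemy import create_engine
    from sqlalchemy.pool import QueuePool, SingletonThreadPool

    # a ``:memory:`` database defaults to SingletonThreadPool
    memory_engine = create_engine("sqlite://")
    assert isinstance(memory_engine.pool, SingletonThreadPool)

    # a file-based database defaults to QueuePool, and the dialect also
    # passes check_same_thread=False to the sqlite3 driver
    file_engine = create_engine("sqlite:///some_file.db")
    assert isinstance(file_engine.pool, QueuePool)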
.. versionchanged:: 2.0

    SQLite file database engines now use :class:`.QueuePool` by default.
    Previously, :class:`.NullPool` was used.  The :class:`.NullPool` class
    may be used by specifying it via the
    :paramref:`_sa.create_engine.poolclass` parameter.

Disabling Connection Pooling for File Databases
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Pooling may be disabled for a file based database by specifying the
:class:`.NullPool` implementation for the
:paramref:`_sa.create_engine.poolclass` parameter::

    from sqlalchemy import NullPool

    engine = create_engine("sqlite:///myfile.db", poolclass=NullPool)

It's been observed that the :class:`.NullPool` implementation incurs an
extremely small performance overhead for repeated checkouts due to the lack of
connection re-use implemented by :class:`.QueuePool`.  However, it still may be
beneficial to use this class if the application is experiencing issues with
files being locked.

Using a Memory Database in Multiple Threads
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To use a ``:memory:`` database in a multithreaded scenario, the same connection
object must be shared among threads, since the database exists only within the
scope of that connection.  The :class:`.StaticPool` implementation will
maintain a single connection globally, and the ``check_same_thread`` flag can
be passed to Pysqlite as ``False``::

    from sqlalchemy.pool import StaticPool

    engine = create_engine(
        "sqlite://",
        connect_args={"check_same_thread": False},
        poolclass=StaticPool,
    )

Note that using a ``:memory:`` database in multiple threads requires a recent
version of SQLite.

Using Temporary Tables with SQLite
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Due to the way SQLite deals with temporary tables, if you wish to use a
temporary table in a file-based SQLite database across multiple checkouts from
the connection pool, such as when using an ORM :class:`.Session` where the
temporary table should continue to remain after :meth:`.Session.commit` or
:meth:`.Session.rollback` is called, a pool which maintains a single connection
must be used.  Use :class:`.SingletonThreadPool` if the scope is only needed
within the current thread, or :class:`.StaticPool` if the scope is needed
within multiple threads for this case::

    # maintain the same connection per thread
    from sqlalchemy.pool import SingletonThreadPool

    engine = create_engine("sqlite:///mydb.db", poolclass=SingletonThreadPool)


    # maintain the same connection across all threads
    from sqlalchemy.pool import StaticPool

    engine = create_engine("sqlite:///mydb.db", poolclass=StaticPool)

Note that :class:`.SingletonThreadPool` should be configured for the number of
threads that are to be used; beyond that number, connections will be closed out
in a non-deterministic way.
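The number of thread-local connections retained may be set using the
:paramref:`_sa.create_engine.pool_size` parameter, which is passed through to
the pool.  A brief sketch, where the size of ten is an arbitrary illustration::

    from sqlalchemy import create_engine
    from sqlalchemy.pool import SingletonThreadPool

    # retain up to ten thread-local connections before older ones are
    # closed out
    engine = create_engine(
        "sqlite:///mydb.db",
        poolclass=SingletonThreadPool,
        pool_size=10,
    )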
Dealing with Mixed String / Binary Columns
------------------------------------------------------

The SQLite database is weakly typed, and as such it is possible when using
binary values, which in Python are represented as ``b'some string'``, that a
particular SQLite database can have data values within different rows where
some of them will be returned as a ``b''`` value by the Pysqlite driver, and
others will be returned as Python strings, e.g. ``''`` values.  This situation
is not known to occur if the SQLAlchemy :class:`.LargeBinary` datatype is used
consistently; however, if a particular SQLite database has data that was
inserted using the Pysqlite driver directly, or when using the SQLAlchemy
:class:`.String` type which was later changed to :class:`.LargeBinary`, the
table will not be consistently readable because SQLAlchemy's
:class:`.LargeBinary` datatype does not handle strings so it has no way of
"encoding" a value that is in string format.

To deal with a SQLite table that has mixed string / binary data in the same
column, use a custom type that will check each row individually::

    from sqlalchemy import String
    from sqlalchemy import TypeDecorator


    class MixedBinary(TypeDecorator):
        impl = String
        cache_ok = True

        def process_result_value(self, value, dialect):
            if isinstance(value, str):
                value = bytes(value, "utf-8")
            elif value is not None:
                value = bytes(value)

            return value

Then use the above ``MixedBinary`` datatype in the place where
:class:`.LargeBinary` would normally be used.

.. _pysqlite_serializable:

Serializable isolation / Savepoints / Transactional DDL
---------------------------------------------------------

A newly revised version of this important section is now available at the top
level of the SQLAlchemy SQLite documentation, in the section
:ref:`sqlite_transactions`.

.. _pysqlite_udfs:

User-Defined Functions
----------------------

pysqlite supports a `create_function() <https://docs.python.org/3/library/sqlite3.html#sqlite3.Connection.create_function>`_
method that allows us to create our own user-defined functions (UDFs) in
Python and use them directly in SQLite queries.  These functions are registered
with a specific DBAPI Connection.

SQLAlchemy uses connection pooling with file-based SQLite databases, so we need
to ensure that the UDF is attached to the connection when it is created.
That is accomplished with an event listener::

    from sqlalchemy import create_engine
    from sqlalchemy import event
    from sqlalchemy import text


    def udf():
        return "udf-ok"


    engine = create_engine("sqlite:///./db_file")


    @event.listens_for(engine, "connect")
    def connect(conn, rec):
        conn.create_function("udf", 0, udf)


    for i in range(5):
        with engine.connect() as conn:
            print(conn.scalar(text("SELECT UDF()")))
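On Python 3.8 and above, with SQLite 3.8.3 or greater, the driver-level
``deterministic`` flag may also be passed so that SQLite is allowed to use the
function in additional contexts such as indexes.  The following is a sketch of
the same listener with that flag; the function and file names remain purely
illustrative::

    from sqlalchemy import create_engine
    from sqlalchemy import event


    def udf():
        return "udf-ok"


    engine = create_engine("sqlite:///./db_file")


    @event.listens_for(engine, "connect")
    def register_udf(dbapi_connection, connection_record):
        # deterministic=True is accepted by sqlite3.Connection.create_function
        # on Python 3.8+ when the underlying SQLite is 3.8.3 or greater
        dbapi_connection.create_function("udf", 0, udf, deterministic=True)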