The module contains a few objects and functions extending the minimum set of functionalities defined by the DB API 2.0.
The class usually returned by the connect() function. It is exposed by the extensions module in order to allow subclassing to extend its behaviour: the subclass should be passed to the connect() function using the connection_factory parameter. See also Connection and cursor factories.
Subclasses should have constructor signature (dsn, async=0).
For a complete description of the class, see connection.
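For example, a minimal sketch of a connection subclass, counting the cursors it has created (the MyConnection name and the test database are only illustrative):

import psycopg2
import psycopg2.extensions

class MyConnection(psycopg2.extensions.connection):
    """Hypothetical subclass keeping a tally of the cursors created."""
    def cursor(self, *args, **kwargs):
        self.cursor_count = getattr(self, 'cursor_count', 0) + 1
        return super(MyConnection, self).cursor(*args, **kwargs)

conn = psycopg2.connect("dbname=test", connection_factory=MyConnection)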
It is the class usually returned by the connection.cursor() method. It is exposed by the extensions module in order to allow subclassing to extend its behaviour: the subclass should be passed to the cursor() method using the cursor_factory parameter. See also Connection and cursor factories.
For a complete description of the class, see cursor.
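As a sketch (MyCursor and the test database are illustrative names), a cursor subclass recording the last query passed to execute():

import psycopg2
import psycopg2.extensions

class MyCursor(psycopg2.extensions.cursor):
    """Hypothetical subclass remembering the last executed query."""
    def execute(self, query, vars=None):
        self.last_query = query
        return super(MyCursor, self).execute(query, vars)

conn = psycopg2.connect("dbname=test")
cur = conn.cursor(cursor_factory=MyCursor)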
Wrapper for a PostgreSQL large object. See Access to PostgreSQL large objects for an overview.
The class can be subclassed: see connection.lobject() to learn how to specify a lobject subclass.
New in version 2.0.8.
Export the large object content to the file system.
The method uses the efficient lo_export() libpq function.
New in version 2.2.0.
Truncate the lobject to the given size.
The method will only be available if Psycopg has been built against libpq from PostgreSQL 8.3 or later and can only be used with PostgreSQL servers running these versions. It uses the lo_truncate() libpq function.
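A short sketch of both methods (the database name and the target file path are only illustrative):

import psycopg2

conn = psycopg2.connect("dbname=test")

# create a new large object and write some data into it
lobj = conn.lobject()
lobj.write("some binary data")

# dump the object content to a file using the lo_export() machinery
lobj.export("/tmp/lobject.dat")

# shrink the object (requires libpq and server version 8.3 or later)
lobj.truncate(4)

conn.commit()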
Register a callback function to block waiting for data.
The callback should have signature fun(conn) and is called to wait for data to become available whenever a blocking function from the libpq is called. Use set_wait_callback(None) to revert to the original behaviour (i.e. using blocking libpq functions).
The function is a hook to allow coroutine-based libraries (such as Eventlet or gevent) to switch when Psycopg is blocked, allowing other coroutines to run concurrently.
See wait_select() for an example of a wait callback implementation.
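A minimal blocking implementation, modelled on the select()-based loop shown for poll() below, might look like this sketch (a coroutine library would yield to its hub instead of calling select()):

import select
import psycopg2
import psycopg2.extensions

def wait_select(conn):
    # loop until the connection reports it is ready, waiting on its file
    # descriptor whenever poll() says reading or writing would block
    while True:
        state = conn.poll()
        if state == psycopg2.extensions.POLL_OK:
            break
        elif state == psycopg2.extensions.POLL_READ:
            select.select([conn.fileno()], [], [])
        elif state == psycopg2.extensions.POLL_WRITE:
            select.select([], [conn.fileno()], [])
        else:
            raise psycopg2.OperationalError("bad state from poll: %s" % state)

psycopg2.extensions.set_wait_callback(wait_select)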
New in version 2.2.0.
Return the currently registered wait callback.
Return None if no callback is currently registered.
New in version 2.2.0.
Psycopg provides a flexible system to adapt Python objects to the SQL syntax (inspired by PEP 246), allowing serialization in PostgreSQL. See Adapting new Python types to SQL syntax for a detailed description. The following objects deal with the adaptation of Python objects:
Return the SQL representation of obj as a string. Raise a ProgrammingError if it is not known how to adapt the object. In order to allow new objects to be adapted, register a new adapter for them using the register_adapter() function.
The function is the entry point of the adaptation mechanism: it can be used to write adapters for complex objects by recursively calling adapt() on its components.
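For instance, adapting a couple of built-in Python values in an interactive session:

>>> from psycopg2.extensions import adapt
>>> adapt(42).getquoted()
'42'
>>> adapt("O'Reilly").getquoted()
"'O''Reilly'"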
Register a new adapter for the objects of class class.
adapter should be a function taking a single argument (the object to adapt) and returning an object conforming to the ISQLQuote protocol (e.g. exposing a getquoted() method). The AsIs wrapper is often useful for this task.
Once an object is registered, it can be safely used in SQL queries and by the adapt() function.
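As a sketch, an adapter for a hypothetical Point class rendered as a PostgreSQL point literal:

from psycopg2.extensions import adapt, register_adapter, AsIs

class Point(object):
    """Hypothetical class to be adapted to the PostgreSQL 'point' type."""
    def __init__(self, x, y):
        self.x = x
        self.y = y

def adapt_point(point):
    # quote each coordinate through adapt(), then build a "(x, y)" literal
    x = adapt(point.x).getquoted()
    y = adapt(point.y).getquoted()
    return AsIs("'(%s, %s)'" % (x, y))

register_adapter(Point, adapt_point)

# after registration the object can be passed as a query parameter, e.g.
# cur.execute("INSERT INTO atable (apoint) VALUES (%s)", (Point(1.23, 4.56),))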
Represents the SQL adaptation protocol. Objects conforming to this protocol should implement a getquoted() method.
Adapters may subclass ISQLQuote, but it is not necessary: it is enough to expose a getquoted() method to be conformant.
Adapter conforming to the ISQLQuote protocol, useful for objects whose string representation is already valid as SQL representation.
Return the str() conversion of the wrapped object.
>>> AsIs(42).getquoted()
'42'
Adapter conforming to the ISQLQuote protocol for string-like objects.
Return the string enclosed in single quotes. Any single quote appearing in the string is escaped by doubling it according to SQL string constants syntax. Backslashes are escaped too.
>>> QuotedString(r"O'Reilly").getquoted()
"'O''Reilly'"
Adapter conforming to the ISQLQuote protocol for binary objects.
Return the string enclosed in single quotes. It performs the same escaping as the QuotedString adapter, plus it knows how to escape non-printable characters.
>>> Binary("\x00\x08\x0F").getquoted()
"'\\\\000\\\\010\\\\017'"
Changed in version 2.0.14: previously the adapter was not exposed by the extensions module. In older versions it can be imported from the implementation module psycopg2._psycopg.
These functions are used to manipulate type casters to convert from PostgreSQL types to Python objects. See Type casting of SQL types into Python objects for details.
Create a new type caster to convert from a PostgreSQL type to a Python object. The created object must be registered using register_type() to be used.
The object OID can be read from the cursor.description attribute or by querying the PostgreSQL catalog.
adapter should have signature fun(value, cur) where value is the string representation returned by PostgreSQL and cur is the cursor from which data are read. In case of NULL, value will be None. The adapter should return the converted object.
See Type casting of SQL types into Python objects for a usage example.
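As a sketch, a caster turning the built-in point type (OID 600 in the PostgreSQL catalog) into a Python tuple:

import psycopg2
import psycopg2.extensions

def cast_point(value, cur):
    # value is the text representation sent by PostgreSQL, e.g. "(1.2,3.4)",
    # or None for a NULL: return the converted Python object
    if value is None:
        return None
    x, y = value[1:-1].split(',')
    return (float(x), float(y))

POINT = psycopg2.extensions.new_type((600,), "POINT", cast_point)
psycopg2.extensions.register_type(POINT)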
Register a type caster created using new_type().
If scope is specified, it should be a connection or a cursor: the type caster will only be effective for the specified object. Otherwise it will be globally registered.
The module exports a few exceptions in addition to the standard ones defined by the DB API 2.0.
(subclasses OperationalError)
Error related to SQL query cancelation. It can be trapped specifically to detect a timeout.
New in version 2.0.7.
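For example, a sketch trapping the error raised when a statement exceeds statement_timeout (the database name is illustrative):

import psycopg2
import psycopg2.extensions

conn = psycopg2.connect("dbname=test")
cur = conn.cursor()

# ask the server to cancel any statement running longer than one second
cur.execute("SET statement_timeout TO 1000")

try:
    cur.execute("SELECT pg_sleep(10)")
except psycopg2.extensions.QueryCanceledError:
    # the query timed out: the failed transaction must be rolled back
    conn.rollback()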
(subclasses OperationalError)
Error causing transaction rollback (deadlocks, serialisation failures, etc.). It can be trapped specifically to detect a deadlock.
New in version 2.0.7.
Psycopg2 connection objects hold information about the PostgreSQL transaction isolation level. The current transaction isolation level can be read from the isolation_level attribute. The default isolation level is READ COMMITTED. A different isolation level can be set through the set_isolation_level() method. The level can be set to one of the following constants:
No transaction is started when commands are issued and no commit() or rollback() is required. Some PostgreSQL commands such as CREATE DATABASE or VACUUM can’t run inside a transaction: to run such commands use:
>>> conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)
See also Transactions control.
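A complete sketch (the test database is illustrative):

import psycopg2
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT

conn = psycopg2.connect("dbname=test")
conn.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)

cur = conn.cursor()
cur.execute("VACUUM")        # would fail inside a regular transaction block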
These values represent the possible status of a transaction: the current value can be read using the connection.get_transaction_status() method.
These values represent the possible status of a connection: the current value can be read from the status attribute.
It is possible to find the connection in statuses other than the ones shown below. These are the only states in which a working connection is expected to be found during the execution of regular Python client code: other states are for internal usage and Python code should not rely on them.
New in version 2.2.0.
These values can be returned by connection.poll() during asynchronous connection and communication. They match the values in the libpq enum PostgresPollingStatusType. See Asynchronous support and Support to coroutine libraries.
Some data is being read from the backend, but it is not available yet on the client and reading would block. Upon receiving this value, the client should wait for the connection file descriptor to be ready for reading. For example:
select.select([conn.fileno()], [], [])
Some data is being sent to the backend but the connection file descriptor can’t currently accept new data. Upon receiving this value, the client should wait for the connection file descriptor to be ready for writing. For example:
select.select([], [conn.fileno()], [])
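Putting the constants together, a sketch of establishing an asynchronous connection by polling it until the handshake completes (the database name is illustrative):

import select
import psycopg2
from psycopg2.extensions import POLL_OK, POLL_READ, POLL_WRITE

# async=1 requests an asynchronous connection
aconn = psycopg2.connect("dbname=test", async=1)
while True:
    state = aconn.poll()
    if state == POLL_OK:
        break                                      # connection established
    elif state == POLL_READ:
        select.select([aconn.fileno()], [], [])    # wait until readable
    elif state == POLL_WRITE:
        select.select([], [aconn.fileno()], [])    # wait until writable
    else:
        raise psycopg2.OperationalError("poll() returned %s" % state)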
The extensions module includes typecasters for many standard PostgreSQL types. These objects allow the conversion of returned data into Python objects. All the typecasters are automatically registered, except UNICODE and UNICODEARRAY: you can register them using register_type() in order to receive Unicode objects instead of strings from the database. See Unicode handling for details.
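For example, to receive unicode objects instead of byte strings for textual data:

import psycopg2
import psycopg2.extensions

psycopg2.extensions.register_type(psycopg2.extensions.UNICODE)
psycopg2.extensions.register_type(psycopg2.extensions.UNICODEARRAY)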
Changed in version 2.2.0: previously the DECIMAL typecaster and the specific time-related typecasters (PY* and MX*) were not exposed by the extensions module. In older versions they can be imported from the implementation module psycopg2._psycopg.