pycroft.model

This package contains basic functionality for database actions.

copyright
  © 2011 by AG DSN.

class UTCTZInfoFactory(offset)[source]

A tzinfo factory compatible with psycopg2.tz.FixedOffsetTimezone that checks whether the provided UTC offset is zero and returns datetime.timezone.utc. If the offset is not zero, a psycopg2.DataError is raised.

This class is implemented as a singleton that always returns the same instance.
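
The factory's contract can be sketched as follows (an illustrative stand-in, not the actual implementation; where psycopg2 would raise a DataError, this sketch raises a ValueError to stay self-contained):

```python
from datetime import timezone

# Illustrative sketch of the factory's contract: a zero UTC offset yields
# datetime.timezone.utc, any other offset is rejected.
def utc_tzinfo_factory(offset_minutes: int) -> timezone:
    if offset_minutes != 0:
        raise ValueError(f"only a UTC offset of zero is supported, got {offset_minutes}")
    return timezone.utc
```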

class UTCTZInfoCursorFactory(*args, **kwargs)[source]

A Cursor factory that sets the psycopg2.extensions.cursor.tzinfo_factory to UTCTZInfoFactory.

The C implementation of the cursor class does not use the proper Python attribute lookup, therefore we have to set the instance variable rather than use a class attribute.

create_engine(connection_string, **kwargs) Engine[source]
create_db_model(bind: Connection) None[source]

Create all models in the database.

drop_db_model(bind: Connection) None[source]

Drop all models from the database.

pycroft.model._all

Dummy module that collects all the mapped classes in one namespace. This is necessary for things like sqlalchemy-schemadisplay.

copyright
  © 2011 by AG DSN.

class Address(**kwargs)[source]

A known address.

Addresses differ from most other entities such as users or rooms in the following ways:

  • Their identity is provided by their value, i.e. if two addresses have equal values, they should be identical

  • Their existence is justified solely by the reference of another object. At no point in time should there be any unreferenced address records in the db.

  • They should be immutable: editing e.g. the street of a user’s address must not change the street of the corresponding room’s address. Consequently, addresses are stateless, i.e. they have no life cycle.

Establishing these consistencies requires triggers.
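
The value-based identity from the first bullet can be illustrated with a frozen dataclass (field names taken from the attribute list below; the example data is made up):

```python
from dataclasses import dataclass

# Value-based identity, illustrated: two addresses with equal field values
# compare equal, unlike entities identified by a surrogate id.
@dataclass(frozen=True)
class AddressValue:
    street: str
    number: str
    zip_code: str
    city: str

a = AddressValue("Musterstraße", "5", "01217", "Dresden")
b = AddressValue("Musterstraße", "5", "01217", "Dresden")
assert a == b  # equal values ⇒ same address
```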

street: Mapped[str]
number: Mapped[str]
addition: Mapped[str]
zip_code: Mapped[str]
city: Mapped[str]
state: Mapped[str]
country: Mapped[str]
rooms: Mapped[list[Room]]
inhabitants: Mapped[list[User]]
id: Mapped[int]

pycroft.model.base

This module contains the base classes for all models.

copyright
  © 2011 by AG DSN.

class ModelBase(**kwargs: Any)[source]

Base class for all database models.

type_annotation_map = {
    netaddr.ip.IPAddress: pycroft.model.types.IPAddress,
    netaddr.ip.IPNetwork: pycroft.model.types.IPNetwork,
    pycroft.helpers.utc.DateTimeTz: pycroft.model.types.DateTimeTz,
    typing.Annotated[str, 10]: pycroft.model.types.MACAddress,
    typing.Annotated[str, 255]: String(length=255),
    typing.Annotated[str, 40]: String(length=40),
    typing.Annotated[str, 50]: String(length=50),
}
classmethod get(*a, **kw)[source]

This is a shortcut for session.get(cls, *a, **kw)
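
The shorthand can be sketched like this (FakeSession is an illustrative stand-in for the scoped SQLAlchemy session):

```python
# Sketch of the `get` shorthand: the classmethod simply delegates to the
# (scoped) session. FakeSession stands in for sqlalchemy's Session.
class FakeSession:
    def __init__(self) -> None:
        self.store: dict = {}

    def get(self, cls, ident):
        return self.store.get((cls, ident))

session = FakeSession()

class ModelBase:
    @classmethod
    def get(cls, *a, **kw):
        return session.get(cls, *a, **kw)

class User(ModelBase):
    pass

session.store[(User, 1)] = "user #1"
assert User.get(1) == session.get(User, 1) == "user #1"
```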

metadata: ClassVar[MetaData] = MetaData()

Refers to the _schema.MetaData collection that will be used for new _schema.Table objects.

registry: ClassVar[_RegistryType] = <sqlalchemy.orm.decl_api.registry object>

Refers to the _orm.registry in use where new _orm.Mapper objects will be associated.

class MappedAsDataclass[source]

MappedAsDataclass, but with a metaclass which includes our custom metaclass.

This exists because the following does not work:

from sqlalchemy.orm import MappedAsDataclass
class Foo(MappedAsDataclass, ModelBase):
    ...

The reason is that MappedAsDataclass implements its functionality with its own metaclass. However, since classes can only have one metaclass, the metaclass of MappedAsDataclass subclasses DeclarativeMeta.

In our case, this is not sufficient, since our ModelBase uses a custom metaclass _ModelMeta for the (legacy) .q shorthand; to fix this, we create a new metaclass inheriting from both type(MappedAsDataclass) and _ModelMeta.
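
The conflict and its resolution can be reproduced with plain metaclasses (all names below are illustrative stand-ins for the SQLAlchemy and pycroft classes):

```python
# Stand-in classes reproducing the metaclass conflict described above.
class DeclarativeMeta(type): ...
class _ModelMeta(DeclarativeMeta): ...     # pycroft's custom metaclass
class DataclassMeta(DeclarativeMeta): ...  # stands in for type(MappedAsDataclass)

class SAMappedAsDataclass(metaclass=DataclassMeta): ...
class ModelBase(metaclass=_ModelMeta): ...

conflict = False
try:
    # Two sibling metaclasses -> Python raises a "metaclass conflict" TypeError:
    class Broken(SAMappedAsDataclass, ModelBase): ...
except TypeError:
    conflict = True

# The fix described above: a new metaclass inheriting from both.
class CombinedMeta(DataclassMeta, _ModelMeta): ...

class MappedAsDataclass(SAMappedAsDataclass, ModelBase, metaclass=CombinedMeta): ...
```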

class IntegerIdModel(**kwargs: Any)[source]

Abstract base class for database models with an Integer primary column, named id.

id: Mapped[int] = <sqlalchemy.orm.properties.MappedColumn object>

pycroft.model.config

class Config(**kwargs)[source]
member_group_id: Mapped[int]
member_group: Mapped[PropertyGroup]
network_access_group_id: Mapped[int]
network_access_group: Mapped[PropertyGroup]
violation_group_id: Mapped[int]
violation_group: Mapped[PropertyGroup]
external_group_id: Mapped[int]
external_group: Mapped[PropertyGroup]
blocked_group_id: Mapped[int]
blocked_group: Mapped[PropertyGroup]
caretaker_group_id: Mapped[int]
caretaker_group: Mapped[PropertyGroup]
treasurer_group_id: Mapped[int]
treasurer_group: Mapped[PropertyGroup]
pre_member_group_id: Mapped[int]
pre_member_group: Mapped[PropertyGroup]
traffic_limit_exceeded_group_id: Mapped[int]
traffic_limit_exceeded_group: Mapped[PropertyGroup]
payment_in_default_group_id: Mapped[int]
payment_in_default_group: Mapped[PropertyGroup]
membership_fee_account_id: Mapped[int]
membership_fee_account: Mapped[Account]
membership_fee_bank_account_id: Mapped[int]
membership_fee_bank_account: Mapped[BankAccount]
fints_product_id: Mapped[str | None]
id: Mapped[int]

pycroft.model.ddl

compile_if_clause(compiler: Compiled, clause: Any) Any[source]
class DropConstraint(element: Constraint, if_exists: bool = False, cascade: bool = False, **kw: Any)[source]

Extends SQLAlchemy’s DropConstraint with support for IF EXISTS

visit_drop_constraint(drop_constraint: DropConstraint, compiler: Compiled, **kw)[source]
class LazilyComiledDefDescriptor[source]
class Function(name: str, arguments: Iterable[str], rtype: str, definition: pycroft.model.ddl.LazilyComiledDefDescriptor = <pycroft.model.ddl.LazilyComiledDefDescriptor object>, volatility: Literal['volatile', 'stable', 'immutable'] = 'volatile', strict: bool = False, leakproof: bool = False, language: str = 'sql', quote_tag: str = '')[source]
name: str

Name of the function (excluding arguments).

arguments: Iterable[str]

Arguments of the function. A function identifier of new_function(integer, integer) would result in arguments=['integer', 'integer'].

rtype: str

Return type

definition: LazilyComiledDefDescriptor = None

Definition

volatility: Literal['volatile', 'stable', 'immutable'] = 'volatile'
strict: bool = False

Function should be declared STRICT

leakproof: bool = False

Function should be declared LEAKPROOF

language: str = 'sql'

Language the function is defined in (e.g. plpgsql for trigger functions)

quote_tag: str = ''

Dollar quote tag to enclose the function definition

build_quoted_identifier(quoter: Callable[[str], str]) str[source]

Compile the function identifier from name and arguments.

Parameters

quoter – A callable that quotes the function name

Returns

The compiled string, like "my_function_name"(integer, account_type)
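The behaviour described above can be sketched as follows (an illustrative re-implementation, not the actual method):

```python
# Sketch of identifier compilation: quote the name, append the raw arguments.
def build_quoted_identifier(name, arguments, quoter) -> str:
    return f"{quoter(name)}({', '.join(arguments)})"

ident = build_quoted_identifier(
    "my_function_name", ["integer", "account_type"], quoter=lambda n: f'"{n}"'
)
assert ident == '"my_function_name"(integer, account_type)'
```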

class CreateFunction(func: Function, or_replace: bool = False)[source]

Represents a CREATE FUNCTION DDL statement

func: Function
property function: Function
or_replace: bool = False
class DropFunction(func: Function, if_exists: bool = False, cascade: bool = False)[source]

Represents a DROP FUNCTION DDL statement

visit_create_function(element: CreateFunction, compiler: Compiled, **kw: Any) str[source]

Compile a CREATE FUNCTION DDL statement for PostgreSQL

visit_drop_function(element: DropFunction, compiler: Compiled, **kw: Any) str[source]

Compile a DROP FUNCTION DDL statement for PostgreSQL

class Rule(name: str, table: sqlalchemy.sql.schema.Table, event: str, command_or_commands: str | Sequence[str], condition: str | None = None, do_instead: bool = False)[source]
name: str
table: Table
event: str
command_or_commands: str | Sequence[str]
commands: tuple[str, ...]
condition: str | None = None
do_instead: bool = False
class CreateRule(rule: Rule, or_replace: bool = False)[source]

Represents a CREATE RULE DDL statement

rule: Rule
or_replace: bool = False
class DropRule(rule: Rule, if_exists: bool = False, cascade: bool = False)[source]

Represents a DROP RULE DDL statement

rule: Rule
if_exists: bool = False
cascade: bool = False
visit_create_rule(element: CreateRule, compiler: Compiled, **kw: Any) str[source]

Compile a CREATE RULE DDL statement for PostgreSQL.

visit_drop_rule(element: DropRule, compiler: Compiled, **kw: Any) str[source]

Compile a DROP RULE DDL statement for PostgreSQL

class Trigger(name: str, table: sqlalchemy.sql.schema.Table, events: Sequence[str], function_call: str, when: Literal['BEFORE', 'AFTER', 'INSTEAD OF'] = 'AFTER')[source]
name: str

Name of the trigger

table: Table

Table the trigger is for

events: Sequence[str]

list of events (INSERT, UPDATE, DELETE)

function_call: str

call of the trigger function

when: Literal['BEFORE', 'AFTER', 'INSTEAD OF'] = 'AFTER'

Mode of execution
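
The fields above map onto PostgreSQL's CREATE TRIGGER syntax roughly like this (an illustrative rendering of the general shape, not necessarily pycroft's exact compiler output):

```python
# Illustrative rendering of a Trigger's fields into PostgreSQL DDL.
def render_trigger(name, table, events, function_call, when="AFTER") -> str:
    return (
        f'CREATE TRIGGER "{name}" {when} {" OR ".join(events)} '
        f"ON {table} FOR EACH ROW EXECUTE PROCEDURE {function_call}"
    )

ddl = render_trigger("account_check", "account", ["INSERT", "UPDATE"], "check_account()")
```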

class ConstraintTrigger(name: str, table: sqlalchemy.sql.schema.Table, events: Sequence[str], function_call: str, when: Literal['BEFORE', 'AFTER', 'INSTEAD OF'] = 'AFTER', deferrable: bool = False, initially_deferred: bool = False)[source]
deferrable: bool = False

Constraint can be deferred

initially_deferred: bool = False

Constraint is set to INITIALLY DEFERRED

class CreateTrigger(trigger: pycroft.model.ddl.Trigger)[source]
trigger: Trigger
class CreateConstraintTrigger(constraint_trigger: ConstraintTrigger)[source]

Represents a CREATE CONSTRAINT TRIGGER DDL statement

constraint_trigger: ConstraintTrigger
class DropTrigger(trigger: Trigger, if_exists: bool = False, cascade: bool = False)[source]

Represents a DROP TRIGGER DDL statement.

trigger: Trigger
if_exists: bool = False
cascade: bool = False
create_add_constraint_trigger(element: CreateConstraintTrigger, compiler: Compiled, **kw: Any) str[source]

Compile a CREATE CONSTRAINT TRIGGER DDL statement for PostgreSQL

create_add_trigger(element: CreateTrigger, compiler: Compiled, **kw: Any) str[source]

Compile a CREATE TRIGGER DDL statement for PostgreSQL

visit_drop_trigger(element: DropTrigger, compiler: Compiled, **kw: Any) str[source]

Compile a DROP TRIGGER DDL statement for PostgreSQL

class View(name: str, query: sqlalchemy.sql.selectable.SelectBase, column_names: Sequence[str] = None, temporary: bool = False, view_options: Mapping[str, Any] = None, check_option: Literal['local', 'cascaded'] | None = None, materialized: bool = False)[source]
name: str

The name of the view

query: SelectBase

the query it represents

column_names: Sequence[str] = None
temporary: bool = False
view_options: Mapping[str, Any] = None

Must be something that can be passed to OrderedDict, so a simple dict suffices.

check_option: Literal['local', 'cascaded'] | None = None

Must be one of None, 'local', 'cascaded'.

materialized: bool = False

Is materialized view

table: Table
refresh(concurrently=False)[source]

Refreshes the current materialized view

class CreateView(view: pycroft.model.ddl.View, or_replace: bool = False, if_not_exists: bool = False)[source]
view: View
or_replace: bool = False
if_not_exists: bool = False
class DropView(view: pycroft.model.ddl.View, if_exists: bool = False, cascade: bool = False)[source]
view: View
if_exists: bool = False
cascade: bool = False
visit_create_view(element: CreateView, compiler: Compiled, **kw: Any) str[source]
visit_drop_view(element: DropView, compiler: Compiled, **kw: Any) str[source]
class DDLManager[source]

Ensures that create DDL statements are registered with SQLAlchemy in the order they were added to the manager and registers the drop DDL statements in the reverse order.

Example usage:

>>> from sqlalchemy import MetaData, Table, Column as C, Integer as I, String as S
>>> table = Table('table', MetaData(), C('id', I, primary_key=True), C('name', S))
>>> manager = DDLManager()
>>> view = View('my_view', "select concat(name, ' hat das Spiel verloren') from table")
>>> manager.add_view(table, view)
>>> # … do other stuff
>>> manager.register()
add(target: Table, create_ddl: ExecutableDDLElement, drop_ddl: ExecutableDDLElement, dialect: str | None = None)[source]
add_constraint(table: Table, constraint: Constraint, dialect: str | None = None) None[source]
add_function(table: Table, func: Function, dialect: str | None = None) None[source]
add_rule(table: Table, rule: Rule, dialect: str | None = None) None[source]
add_trigger(table: Table, trigger: Trigger, dialect: str | None = None) None[source]
add_constraint_trigger(table: Table, constraint_trigger: ConstraintTrigger, dialect: str | None = None) None[source]
add_view(table: Table, view: View, dialect: str | None = None, or_replace: bool = True, if_not_exists: bool = True) None[source]
register() None[source]

pycroft.model.exc

exception PycroftModelException[source]

pycroft.model.facilities

class Site(**kwargs)[source]
name: Mapped[str]
buildings: Mapped[list[Building]]
id: Mapped[int]
class Building(**kwargs)[source]
site_id: Mapped[int]
site: Mapped[Site]
number: Mapped[str]
short_name: Mapped[str]
street: Mapped[str]
wifi_available: Mapped[bool]
fee_account_id: Mapped[int]
fee_account: Mapped[Account]
swdd_haus_id: Mapped[int | None]
rooms: Mapped[list[Room]]
property street_and_number
id: Mapped[int]
class Room(**kwargs)[source]
number: Mapped[str]
level: Mapped[int]
inhabitable: Mapped[bool]
building_id: Mapped[int]
building: Mapped[Building]
address_id: Mapped[int]
address: Mapped[Address]
swdd_vo_suchname: Mapped[str | None]
connected_patch_ports: Mapped[list[PatchPort]]
users_sharing_address: Mapped[list[User]]
users: Mapped[list[User]]
hosts: Mapped[list[Host]]
room_history_entries: Mapped[list[RoomHistoryEntry]]
log_entries: Mapped[list[RoomLogEntry]]
patch_ports: Mapped[list[PatchPort]]
tenancies: Mapped[list[Tenancy]]
property short_name
property level_and_number
property is_switch_room
property latest_log_entry: RoomLogEntry | None
id: Mapped[int]

pycroft.model.finance

class MembershipFee(**kwargs)[source]
name: Mapped[str]
regular_fee: Mapped[int]
booking_begin: Mapped[timedelta]
booking_end: Mapped[timedelta]
payment_deadline: Mapped[timedelta]
payment_deadline_final: Mapped[timedelta]
begins_on: Mapped[date]
ends_on: Mapped[date]
id: Mapped[int]
class Semester(**kwargs)[source]
name: Mapped[str]
registration_fee: Mapped[int]
regular_semester_fee: Mapped[int]
reduced_semester_fee: Mapped[int]
late_fee: Mapped[int]
grace_period: Mapped[timedelta]
reduced_semester_fee_threshold: Mapped[timedelta]
payment_deadline: Mapped[timedelta]
allowed_overdraft: Mapped[int]
begins_on: Mapped[date]
ends_on: Mapped[date]
id: Mapped[int]
class Account(**kwargs)[source]
name: Mapped[str127]
type: Mapped[AccountType]
legacy: Mapped[bool]
splits: Mapped[list[Split]]
user: Mapped[User | None]
building: Mapped[Building | None]
patterns: Mapped[list[AccountPattern]]
transactions: Mapped[list[Transaction]]
balance
property in_default_days
id: Mapped[int]
class AccountPattern(**kwargs)[source]
pattern: Mapped[str]
account_id: Mapped[int]
account: Mapped[Account]
id: Mapped[int]
class Transaction(**kwargs)[source]
description: Mapped[str]
author_id: Mapped[int | None]
author: Mapped[User | None]
posted_at: Mapped[datetime_tz_onupdate]
valid_on: Mapped[date]
confirmed: Mapped[bool]
splits: Mapped[list[Split]]
bank_account_activities: Mapped[list[BankAccountActivity]]
accounts: Mapped[list[Account]]
property amount
property is_balanced
property is_simple
id: Mapped[int]
class Split(**kwargs)[source]
amount: Mapped[int]
account_id: Mapped[int]
account: Mapped[Account]
transaction_id: Mapped[int]
transaction: Mapped[Transaction]
bank_account_activity: Mapped[BankAccountActivity | None]
id: Mapped[int]
exception IllegalTransactionError[source]

Indicates an attempt to persist an illegal Transaction.

check_transaction_on_save(mapper, connection, target)[source]

Check transaction constraints.

Transactions must be balanced, no account may be referenced by more than one split, and a transaction must consist of at least two splits.

Raises

IllegalTransactionError – if the transaction contains errors
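
The constraints can be sketched on plain dicts standing in for Split objects (illustrative only; the real check runs inside an ORM event hook):

```python
# Sketch of the transaction constraints named above, using plain dicts.
def check_transaction(splits) -> None:
    if len(splits) < 2:
        raise ValueError("transaction must consist of at least two splits")
    accounts = [s["account_id"] for s in splits]
    if len(set(accounts)) != len(accounts):
        raise ValueError("account referenced by more than one split")
    if sum(s["amount"] for s in splits) != 0:
        raise ValueError("transaction is not balanced")

# A balanced two-split transaction passes:
check_transaction([{"account_id": 1, "amount": 350}, {"account_id": 2, "amount": -350}])
```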

check_split_on_update(mapper, connection, target)[source]
class BankAccount(**kwargs)[source]
name: Mapped[str255]
bank: Mapped[str255]
owner: Mapped[str255]
account_number: Mapped[str]
routing_number: Mapped[str]
iban: Mapped[str]
bic: Mapped[str]
fints_endpoint: Mapped[str]
account_id: Mapped[int]
account: Mapped[Account]
activities: Mapped[list[BankAccountActivity]]
mt940_errors: Mapped[list[MT940Error]]
balance
property last_imported_at: DateTimeTz
id: Mapped[int]
class BankAccountActivity(**kwargs)[source]
bank_account_id: Mapped[int]
bank_account: Mapped[BankAccount]
amount: Mapped[int]
reference: Mapped[str]
other_account_number: Mapped[str255]
other_routing_number: Mapped[str255]
other_name: Mapped[str255]
imported_at: Mapped[utc.DateTimeTz]
posted_on: Mapped[date]
valid_on: Mapped[date]
transaction_id: Mapped[int | None]
transaction: Mapped[Transaction | None]
account_id: Mapped[int | None]
account: Mapped[Account | None]
split: Mapped[Split]
matching_patterns: Mapped[list[AccountPattern]]
id: Mapped[int]
class MT940Error(**kwargs)[source]
mt940: Mapped[str]
exception: Mapped[str]
author_id: Mapped[int]
author: Mapped[User]
imported_at: Mapped[datetime_tz_onupdate]
bank_account: Mapped[BankAccount]
bank_account_id: Mapped[int]
id: Mapped[int]

pycroft.model.functions

Add support for various functions present in Postgres to the SQLite SQLAlchemy dialect.

class greatest(*clauses: _ColumnExpressionOrLiteralArgument[Any])[source]
inherit_cache: bool | None = True

Indicate if this HasCacheKey instance should make use of the cache key generation scheme used by its immediate superclass.

The attribute defaults to None, which indicates that a construct has not yet taken into account whether or not it’s appropriate for it to participate in caching; this is functionally equivalent to setting the value to False, except that a warning is also emitted.

This flag can be set to True on a particular class, if the SQL that corresponds to the object does not change based on attributes which are local to this class, and not its superclass.

See also

Enabling Caching Support for Custom Constructs - General guidelines for setting the HasCacheKey.inherit_cache attribute for third-party or user defined SQL constructs.

type: TypeEngine[_T] = Numeric()
name = 'greatest'
class least(*clauses: _ColumnExpressionOrLiteralArgument[Any])[source]
inherit_cache: bool | None = True

Indicate if this HasCacheKey instance should make use of the cache key generation scheme used by its immediate superclass.

The attribute defaults to None, which indicates that a construct has not yet taken into account whether or not it’s appropriate for it to participate in caching; this is functionally equivalent to setting the value to False, except that a warning is also emitted.

This flag can be set to True on a particular class, if the SQL that corresponds to the object does not change based on attributes which are local to this class, and not its superclass.

See also

Enabling Caching Support for Custom Constructs - General guidelines for setting the HasCacheKey.inherit_cache attribute for third-party or user defined SQL constructs.

type: TypeEngine[_T] = Numeric()
name = 'least'
class sign(*clauses: _ColumnExpressionOrLiteralArgument[Any])[source]
inherit_cache: bool | None = True

Indicate if this HasCacheKey instance should make use of the cache key generation scheme used by its immediate superclass.

The attribute defaults to None, which indicates that a construct has not yet taken into account whether or not it’s appropriate for it to participate in caching; this is functionally equivalent to setting the value to False, except that a warning is also emitted.

This flag can be set to True on a particular class, if the SQL that corresponds to the object does not change based on attributes which are local to this class, and not its superclass.

See also

Enabling Caching Support for Custom Constructs - General guidelines for setting the HasCacheKey.inherit_cache attribute for third-party or user defined SQL constructs.

type: TypeEngine[_T] = Integer()
name = 'sign'
compile_default_function(element, compiler, **kw)[source]
compile_sqlite_greatest(element, compiler, **kw)[source]
compile_sqlite_least(element, compiler, **kw)[source]
compile_sqlite_sign(element, compiler, **kw)[source]
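
The semantics these compilation hooks provide to SQLite can be sketched in plain Python (note that Postgres' GREATEST/LEAST ignore NULL arguments and return NULL only if all arguments are NULL):

```python
# Plain-Python semantics of the three Postgres functions taught to SQLite.
def greatest(*vals):
    vals = [v for v in vals if v is not None]  # GREATEST ignores NULLs
    return max(vals) if vals else None

def least(*vals):
    vals = [v for v in vals if v is not None]  # LEAST ignores NULLs
    return min(vals) if vals else None

def sign(x):
    return (x > 0) - (x < 0)  # -1, 0, or 1
```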

pycroft.model.hades

pycroft.model.host

class Host(**kwargs)[source]
name: Mapped[str | None]
owner_id: Mapped[int | None]
owner: Mapped[User]
room_id: Mapped[int | None]
room: Mapped[Room | None]
interfaces: Mapped[list[Interface]]
ips: Mapped[list[IP]]
switch: Mapped[Switch | None]
id: Mapped[int]
class Switch(**kwargs)[source]

A switch with a name and management IP

A Switch is directly tied to a Host because instead of having an id column, the primary key is host_id, a foreign key on a Host.

host_id: Mapped[int]
host: Mapped[Host]
management_ip: Mapped[netaddr.IPAddress]
ports: Mapped[list[SwitchPort]]
exception MulticastFlagException[source]
message = 'Multicast bit set in MAC address'
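
The multicast flag is the least significant bit of the first MAC octet; a minimal check looks like this (an illustrative sketch, not pycroft's actual validator):

```python
# The multicast flag is the least significant bit of the first MAC octet.
def is_multicast(mac: str) -> bool:
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 1)

assert is_multicast("01:00:5e:00:00:01")      # multicast
assert not is_multicast("00:de:ad:be:ef:00")  # unicast
```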
exception TypeMismatch[source]
class Interface(**kwargs)[source]

A logical network interface (hence the single MAC address).

This means many net interfaces can be connected to the same switch port.

It has to be bound to a UserHost, not another kind of host (like a Switch).

name: Mapped[str | None]
mac: Mapped[mac_address]
host_id: Mapped[int]
host: Mapped[Host]
ips: Mapped[list[IP]]
validate_mac(_, mac_address)[source]
id: Mapped[int]
class SwitchPort(**kwargs)[source]
switch_id: Mapped[int]
switch: Mapped[Switch]
name: Mapped[str]
default_vlans: Mapped[list[VLAN]]

These are the VLANs that should theoretically be available at this switch port. It is only used to calculate the pool of IPs to choose from e.g. when adding a user or migrating a host, and does not influence any functionality beyond that.

patch_port: Mapped[PatchPort | None]
id: Mapped[int]
class IP(**kwargs)[source]
address: Mapped[netaddr.IPAddress]
interface_id: Mapped[int]
interface: Mapped[Interface]
subnet_id: Mapped[int]
subnet: Mapped[Subnet]
traffic_volumes: Mapped[list[TrafficVolume]]
host: Mapped[Host]
validate_subnet(_, value)[source]
validate_address(_, value)[source]
id: Mapped[int]

pycroft.model.logging

This module contains the classes LogEntry, TaskLogEntry, UserLogEntry, and RoomLogEntry.

copyright
  © 2011 by AG DSN.

class LogEntry(**kwargs)[source]
discriminator: Mapped[str50 | None]
message: Mapped[str]
created_at: Mapped[datetime_tz]
author_id: Mapped[int]
author: Mapped[User]
id: Mapped[int]
class TaskLogEntry(**kwargs)[source]
id: Mapped[int]
task_id: Mapped[int]
task: Mapped[Task]
user: Mapped[User]
author: Mapped[User]
author_id: Mapped[int]
created_at: Mapped[datetime_tz]
discriminator: Mapped[str50 | None]
message: Mapped[str]
class UserLogEntry(**kwargs)[source]
id: Mapped[int]
user_id: Mapped[int]
user: Mapped[User]
author: Mapped[User]
author_id: Mapped[int]
created_at: Mapped[datetime_tz]
discriminator: Mapped[str50 | None]
message: Mapped[str]
class RoomLogEntry(**kwargs)[source]
id: Mapped[int]
room_id: Mapped[int]
author: Mapped[User]
author_id: Mapped[int]
created_at: Mapped[datetime_tz]
discriminator: Mapped[str50 | None]
message: Mapped[str]
room: Mapped[Room]

pycroft.model.net

class VLAN(**kwargs)[source]
name: Mapped[str127]
vid: Mapped[int]
switch_ports
subnets
id: Mapped[int]
class Subnet(**kwargs)[source]
address: Mapped[netaddr.IPNetwork]
gateway: Mapped[netaddr.IPAddress | None]
reserved_addresses_bottom: Mapped[int]
reserved_addresses_top: Mapped[int]
description: Mapped[str50 | None]
vlan_id: Mapped[int]
vlan: Mapped[VLAN]
ips: Mapped[list[IP]]
property reserved_ipset: IPSet
reserved_ip_ranges_iter() Iterator[IPRange][source]
property usable_ip_range: IPRange | None

All IPs in this subnet which are not reserved.

property usable_size: int

The number of IPs in this subnet which are not reserved.

unused_ips_iter() Iterator[IPAddress][source]
id: Mapped[int]
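
The reserved/usable split can be sketched with the stdlib ipaddress module (pycroft itself uses netaddr; the exact counting of network/broadcast addresses is an assumption of this sketch):

```python
import ipaddress

# Sketch: reserved_addresses_bottom/top cut that many host addresses off
# each end of the subnet; the rest is the usable range.
net = ipaddress.ip_network("192.0.2.0/24")
reserved_bottom, reserved_top = 10, 5

hosts = list(net.hosts())  # .1 through .254 for a /24
usable = hosts[reserved_bottom:len(hosts) - reserved_top]
```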

pycroft.model.port

class PatchPort(**kwargs)[source]

A patch panel port that may or may not be connected to a switch

id: Mapped[int]
name
switch_port_id: Mapped[int | None]
switch_port: Mapped[SwitchPort]
room_id: Mapped[int]
room: Mapped[Room]
switch_room_id: Mapped[int]
switch_room: Mapped[Room]

pycroft.model.property

This module contains model descriptions concerning properties, groups, and memberships.

evaluate_properties(when: datetime | None = None, name='properties') TableValuedAlias[source]

A sqlalchemy func wrapper for the evaluate_properties PSQL function.

See sqlalchemy.sql.selectable.FromClause.table_valued.
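
The SQL such a table-valued wrapper stands for looks roughly like this, built here as a plain string for illustration (the column list matches CurrentProperty; the real helper returns a TableValuedAlias, not a string):

```python
# Rough shape of the SQL generated when selecting from the table-valued alias.
def evaluate_properties_sql(when: str = "CURRENT_TIMESTAMP", name: str = "properties") -> str:
    return (
        f"SELECT {name}.user_id, {name}.property_name, {name}.denied "
        f"FROM evaluate_properties({when}) AS {name}"
    )

sql = evaluate_properties_sql()
```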

class CurrentProperty(**kwargs)[source]
user_id: Mapped[int]
property_name: Mapped[str]
denied: Mapped[bool]

pycroft.model.session

This module contains the session handling for database actions.

copyright
  © 2011 by AG DSN.

class NullScopedSession[source]
remove()[source]
set_scoped_session(scoped_session: scoped_session[Session]) None[source]
with_transaction(wrapped: F) F[source]
utcnow() DateTimeTz[source]
current_timestamp() AnsiFunction[DateTimeTz][source]

pycroft.model.swdd

class TenancyStatus(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)[source]
PROVISIONAL = 1
ESTABLISHED = 2
UNDO_PROVISIONAL = 3
UNDO_FINAL = 4
CANCELED = 5
class RentalObject(**kwargs)[source]
vo_id: Mapped[int]
suchname: Mapped[str]
name: Mapped[str]
voart_id: Mapped[int | None]
nutzungsart_id: Mapped[int | None]
nutzbarvon: Mapped[date]
nutzbarbis: Mapped[date]
status: Mapped[int | None]
wohneim_id: Mapped[int | None]
wohneim_suchname: Mapped[int | None]
wohneim_name: Mapped[str]
stockwerk_id: Mapped[int | None]
stockwerk: Mapped[str]
stockwerk_name: Mapped[str]
haus_id: Mapped[int | None]
haus_name: Mapped[str]
class Tenancy(**kwargs)[source]
persvv_id: Mapped[int]
person_id: Mapped[int | None]
user: Mapped[User | None]
pre_member: Mapped[PreMember | None]
vo_suchname: Mapped[str]
room: Mapped[Room]
person_hash: Mapped[str]
mietbeginn: Mapped[date]
mietende: Mapped[date]
status_id: Mapped[int]
property status

pycroft.model.task

class TaskType(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)[source]
USER_MOVE_OUT = 1
USER_MOVE_IN = 2
USER_MOVE = 3
class TaskStatus(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)[source]
OPEN = 1
EXECUTED = 2
FAILED = 3
CANCELLED = 4
class Task(**kwargs)[source]

The task model

The task model needs to hold three types of data:

  • Metadata (creation, status, …)

  • A type (e.g. USER_MOVE)

  • the parameters_json json dict.

The parameters should be accessed via the parameters property, as it already takes care of validation and (de-)serialization. The type field is essentially only needed for filtering in a query.
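
The lazy deserialization can be sketched as follows (illustrative only; the real Task validates through marshmallow schemas rather than raw json.loads):

```python
import json

# Minimal sketch of a lazily deserialized `parameters` property.
class TaskSketch:
    def __init__(self, parameters_json: str) -> None:
        self.parameters_json = parameters_json
        self._parameters = None

    @property
    def parameters(self):
        if self._parameters is None:  # deserialize once, on first access
            self._parameters = json.loads(self.parameters_json)
        return self._parameters

task = TaskSketch('{"room_number": "12", "level": 1, "building_id": 3}')
assert task.parameters["level"] == 1
```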

discriminator: Mapped[str50]
type: Mapped[TaskType]
due: Mapped[utc.DateTimeTz]
parameters_json: Mapped[t.Any]
created: Mapped[utc.DateTimeTz]
creator_id: Mapped[int]
creator: Mapped[User]
status: Mapped[TaskStatus]
errors: Mapped[t.Any | None]
log_entries: Mapped[list[TaskLogEntry]]
property schema: type[Schema]
property parameters: TParams

(Lazily) deserialized dict corresponding to the parameters.

The deserialization happens according to what schema is referenced in self.schema.

property latest_log_entry: TaskLogEntry | None
id: Mapped[int]
class UserTask(**kwargs)[source]
id: Mapped[int]
user_id: Mapped[int]
user: Mapped[User]
created: Mapped[utc.DateTimeTz]
creator: Mapped[User]
creator_id: Mapped[int]
discriminator: Mapped[str50]
due: Mapped[utc.DateTimeTz]
errors: Mapped[t.Any | None]
log_entries: Mapped[list[TaskLogEntry]]
parameters_json: Mapped[t.Any]
status: Mapped[TaskStatus]
type: Mapped[TaskType]

pycroft.model.task_serialization

class TaskParams[source]
handle_validation_error(wrapper=None, enabled=None, adapter=None, proxy=<class 'FunctionWrapper'>) T[source]
class UserMoveOutSchema(*, only: Sequence[str] | AbstractSet[str] | None = None, exclude: Sequence[str] | AbstractSet[str] = (), many: bool = False, context: dict | None = None, load_only: Sequence[str] | AbstractSet[str] = (), dump_only: Sequence[str] | AbstractSet[str] = (), partial: bool | Sequence[str] | AbstractSet[str] | None = None, unknown: str | None = None)[source]
build(data, **kwargs)[source]
opts: SchemaOpts = <marshmallow.schema.SchemaOpts object>
class UserMoveOutParams(comment: str, end_membership: bool)[source]
comment: str
end_membership: bool
class UserMoveSchema(*, only: Sequence[str] | AbstractSet[str] | None = None, exclude: Sequence[str] | AbstractSet[str] = (), many: bool = False, context: dict | None = None, load_only: Sequence[str] | AbstractSet[str] = (), dump_only: Sequence[str] | AbstractSet[str] = (), partial: bool | Sequence[str] | AbstractSet[str] | None = None, unknown: str | None = None)[source]
build(data, **kwargs)[source]
opts: SchemaOpts = <marshmallow.schema.SchemaOpts object>
class UserMoveParams(room_number: str, level: int, building_id: int, comment: str | None = None)[source]
room_number: str
level: int
building_id: int
comment: str | None = None
class UserMoveInSchema(*, only: Sequence[str] | AbstractSet[str] | None = None, exclude: Sequence[str] | AbstractSet[str] = (), many: bool = False, context: dict | None = None, load_only: Sequence[str] | AbstractSet[str] = (), dump_only: Sequence[str] | AbstractSet[str] = (), partial: bool | Sequence[str] | AbstractSet[str] | None = None, unknown: str | None = None)[source]
build(data, **kwargs)[source]
opts: SchemaOpts = <marshmallow.schema.SchemaOpts object>
class UserMoveInParams(room_number: str, level: int, building_id: int, mac: str | None = None, birthdate: datetime.date | None = None, begin_membership: bool = True, host_annex: bool = False)[source]
room_number: str
level: int
building_id: int
mac: str | None = None
birthdate: date | None = None
begin_membership: bool = True
host_annex: bool = False

pycroft.model.traffic

class TrafficVolume(**kwargs)[source]
timestamp: Mapped[DateTimeTz]
amount: Mapped[int]
type: Mapped[Literal['Ingress', 'Egress']]
ip_id: Mapped[int]
ip: Mapped[IP]
user_id: Mapped[int]
user: Mapped[User]
packets: Mapped[int]
traffic_history_query()[source]
traffic_history(user_id: int, start: DateTimeTz | ColumnElement[DateTimeTz], end: DateTimeTz | ColumnElement[DateTimeTz], name='traffic_history') TableValuedAlias[source]

A sqlalchemy func wrapper for the traffic_history PSQL function.

See sqlalchemy.sql.selectable.FromClause.table_valued.

class TrafficHistoryEntry(timestamp: DateTimeTz, ingress: int | None, egress: int | None)[source]

pycroft.model.types

class IPAddress(*args: Any, **kwargs: Any)[source]
cache_ok: bool | None = True
python_type()[source]
process_result_value(value, dialect)[source]
class IPNetwork(*args: Any, **kwargs: Any)[source]
cache_ok: bool | None = True
python_type()[source]
process_result_value(value, dialect)[source]
class MACAddress(*args: Any, **kwargs: Any)[source]
impl: TypeEngine[Any] | Type[TypeEngine[Any]] = String(length=10)
cache_ok: bool | None = True
load_dialect_impl(dialect)[source]
process_result_value(value, dialect)[source]
process_bind_param(value, dialect)[source]
python_type()[source]
class Money(*args: Any, **kwargs: Any)[source]
impl

alias of Integer

cache_ok: bool | None = True
python_type()[source]
static process_bind_param(value, dialect)[source]
static process_result_value(value, dialect)[source]
class TsTzRange(*args: Any, **kwargs: Any)[source]
impl

alias of TSTZRANGE

cache_ok: bool | None = True
python_type()[source]
process_literal_param(value, dialect)[source]
process_bind_param(value: Interval | None, dialect) str | None[source]

Converts the Python type (an Interval) into the DB representation (a range literal string).

process_result_value(value: DateTimeTZRange | None, dialect) Interval | None[source]
class comparator_factory(expr: ColumnElement[_CT])[source]
contains(other: Any, **kwargs) None[source]

Provide the functionality of the @> operator for Intervals.

Parameters

other – can be an interval, a tz-aware datetime, or column-like sql expressions with these types.

If any .contains() call does not work, you can add support here.

overlaps(other: Any, **kwargs)[source]

Provide the functionality of the && operator for Intervals.

expr: ColumnElement[_CT]
type: TypeEngine[_CT]
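The @> containment semantics provided by the comparator can be sketched in plain Python. This minimal half-open interval is an illustrative stand-in; pycroft's Interval type supports richer bound configurations:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass(frozen=True)
class Interval:
    """Minimal half-open [begin, end) interval; a sketch, not pycroft's type."""
    begin: datetime
    end: datetime

    def contains(self, other) -> bool:
        """@>-style containment for a tz-aware datetime or another interval."""
        if isinstance(other, Interval):
            return self.begin <= other.begin and other.end <= self.end
        return self.begin <= other < self.end

    def overlaps(self, other: "Interval") -> bool:
        """&&-style overlap test between two intervals."""
        return self.begin < other.end and other.begin < self.end
```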
exception InvalidMACAddressException[source]
class DateTimeTz[source]

A sqlalchemy type decorator corresponding to datetime types with time zone.

In other words, a mapped_column(DateTimeTz) produces python objects of type pycroft.helpers.utc.DateTimeTz.

pycroft.model.unix_account

This module contains the UnixAccount and UnixTombstone classes.

class UnixAccount(**kwargs)[source]
uid: Mapped[int]
tombstone: Mapped[UnixTombstone]
gid: Mapped[int]
login_shell: Mapped[str]
home_directory: Mapped[str]
id: Mapped[int]
class UnixTombstone(**kwargs)[source]

A tombstone for uids and logins, preventing their re-use.

A tombstone relates to a pycroft.model.user.User and UnixAccount via three relationships, as depicted in the ER diagram below. The hash is stored in the generated column User.login_hash, which has a foreign key on the Tombstone. Furthermore, the associated UnixAccount has a uid, which also points to a Tombstone.

There is a trigger which checks that if both of these objects exist, they point to the same tombstone.

---
title: Tombstone Consistency
config:
  fontFamily: monospace
---
erDiagram
    User {
        int unix_account_id
        str login
        str login_hash "generated always as digest(login, 'sha512')"
    }
    UnixAccount {
        int id
        int uid
    }
    UnixTombStone {
        int uid
        str login_hash
    }
    User |o--o| UnixAccount: "user_unix_account_id_fkey"
    User ||--o| UnixTombStone: "user_login_hash_fkey"
    UnixAccount ||--o| UnixTombStone: "unix_account_uid_fkey"

The lifecycle of a tombstone is restricted by check_unix_tombstone_lifecycle_func.

uid: Mapped[int]
login_hash: Mapped[bytes]
unix_account: Mapped[UnixAccount]
check_unix_tombstone_lifecycle_func = Function(name='check_unix_tombstone_lifecycle', arguments=[], rtype='trigger', volatility='stable', strict=False, leakproof=False, language='plpgsql', quote_tag='')

Trigger function ensuring proper tombstone lifecycle (see UnixTombstone).

---
title: Tombstone Lifecycle
config:
  fontFamily: monospace
---
stateDiagram-v2
    s0: ¬uid, ¬login_hash
    s1: uid, ¬login_hash
    s2: ¬uid, login_hash
    s3: uid, login_hash
    s0 --> s1: set uid
    s0 --> s2: set login_hash
    s1 --> s1
    s2 --> s2
    s1 --> s3: set login_hash
    s2 --> s3: set uid
    s3 --> s3
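The transitions drawn in the lifecycle diagram amount to a simple rule: uid and login_hash may each be set (one at a time) but never cleared. A sketch that encodes exactly the diagram's edges as a transition table (this mirrors the state diagram, not the plpgsql source):

```python
def tombstone_update_allowed(
    old: tuple[bool, bool], new: tuple[bool, bool]
) -> bool:
    """Check a (has_uid, has_login_hash) transition against the diagram.

    States: s0=(False, False), s1=(True, False),
            s2=(False, True),  s3=(True, True).
    """
    allowed = {
        (False, False): {(True, False), (False, True)},   # s0 -> s1 | s2
        (True, False): {(True, False), (True, True)},     # s1 -> s1 | s3
        (False, True): {(False, True), (True, True)},     # s2 -> s2 | s3
        (True, True): {(True, True)},                     # s3 -> s3
    }
    return new in allowed[old]
```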
unix_account_ensure_tombstone_func = Function(name='unix_account_ensure_tombstone', arguments=[], rtype='trigger', volatility='volatile', strict=True, leakproof=False, language='plpgsql', quote_tag='')

Trigger function ensuring automatic generation of a tombstone on UnixAccount inserts.

ensure_tombstone = Function(name='user_ensure_tombstone', arguments=[], rtype='trigger', volatility='volatile', strict=True, leakproof=False, language='plpgsql', quote_tag='')

Trigger function ensuring automatic generation of a tombstone on User updates.

check_tombstone_consistency = Function(name='check_tombstone_consistency', arguments=[], rtype='trigger', volatility='volatile', strict=True, leakproof=False, language='plpgsql', quote_tag='')

Trigger function checking whether User and associated UnixAccount refer to the same Tombstone.

See Tombstone for an illustration.

pycroft.model.user

This module contains the class User.

copyright
  (c) 2011 by AG DSN.

exception IllegalLoginError[source]
exception IllegalEmailError[source]
class BaseUser(*args, **kwargs)[source]
login: Mapped[str40] = <sqlalchemy.orm.properties.MappedColumn object>
login_hash: Mapped[bytes] = Column(None, LargeBinary(length=512), table=None, server_default=Computed(<sqlalchemy.sql.elements.TextClause object>))

Auto-generated sha512 hash of login.

name: Mapped[str255]
registered_at: Mapped[utc.DateTimeTz]
passwd_hash: Mapped[str_deferred | None]
email: Mapped[str255 | None]
email_confirmed: Mapped[bool] = <sqlalchemy.orm.properties.MappedColumn object>
email_confirmation_key: Mapped[str | None]
birthdate: Mapped[date | None]
swdd_person_id: Mapped[int | None]
room_id: Mapped[room_fk | None]
login_regex = re.compile('\n            ^\n            # Must begin with a lowercase character\n            [a-z]\n            # Can continue with lowercase characters, numbers and some punctuation\n            # but between , re.VERBOSE)
login_regex_ci = re.compile('\n            ^\n            # Must begin with a lowercase character\n            [a-z]\n            # Can continue with lowercase characters, numbers and some punctuation\n            # but between , re.IGNORECASE|re.VERBOSE)
email_regex = re.compile("^[a-zA-Z0-9!#$%&'*+\\-/=?^_`{|}~]+(?:\\.[a-zA-Z0-9!#$%&'*+\\-/=?^_`{|}~]+)*@(?:[a-zA-Z0-9]+(?:\\.|-))+[a-zA-Z]+$")
blocked_logins = {'abuse', 'admin', 'administrator', 'anonymous', 'api', 'autoconfig', 'backup', 'bacula', 'bb', 'bin', 'broadcasthost', 'contact', 'daemon', 'email', 'ftp', 'ftpadmin', 'games', 'git', 'guest', 'help', 'hostmaster', 'imap', 'info', 'is', 'isatap', 'it', 'localdomain', 'localhost', 'log', 'login', 'lp', 'mail', 'mailer-daemon', 'majordom', 'man', 'marketing', 'mis', 'msql', 'name', 'news', 'no-reply', 'nobody', 'noc', 'noreply', 'operator', 'pop', 'pop3', 'postgres', 'postmaster', 'privacy', 'proxy', 'root', 'sales', 'smtp', 'ssladmin', 'ssladministrator', 'sslwebmaster', 'status', 'support', 'sync', 'sys', 'sysadmin', 'usenet', 'user', 'username', 'uucp', 'web', 'webmaster', 'website', 'wpad', 'www', 'www-data', 'wwwadmin'}
login_character_limit = 22
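The three constraints above (pattern, blocklist, 22-character limit) can be combined into a single validator sketch. The regex here is a simplified stand-in for the (truncated) login_regex shown above, and the blocklist is only an excerpt of the documented set:

```python
import re

class IllegalLoginError(ValueError):
    """Stand-in for pycroft.model.user.IllegalLoginError."""

BLOCKED_LOGINS = {"root", "admin", "www-data"}  # excerpt of blocked_logins
LOGIN_CHARACTER_LIMIT = 22

# Simplified stand-in for BaseUser.login_regex: begins with a lowercase
# letter, continues with lowercase letters, digits, and some punctuation.
LOGIN_RE = re.compile(r"^[a-z][a-z0-9._-]*[a-z0-9]$")

def validate_login(value: str) -> str:
    """Sketch of BaseUser.validate_login; returns the value or raises."""
    if len(value) > LOGIN_CHARACTER_LIMIT:
        raise IllegalLoginError("login too long")
    if value in BLOCKED_LOGINS:
        raise IllegalLoginError("login is blocked")
    if not LOGIN_RE.fullmatch(value):
        raise IllegalLoginError("login has illegal format")
    return value
```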
validate_login(_, value)[source]
validate_email(_, value)[source]
validate_passwd_hash(_, value)[source]
check_password(plaintext_password: str) bool[source]

Verify a given plaintext password against the user's passwd hash.

property password: str

Write-only property: assigning a plaintext password stores its hash for the user.
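The password/check_password pair follows the usual salted-hash pattern. A self-contained sketch with stdlib PBKDF2; the scheme and hash format are illustrative assumptions, not pycroft's actual passwd_hash format:

```python
import hashlib
import hmac
import os

def hash_password(plaintext: str) -> str:
    """Hash a plaintext password with a random salt (illustrative scheme)."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", plaintext.encode(), salt, 100_000)
    return f"{salt.hex()}${digest.hex()}"

def check_password(plaintext: str, passwd_hash: str) -> bool:
    """Counterpart to BaseUser.check_password for the sketch scheme above."""
    salt_hex, digest_hex = passwd_hash.split("$")
    digest = hashlib.pbkdf2_hmac(
        "sha256", plaintext.encode(), bytes.fromhex(salt_hex), 100_000
    )
    return hmac.compare_digest(digest.hex(), digest_hex)
```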

class User(**kwargs)[source]
wifi_passwd_hash: Mapped[str | None]
account_id: Mapped[int]
account: Mapped[Account]
tombstone: Mapped[UnixTombstone]
unix_account_id: Mapped[int | None]
unix_account: Mapped[UnixAccount]
address_id: Mapped[int]
address: Mapped[Address]
room: Mapped[Room | None]
email_forwarded: Mapped[bool]
password_reset_token: Mapped[str | None]
memberships: Mapped[list[Membership]]
room_history_entries: Mapped[list[RoomHistoryEntry]]
hosts: Mapped[list[Host]]
mpsk_clients: Mapped[list[MPSKClient]]
authored_log_entries: Mapped[list[LogEntry]]
log_entries: Mapped[list[UserLogEntry]]
task_log_entries: Mapped[list[TaskLogEntry]]
tenancies: Mapped[list[Tenancy]]
tasks: Mapped[list[UserTask]]
traffic_volumes: Mapped[list[TrafficVolume]]
has_custom_address

Whether the user’s address differs from their room’s address.

If no room is assigned, returns False.

validate_login(_, value)[source]
property_groups
traffic_for_days(days)[source]
current_properties: Mapped[list[CurrentProperty]]

This is a relationship to the current_property view filtering out the entries with denied=True.

current_properties_maybe_denied: Mapped[list[CurrentProperty]]

This is a relationship to the current_property view ignoring the denied attribute.

property current_properties_set: set[str]

A type-agnostic property giving the granted properties as a set of strings.

Utilized in the web component’s access control mechanism.

property latest_log_entry: UserLogEntry | None
property wifi_password: str | None

Return the cleartext wifi password (without the crypt prefix) if available.

Returns

None if the wifi_passwd_hash is not set or is not cleartext.

has_wifi_access
static verify_and_get(login: str, plaintext_password: str) User | None[source]
current_memberships
active_memberships(when=None)[source]
active_property_groups(when=None)[source]
member_of(group: PropertyGroup, when: Interval | None = None) bool[source]
has_property(property_name: str, when: datetime | None = None) bool[source]
property permission_level: int
property email_internal
birthdate: Mapped[date | None]
email: Mapped[str255 | None]
email_confirmation_key: Mapped[str | None]
email_confirmed: Mapped[bool]
id: Mapped[int]
login: Mapped[str40]
login_hash: Mapped[bytes]

Auto-generated sha512 hash of login.

name: Mapped[str255]
passwd_hash: Mapped[str_deferred | None]
registered_at: Mapped[utc.DateTimeTz]
room_id: Mapped[room_fk | None]
swdd_person_id: Mapped[int | None]
create_pgcrypto(target, connection, **kw)[source]
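The pgcrypto extension set up here powers the generated login_hash column, digest(login, 'sha512'). Its Python-side equivalent is a plain SHA-512 digest; UTF-8 encoding of the login is an assumption:

```python
import hashlib

def login_hash(login: str) -> bytes:
    """Python equivalent of the generated column digest(login, 'sha512').

    Assumes the login is UTF-8 encoded before hashing.
    """
    return hashlib.sha512(login.encode("utf-8")).digest()
```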
class Group(**kwargs)[source]
name: Mapped[str255]
discriminator: Mapped[str]
users: Mapped[list[User]]
memberships: Mapped[list[Membership]]
active_users(when=None)[source]
Parameters

when (Interval)

Return type

list[User]

id: Mapped[int]
class Membership(**kwargs)[source]
active_during: Mapped[Interval[utc.DateTimeTz]]
disable(at=None)[source]
group_id: Mapped[int]
group: Mapped[Group]
user_id: Mapped[int]
user: Mapped[User]
id: Mapped[int]
create_btree_gist(target, connection, **kw)[source]
class PropertyGroup(**kwargs)[source]
id: Mapped[int]
permission_level: Mapped[int]
property_grants = ColumnAssociationProxyInstance(AssociationProxy('properties', 'granted'))
properties: Mapped[dict[str, Property]]
discriminator: Mapped[str]
memberships: Mapped[list[Membership]]
name: Mapped[str255]
users: Mapped[list[User]]
class Property(**kwargs)[source]
name: Mapped[str255]
granted: Mapped[bool]
property_group_id: Mapped[int]
property_group: Mapped[PropertyGroup]
id: Mapped[int]
class RoomHistoryEntry(**kwargs)[source]
active_during: Mapped[Interval[utc.DateTimeTz]]
disable(at=None)[source]
room_id: Mapped[int]
room: Mapped[Room]
user_id: Mapped[int]
user: Mapped[User]
id: Mapped[int]
class PreMember(**kwargs)[source]
login: Mapped[str40]
email: Mapped[str255 | None]
email_confirmation_key: Mapped[str | None]
email_confirmed: Mapped[bool]
id: Mapped[int]
login_hash: Mapped[bytes]

Auto-generated sha512 hash of login.

move_in_date: Mapped[date | None]
name: Mapped[str255]
registered_at: Mapped[utc.DateTimeTz]
room_id: Mapped[room_fk | None]
swdd_person_id: Mapped[int | None]
previous_dorm: Mapped[str | None]
birthdate: Mapped[date]
passwd_hash: Mapped[str]
room: Mapped[Room]
tenancies: Mapped[list[Tenancy]]
property is_adult: bool

pycroft.model.webstorage

class WebStorage(**kwargs)[source]
data: Mapped[bytes]
expiry: Mapped[DateTimeTz]
static auto_expire() None[source]

Delete all expired items from the database.
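The operation boils down to a single DELETE keyed on the expiry column. A self-contained sqlite stand-in for the documented Postgres-backed method (table name and column types follow the attributes above; the schema details are otherwise assumptions):

```python
import sqlite3
from datetime import datetime

def auto_expire(conn: sqlite3.Connection, now: datetime) -> int:
    """Delete all expired web_storage rows; return the number removed.

    sqlite sketch of WebStorage.auto_expire; timestamps are stored as
    ISO-8601 text so that lexicographic comparison matches time order.
    """
    cur = conn.execute(
        "DELETE FROM web_storage WHERE expiry <= ?", (now.isoformat(),)
    )
    conn.commit()
    return cur.rowcount
```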

id: Mapped[int]