pytimber.SparkLoggingDB

class pytimber.SparkLoggingDB(appid: str = 'SPARKPYTIMBER4', clientid: str = 'PYTIMBER3', source: str = 'nxcals', loglevel: str | int | None = None, sparkconf: str | None = None, sparkprops: Mapping[str, str] | None = None, kerberosprincipal: str | None = None, kerberoskeytab: str | None = None, data_location: str = 'pro', spark_session: SparkSession | None = None, service_url: str = 'https://cs-ccr-nxcals6.cern.ch:19093,https://cs-ccr-nxcals7.cern.ch:19093,https://cs-ccr-nxcals8.cern.ch:19093', sparkloglevel: str | None = 'ERROR')
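A minimal construction sketch. The actual call needs a CERN/NXCALS-reachable environment, so it is shown commented out; the runnable part only illustrates that `loglevel` accepts either a level name or the integer levels defined by the standard `logging` module.

```python
import logging

# Hypothetical usage; requires pytimber and NXCALS access, hence commented:
# import pytimber
# ldb = pytimber.SparkLoggingDB(loglevel=logging.INFO, sparkloglevel="ERROR")

# loglevel is declared as str | int | None; the stdlib logging module maps
# level names to the integers such a parameter typically expects:
info_level = logging.getLevelName("INFO")
```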

Methods

SparkLoggingDB.__init__([appid, clientid, ...])

Initializes a new instance of the PyTimber client.

SparkLoggingDB.get(pattern_or_list, t1[, ...])

Query data for a list of variables, or for variables whose names match a pattern (string), in a time window from t1 to t2, and return the corresponding data.
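A query sketch (the variable name and the exact return layout are assumptions; historically pytimber returns a dict mapping each variable name to a (timestamps, values) pair). The pytimber call itself needs NXCALS access and is shown commented; the runnable part demonstrates building the t1/t2 bounds as UNIX timestamps and unpacking a mock of the assumed return shape.

```python
from datetime import datetime, timezone

# Hypothetical query; requires pytimber and NXCALS access, hence commented:
# import pytimber
# ldb = pytimber.SparkLoggingDB()
# data = ldb.get("LHC.BCTDC.A6R4.B1:BEAM_INTENSITY", t1, t2)

# Time bounds built as UNIX timestamps from timezone-aware datetimes:
t1 = datetime(2018, 5, 1, 12, 0, tzinfo=timezone.utc).timestamp()
t2 = datetime(2018, 5, 1, 13, 0, tzinfo=timezone.utc).timestamp()

# Mock of the assumed return shape: {variable_name: (timestamps, values)}
data = {"LHC.BCTDC.A6R4.B1:BEAM_INTENSITY": ([t1, t1 + 1800.0], [2.1e14, 2.0e14])}
stamps, values = data["LHC.BCTDC.A6R4.B1:BEAM_INTENSITY"]
```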

SparkLoggingDB.getAligned(pattern_or_list, ...)

Use get_aligned instead

SparkLoggingDB.getChildrenForHierarchies(...)

Use get_children_for_hierarchies instead

SparkLoggingDB.getDataUsingSnapshots(...[, ...])

Use get_data_using_snapshots instead

SparkLoggingDB.getDescription(pattern)

Use get_variable_description instead

SparkLoggingDB.getFundamentals(t1, t2, ...)

Use get_fundamentals instead

SparkLoggingDB.getHierarchiesForVariables(...)

Use get_hierarchies_for_variables instead

SparkLoggingDB.getIntervalsByLHCModes(t1, ...)

Use get_interval_by_lhc_modes instead

SparkLoggingDB.getLHCFillData([fill_number, ...])

Use get_lhc_fill_data instead

SparkLoggingDB.getLHCFillsByTime(t1, t2[, ...])

Use get_lhc_fills_by_time instead

SparkLoggingDB.getMetaData(pattern_or_list)

Use get_meta_data instead

SparkLoggingDB.getScaled(pattern_or_list, t1, t2)

Use get_scaled instead

SparkLoggingDB.getSnapshotNames(pattern_or_list)

Use get_snapshot_names instead

SparkLoggingDB.getStats(pattern_or_list, t1, t2)

Use get_variable_stats instead

SparkLoggingDB.getUnit(pattern)

Use get_variable_unit instead

SparkLoggingDB.getVariable(variable, t1[, ...])

Use get_variable instead

SparkLoggingDB.getVariablesForHierarchies(...)

Use get_variables_for_hierarchies instead

SparkLoggingDB.getVariablesForSnapshots(...)

Use get_variables_for_snapshots instead

SparkLoggingDB.getVariablesOrigin(...)

Use get_variable_origin instead

SparkLoggingDB.get_aligned(pattern, t1, t2)

Retrieves the aligned data for the given pattern of variables within the specified time range [t1, t2], based on the fundamental and master variable.
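To illustrate what "aligned" means here, this stdlib sketch resamples a secondary series onto a master variable's timestamps by taking the latest sample at or before each master timestamp. It is an illustration of the concept only, not SparkLoggingDB's actual implementation.

```python
import bisect

def align_to_master(master_ts, ts, values):
    """For each master timestamp, pick the latest sample with ts <= master."""
    out = []
    for t in master_ts:
        i = bisect.bisect_right(ts, t) - 1  # index of last ts <= t
        out.append(values[i] if i >= 0 else None)
    return out

master = [10.0, 20.0, 30.0]          # timestamps of the master variable
ts = [9.0, 19.5, 29.0]               # timestamps of a secondary variable
vals = [1, 2, 3]
aligned = align_to_master(master, ts, vals)  # one value per master timestamp
```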

SparkLoggingDB.get_as_pivot(pattern_or_list, t1)

SparkLoggingDB.get_children_for_hierarchies(...)

Get a list of children hierarchy paths for given parent hierarchy paths.

SparkLoggingDB.get_data_using_snapshots(...)

Get data for variables attached to snapshots selected by a list of strings or a pattern, using the time window defined in the snapshot configuration.

SparkLoggingDB.get_fundamentals(t1, t2, pattern)

Gets a list of FundamentalVariable objects between two timestamps for a given pattern.

SparkLoggingDB.get_hierarchies_for_variables(...)

Get a list of hierarchy paths for variables (in which given variables are defined).

SparkLoggingDB.get_interval_by_lhc_modes(t1, ...)

Returns a list of FillInterval objects for the given time window and two LHC beam modes.

SparkLoggingDB.get_lhc_fill_data([fill_number])

Gets times and beam modes for a particular LHC fill.
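A sketch of working with fill data. The field names below are assumptions based on the classic pytimber LoggingDB API (a dict with start/end times and a list of beam modes); the mock stands in for the real call, which needs NXCALS access.

```python
# Hypothetical call; requires pytimber and NXCALS access, hence commented:
# fill = ldb.get_lhc_fill_data(6666)

# Mock of the assumed return shape (field names are assumptions):
fill = {
    "fillNumber": 6666,
    "startTime": 1525168800.0,
    "endTime": 1525190400.0,
    "beamModes": [
        {"mode": "INJPHYS", "startTime": 1525168800.0, "endTime": 1525172400.0},
        {"mode": "STABLE", "startTime": 1525176000.0, "endTime": 1525190400.0},
    ],
}

# Extract the STABLE-beams window, a common pattern for per-fill analysis:
stable = next(m for m in fill["beamModes"] if m["mode"] == "STABLE")
window = (stable["startTime"], stable["endTime"])
```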

SparkLoggingDB.get_lhc_fills_by_time(...[, ...])

Returns a list of LHC fills data that occurred within the specified time range.

SparkLoggingDB.get_meta_data(pattern)

Retrieves metadata for vector variables (so called vector contexts) matching a pattern.

SparkLoggingDB.get_scaled(pattern, t1, t2[, ...])

Retrieves the scaled data for the given pattern of variables in a time window from t1 to t2.

SparkLoggingDB.get_snapshot_names(...[, ...])

Get a list of snapshots names based on a list of strings or a pattern, filtered by owner pattern and description pattern.

SparkLoggingDB.get_variable(variable, t1[, ...])

Query a specific variable within a time window, optionally filtering by fundamentals, and return the corresponding data.

SparkLoggingDB.get_variable_description(pattern)

Gets description(s) for the given variable pattern.

SparkLoggingDB.get_variable_origin(pattern)

Gets the system origin(s) for the given variable pattern.

SparkLoggingDB.get_variable_stats(pattern, ...)

Gets variable statistics for the specified time range.
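As a rough illustration of the kind of summary such a call provides (count, min, max, mean over the window), computed here with the stdlib on mock samples; the actual fields returned by get_variable_stats may differ.

```python
import statistics

# Mock samples standing in for one variable's values over the time window:
samples = [2.10e14, 2.05e14, 1.98e14, 1.96e14]
stats = {
    "count": len(samples),
    "min": min(samples),
    "max": max(samples),
    "avg": statistics.fmean(samples),  # arithmetic mean
}
```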

SparkLoggingDB.get_variable_unit(pattern)

Gets unit(s) for the given variable pattern.

SparkLoggingDB.get_variables_for_hierarchies(...)

Get a list of variables attached to hierarchies.

SparkLoggingDB.get_variables_for_snapshots(...)

Get variables attached to snapshots.

SparkLoggingDB.search(pattern)

Use search_variables instead

SparkLoggingDB.searchFundamental(fundamental, t1)

Use search_fundamentals instead

SparkLoggingDB.search_fundamentals(...[, t2])

Searches for fundamental variable names that match the specified pattern within the specified time range.

SparkLoggingDB.search_variables(pattern)

Searches for variable names that match the given pattern.
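pytimber search patterns have historically used `%` as the wildcard, SQL-LIKE style. This stdlib sketch shows, for illustration only, how such a pattern maps onto name matching (the server performs the real search; the names below are examples, not a real catalog).

```python
import re

def like_match(pattern: str, name: str) -> bool:
    """Interpret '%' as 'any run of characters', SQL-LIKE style."""
    regex = "^" + ".*".join(re.escape(part) for part in pattern.split("%")) + "$"
    return re.match(regex, name) is not None

names = ["LHC.BCTDC.A6R4.B1:BEAM_INTENSITY", "HX:FILLN"]
hits = [n for n in names if like_match("LHC.BCTDC.%", n)]
```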