infrahub_sdk.node.node
Classes
InfrahubNode
Represents an Infrahub node in an asynchronous context.
Methods:
from_graphql
from_graphql(cls, client: InfrahubClient, branch: str, data: dict, schema: MainSchemaTypesAPI | None = None, timeout: int | None = None) -> Self
generate
generate(self, nodes: list[str] | None = None) -> None
artifact_generate
artifact_generate(self, name: str) -> None
artifact_fetch
artifact_fetch(self, name: str) -> str | dict[str, Any]
download_file
download_file(self, dest: None = None, skip_if_unchanged: bool = ...) -> bytes
download_file
download_file(self, dest: Path, skip_if_unchanged: bool = ...) -> int
download_file
download_file(self, dest: Path | None = None, skip_if_unchanged: bool = False) -> bytes | int
Download the file content from this FileObject node.
This method is only available for nodes that inherit from CoreFileObject. The node must have been saved (have an id) before calling this method.
Args:
dest: Optional destination path. If provided, the file will be streamed directly to this path (memory-efficient for large files) and the number of bytes written will be returned. If not provided, the file content will be returned as bytes.
skip_if_unchanged: When True, compute the SHA-1 of the file at dest (which must be provided) and compare it against the node's checksum attribute. If they match, return 0 without hitting the network. The checksum is the value loaded when this node was fetched; a later server-side change to the file will not be detected unless the caller re-fetches the node first.
Returns:
- If dest is None: the file content as bytes.
- If dest is provided: the number of bytes written to the file.
- If skip_if_unchanged=True and the local file matches the server checksum: 0.
Raises:
FeatureNotSupportedError: If this node doesn't inherit from CoreFileObject.
ValueError: If the node hasn't been saved yet, the file was not found, or skip_if_unchanged=True was passed without a dest.
AuthenticationError: If authentication fails.
Examples:
>>> # Download to memory
>>> content = await contract.download_file()
>>> # Stream to file (memory-efficient for large files)
>>> bytes_written = await contract.download_file(dest=Path("/tmp/contract.pdf"))
>>> # Skip download if local file already matches server checksum
>>> bytes_written = await contract.download_file(
... dest=Path("/tmp/contract.pdf"), skip_if_unchanged=True
... )
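The skip_if_unchanged behaviour above can be sketched as a standalone helper. This is a simplified illustration of the rule described in the Args section, not the SDK's actual implementation; the function name is hypothetical.

```python
import hashlib
from pathlib import Path

def should_skip_download(dest: Path, server_checksum: str) -> bool:
    """Return True when the file at dest already matches the server checksum.

    Mirrors the skip_if_unchanged rule: SHA-1 the local file and compare it
    against the checksum loaded with the node. A missing local file can
    never match, so the download must proceed.
    """
    if not dest.exists():
        return False
    digest = hashlib.sha1(dest.read_bytes()).hexdigest()
    return digest == server_checksum
```

Note that, as the docstring above warns, the comparison uses the checksum captured when the node was fetched, so a newer server-side file goes undetected until the node is re-fetched.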
matches_local_checksum
matches_local_checksum(self, source: bytes | Path | BinaryIO) -> bool
Return True if source's SHA-1 matches this node's server checksum.
Only available for nodes inheriting from CoreFileObject. Callers
that want to branch on the comparison without invoking a transfer
should use this primitive instead of reading node.checksum.value
and hashing source themselves, so the hashing convention stays
centralised in the SDK.
The comparison is against the checksum attribute as loaded
when this node was retrieved from the server. If the server's
file has been replaced since the node was fetched, this method
will not see that change — re-fetch the node to refresh the
checksum before comparing.
Args:
source: Local content to hash and compare. Accepts the same shapes as :func:infrahub_sdk.file_handler.sha1_of_source.
Returns:
- True if the local digest equals the server's stored checksum.
Raises:
FeatureNotSupportedError: Node is not a CoreFileObject.
ValueError: Node has no server-side checksum yet (unsaved or file never attached).
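The hashing convention this method centralises can be sketched as follows. This is an illustrative stand-in for the accepted input shapes (bytes, Path, or an open binary stream), not the SDK's sha1_of_source itself; the helper name is hypothetical.

```python
import hashlib
from pathlib import Path
from typing import BinaryIO

def sha1_of(source: "bytes | Path | BinaryIO") -> str:
    """Hash local content as a SHA-1 hex digest.

    Accepts the same shapes the comparison above describes: raw bytes,
    a path on disk, or an open binary stream. Files and streams are read
    in chunks so large content never has to fit in memory at once.
    """
    h = hashlib.sha1()
    if isinstance(source, bytes):
        h.update(source)
    elif isinstance(source, Path):
        with source.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
    else:  # an open binary stream
        for chunk in iter(lambda: source.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()
```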
upload_if_changed
upload_if_changed(self, source: bytes | Path | BinaryIO, name: str | None = None) -> UploadResult
Upload source only if its SHA-1 differs from the server checksum.
Composes :meth:matches_local_checksum with :meth:upload_from_path
(or :meth:upload_from_bytes) and :meth:save. For unsaved nodes or
nodes that have no prior server-side file, the upload is always
performed — there is nothing to compare against.
Args:
source: Content to upload. bytes and BinaryIO sources must supply name; for a Path, the filename is derived from source.name when name is omitted.
name: Filename to use on the server. Required for bytes/BinaryIO sources.
Returns:
- UploadResult with was_uploaded=False (skipped) or was_uploaded=True (transfer occurred), and the resulting server checksum (None only when no server checksum was available after the operation).
Raises:
FeatureNotSupportedError: Node is not a CoreFileObject.
ValueError: source is bytes or BinaryIO and no name was supplied.
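The decision rule described above (always upload when there is nothing to compare against) can be sketched as a pure function. A simplified illustration under the stated rules, not the SDK's code; the function name is hypothetical.

```python
def should_upload(local_digest: str, server_checksum: "str | None") -> bool:
    """Decide whether a transfer is needed.

    No prior server-side checksum (an unsaved node, or a node with no file
    ever attached) means there is nothing to compare against, so the upload
    always happens. Otherwise, upload only when the digests differ.
    """
    if server_checksum is None:
        return True
    return local_digest != server_checksum
```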
delete
delete(self, timeout: int | None = None, request_context: RequestContext | None = None) -> None
save
save(self, allow_upsert: bool = False, update_group_context: bool | None = None, timeout: int | None = None, request_context: RequestContext | None = None) -> None
generate_query_data
generate_query_data(self, filters: dict[str, Any] | None = None, offset: int | None = None, limit: int | None = None, include: list[str] | None = None, exclude: list[str] | None = None, fragment: bool = False, prefetch_relationships: bool = False, partial_match: bool = False, property: bool = False, order: Order | None = None, include_metadata: bool = False) -> dict[str, Any | dict]
generate_query_data_node
generate_query_data_node(self, include: list[str] | None = None, exclude: list[str] | None = None, inherited: bool = True, insert_alias: bool = False, prefetch_relationships: bool = False, property: bool = False, include_metadata: bool = False) -> dict[str, Any | dict]
Generate the node part of a GraphQL Query with attributes and nodes.
Args:
include: List of attributes or relationships to include. Defaults to None.
exclude: List of attributes or relationships to exclude. Defaults to None.
inherited: Indicates whether the attributes and relationships inherited from generics should be included as well. Defaults to True.
insert_alias: If True, inserts aliases in the query for each attribute or relationship.
prefetch_relationships: If True, pre-fetches relationship data as part of the query.
include_metadata: If True, includes node_metadata and relationship_metadata in the query.
Returns:
- dict[str, Union[Any, Dict]]: GraphQL query in dictionary format
add_relationships
add_relationships(self, relation_to_update: str, related_nodes: list[str]) -> None
remove_relationships
remove_relationships(self, relation_to_update: str, related_nodes: list[str]) -> None
create
create(self, allow_upsert: bool = False, timeout: int | None = None, request_context: RequestContext | None = None) -> None
update
update(self, do_full_update: bool = False, timeout: int | None = None, request_context: RequestContext | None = None) -> None
get_pool_allocated_resources
get_pool_allocated_resources(self, resource: InfrahubNode) -> list[InfrahubNode]
Fetch all nodes that were allocated for the pool and a given resource.
Args:
resource: The resource from which the nodes were allocated.
Returns:
- list[InfrahubNode]: The allocated nodes.
get_pool_resources_utilization
get_pool_resources_utilization(self) -> list[dict[str, Any]]
Fetch the utilization of each resource for the pool.
Returns:
- list[dict[str, Any]]: A list containing the allocation numbers for each resource of the pool.
get_flat_value
get_flat_value(self, key: str, separator: str = '__') -> Any
Recursively query a value defined in flat notation (string) on a hierarchy of objects.
Examples:
name__value
module.object.value
extract
extract(self, params: dict[str, str]) -> dict[str, Any]
Extract some data points defined in a flat notation.
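The flat-notation lookup that get_flat_value and extract describe can be sketched with plain attribute traversal. This is an illustrative reimplementation against a stand-in object, not the SDK's code; the sample node and its attribute names are hypothetical.

```python
from types import SimpleNamespace

def get_flat_value(obj, key: str, separator: str = "__"):
    """Walk a chain of attributes named by a flat key such as "name__value"."""
    for part in key.split(separator):
        obj = getattr(obj, part)
    return obj

def extract(obj, params: "dict[str, str]") -> "dict[str, object]":
    """Resolve each flat path in params against obj, keyed by the param name."""
    return {alias: get_flat_value(obj, path) for alias, path in params.items()}

# A stand-in for a fetched node with a nested attribute hierarchy
node = SimpleNamespace(name=SimpleNamespace(value="leaf-01"))
```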
InfrahubNodeSync
Represents an Infrahub node in a synchronous context.
Methods:
from_graphql
from_graphql(cls, client: InfrahubClientSync, branch: str, data: dict, schema: MainSchemaTypesAPI | None = None, timeout: int | None = None) -> Self
generate
generate(self, nodes: list[str] | None = None) -> None
artifact_generate
artifact_generate(self, name: str) -> None
artifact_fetch
artifact_fetch(self, name: str) -> str | dict[str, Any]
download_file
download_file(self, dest: None = None, skip_if_unchanged: bool = ...) -> bytes
download_file
download_file(self, dest: Path, skip_if_unchanged: bool = ...) -> int
download_file
download_file(self, dest: Path | None = None, skip_if_unchanged: bool = False) -> bytes | int
Download the file content from this FileObject node.
This method is only available for nodes that inherit from CoreFileObject. The node must have been saved (have an id) before calling this method.
Args:
dest: Optional destination path. If provided, the file will be streamed directly to this path (memory-efficient for large files) and the number of bytes written will be returned. If not provided, the file content will be returned as bytes.
skip_if_unchanged: When True, compute the SHA-1 of the file at dest (which must be provided) and compare it against the node's checksum attribute. If they match, return 0 without hitting the network. The checksum is the value loaded when this node was fetched; a later server-side change to the file will not be detected unless the caller re-fetches the node first.
Returns:
- If dest is None: the file content as bytes.
- If dest is provided: the number of bytes written to the file.
- If skip_if_unchanged=True and the local file matches the server checksum: 0.
Raises:
FeatureNotSupportedError: If this node doesn't inherit from CoreFileObject.
ValueError: If the node hasn't been saved yet, the file was not found, or skip_if_unchanged=True was passed without a dest.
AuthenticationError: If authentication fails.
Examples:
>>> # Download to memory
>>> content = contract.download_file()
>>> # Stream to file (memory-efficient for large files)
>>> bytes_written = contract.download_file(dest=Path("/tmp/contract.pdf"))
>>> # Skip download if local file already matches server checksum
>>> bytes_written = contract.download_file(
... dest=Path("/tmp/contract.pdf"), skip_if_unchanged=True
... )
matches_local_checksum
matches_local_checksum(self, source: bytes | Path | BinaryIO) -> bool
Return True if source's SHA-1 matches this node's server checksum.
Sync equivalent of :meth:InfrahubNode.matches_local_checksum. See
that method for full documentation.
Args:
source: Local content to hash and compare. Accepts the same shapes as :func:infrahub_sdk.file_handler.sha1_of_source.
Returns:
- True if the local digest equals the server's stored checksum.
Raises:
FeatureNotSupportedError: Node is not a CoreFileObject.
ValueError: Node has no server-side checksum yet.
upload_if_changed
upload_if_changed(self, source: bytes | Path | BinaryIO, name: str | None = None) -> UploadResult
Upload source only if its SHA-1 differs from the server checksum.
Sync equivalent of :meth:InfrahubNode.upload_if_changed. See that
method for full documentation.
Args:
source: Content to upload.
name: Filename to use on the server.
Returns:
- UploadResult.
Raises:
FeatureNotSupportedError: Node is not a CoreFileObject.
ValueError: bytes/BinaryIO source without name.
delete
delete(self, timeout: int | None = None, request_context: RequestContext | None = None) -> None
save
save(self, allow_upsert: bool = False, update_group_context: bool | None = None, timeout: int | None = None, request_context: RequestContext | None = None) -> None
generate_query_data
generate_query_data(self, filters: dict[str, Any] | None = None, offset: int | None = None, limit: int | None = None, include: list[str] | None = None, exclude: list[str] | None = None, fragment: bool = False, prefetch_relationships: bool = False, partial_match: bool = False, property: bool = False, order: Order | None = None, include_metadata: bool = False) -> dict[str, Any | dict]
generate_query_data_node
generate_query_data_node(self, include: list[str] | None = None, exclude: list[str] | None = None, inherited: bool = True, insert_alias: bool = False, prefetch_relationships: bool = False, property: bool = False, include_metadata: bool = False) -> dict[str, Any | dict]
Generate the node part of a GraphQL Query with attributes and nodes.
Args:
include: List of attributes or relationships to include. Defaults to None.
exclude: List of attributes or relationships to exclude. Defaults to None.
inherited: Indicates whether the attributes and relationships inherited from generics should be included as well. Defaults to True.
insert_alias: If True, inserts aliases in the query for each attribute or relationship.
prefetch_relationships: If True, pre-fetches relationship data as part of the query.
include_metadata: If True, includes node_metadata and relationship_metadata in the query.
Returns:
- dict[str, Union[Any, Dict]]: GraphQL query in dictionary format
add_relationships
add_relationships(self, relation_to_update: str, related_nodes: list[str]) -> None
remove_relationships
remove_relationships(self, relation_to_update: str, related_nodes: list[str]) -> None
create
create(self, allow_upsert: bool = False, timeout: int | None = None, request_context: RequestContext | None = None) -> None
update
update(self, do_full_update: bool = False, timeout: int | None = None, request_context: RequestContext | None = None) -> None
get_pool_allocated_resources
get_pool_allocated_resources(self, resource: InfrahubNodeSync) -> list[InfrahubNodeSync]
Fetch all nodes that were allocated for the pool and a given resource.
Args:
resource: The resource from which the nodes were allocated.
Returns:
- list[InfrahubNodeSync]: The allocated nodes.
get_pool_resources_utilization
get_pool_resources_utilization(self) -> list[dict[str, Any]]
Fetch the utilization of each resource for the pool.
Returns:
- list[dict[str, Any]]: A list containing the allocation numbers for each resource of the pool.
get_flat_value
get_flat_value(self, key: str, separator: str = '__') -> Any
Recursively query a value defined in flat notation (string) on a hierarchy of objects.
Examples:
name__value
module.object.value
extract
extract(self, params: dict[str, str]) -> dict[str, Any]
Extract some data points defined in a flat notation.
UploadResult
Outcome of an idempotent upload attempt.
Returned by :meth:InfrahubNode.upload_if_changed and its sync twin.
was_uploaded tells the caller whether a network transfer actually happened. checksum carries the SHA-1 of the content held on the server after the operation: on skip paths it is the server's pre-existing value; on upload paths it is the locally computed SHA-1 used as a proxy (which matches what a standard CoreFileObject server stores, since the server computes the SHA-1 of the received bytes). It is None only when no server checksum was available (either the node was unsaved and nothing was transferred, or the save returned no checksum value).
The comparison used by upload_if_changed reads the node's
checksum attribute, which was populated when the node was
fetched via client.get(...). A server-side change to the file
between the fetch and the call will not be detected unless the
caller re-fetches the node first.
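The two fields described above can be sketched as a dataclass, together with the kind of branching a caller might do on the result. The class and function names here are hypothetical stand-ins, not the SDK's UploadResult.

```python
from dataclasses import dataclass

@dataclass
class UploadOutcome:
    """Hypothetical stand-in mirroring the two fields described above."""
    was_uploaded: bool
    checksum: "str | None"

def summarize(result: UploadOutcome) -> str:
    """Branch on the outcome the way a caller of upload_if_changed might."""
    if result.was_uploaded:
        return f"uploaded; server checksum is now {result.checksum}"
    if result.checksum is None:
        return "skipped; no server checksum was available"
    return f"skipped; server already holds {result.checksum}"
```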
InfrahubNodeBase
Base class for InfrahubNode and InfrahubNodeSync
Methods:
get_branch
get_branch(self) -> str
get_path_value
get_path_value(self, path: str) -> Any
get_human_friendly_id
get_human_friendly_id(self) -> list[str] | None
get_human_friendly_id_as_string
get_human_friendly_id_as_string(self, include_kind: bool = False) -> str | None
hfid
hfid(self) -> list[str] | None
hfid_str
hfid_str(self) -> str | None
get_node_metadata
get_node_metadata(self) -> NodeMetadata | None
Returns the node metadata (created_at, created_by, updated_at, updated_by) if fetched.
get_kind
get_kind(self) -> str
get_all_kinds
get_all_kinds(self) -> list[str]
is_ip_prefix
is_ip_prefix(self) -> bool
is_ip_address
is_ip_address(self) -> bool
is_resource_pool
is_resource_pool(self) -> bool
is_file_object
is_file_object(self) -> bool
Check if this node inherits from CoreFileObject and supports file uploads.
upload_from_path
upload_from_path(self, path: Path) -> None
Set a file from disk to be uploaded when saving this FileObject node.
The file will be streamed during upload, avoiding loading the entire file into memory.
Args:
path: Path to the file on disk.
Raises:
FeatureNotSupportedError: If this node doesn't inherit from CoreFileObject.
upload_from_bytes
upload_from_bytes(self, content: bytes | BinaryIO, name: str) -> None
Set content to be uploaded when saving this FileObject node.
The content can be provided as bytes or a file-like object. Using BinaryIO is recommended for large content to stream during upload.
Args:
content: The file content as bytes or a file-like object.
name: The filename to use for the uploaded file.
Raises:
FeatureNotSupportedError: If this node doesn't inherit from CoreFileObject.
Examples:
>>> # Using bytes (for small files)
>>> node.upload_from_bytes(content=b"file content", name="example.txt")
>>> # Using file-like object (for large files)
>>> with open("/path/to/file.bin", "rb") as f:
... node.upload_from_bytes(content=f, name="file.bin")
clear_file
clear_file(self) -> None
Clear any pending file content.
get_raw_graphql_data
get_raw_graphql_data(self) -> dict | None
generate_query_data_init
generate_query_data_init(self, filters: dict[str, Any] | None = None, offset: int | None = None, limit: int | None = None, include: list[str] | None = None, exclude: list[str] | None = None, partial_match: bool = False, order: Order | None = None, include_metadata: bool = False) -> dict[str, Any | dict]