pypath.resources.controller.ResourceController

class pypath.resources.controller.ResourceController(resource_info_path=(PosixPath('/home/runner/work/pypath/pypath/pypath'), 'resources', 'data', 'resources.json'), use_package_path=False)[source]

Bases: Logger

The resource controller is intended to be the central component of pypath's communication with resources.

14.01.2020: the initial step of resource controller development: used for generating the /info page of the server.

14.02.2020: storing and reading enzyme-substrate resource definitions from the JSON; the class inherits from session.Logger.

__init__(resource_info_path=(PosixPath('/home/runner/work/pypath/pypath/pypath'), 'resources', 'data', 'resources.json'), use_package_path=False)[source]

Make this instance a logger.

Parameters:
  • name – The label of this instance that will be prepended to all messages it sends to the logger.

  • module – Send the messages by the logger of this module.

Methods

__init__([resource_info_path, use_package_path])

Make this instance a logger.

add_resource_attrs(resources)

Adds resource attributes to a list of resources.

collect(data_type)

collect_enzyme_substrate()

collect_interaction([datasets, ...])

Collect network (interaction) resource definitions.

collect_network([datasets, ...])

Collect network (interaction) resource definitions.

license(name)

license_filter(resources[, purpose, ...])

Filters a list of resources by their license.

name(name)

reload()

resource(name)

secondary_resources(name[, postfix])

param name: Name of a composite resource.

update([path, force, remove_old])

Reads resource information from a JSON file.

update_licenses()

add_resource_attrs(resources: dict | Iterable[AbstractResource]) → None[source]

Adds resource attributes to a list of resources.

It modifies the instances in place and returns nothing.
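The in-place semantics can be illustrated with a minimal sketch; the `Resource` class, the `ATTRS` data and the `add_attrs` helper below are hypothetical stand-ins, not pypath objects:

```python
class Resource:
    """Hypothetical stand-in for a pypath resource object."""

    def __init__(self, name):
        self.name = name
        self.license = None

# illustrative attribute data keyed by resource name (values are made up)
ATTRS = {'SIGNOR': {'license': 'CC-BY-SA'}, 'HPRD': {'license': 'academic'}}

def add_attrs(resources):
    # mirror the documented behaviour: mutate the instances in place,
    # return nothing
    for res in resources:
        for key, value in ATTRS.get(res.name, {}).items():
            setattr(res, key, value)

resources = [Resource('SIGNOR'), Resource('HPRD')]
add_attrs(resources)  # returns None; the list items now carry the attributes
```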

collect_interaction(datasets: Iterable[Literal['pathway', 'pathway_noref', 'pathway_all', 'activity_flow', 'mirna_target', 'dorothea', 'tfregulons', 'omnipath', 'reaction_pc', 'enzyme_substrate', 'extra_directions', 'small_molecule_protein', 'tf_mirna', 'pathwaycommons', 'pathwaycommons_transcription', 'interaction', 'interaction_htp', 'interaction_misc', 'ligand_receptor', 'lncrna_target', 'transcription_onebyone', 'transcription_dorothea', 'ptm', 'ptm_noref', 'ptm_all', 'reaction', 'reaction_misc', 'negative']] | None = 'pathway', interaction_types: Iterable[Literal['post_translational', 'transcriptional', 'small_molecule_protein', 'post_transcriptional']] | None = 'post_translational', data_models: Iterable[Literal['activity_flow', 'interaction', 'enzyme_substrate', 'process_description', 'ligand_receptor', 'drug_target']] | None = 'activity_flow', license_purpose: Literal['academic', 'commercial', 'for-profit', 'non-profit', 'ignore'] = 'ignore', license_sharing: Literal['alike', 'free', 'noderiv', 'noshare', 'share', 'deriv', 'ignore'] = 'ignore', license_attrib: Literal['attrib', 'free', 'noattrib', 'composite', 'ignore'] = 'ignore', **kwargs) → dict

Collect network (interaction) resource definitions.

Parameters:
  • interaction_types – Include only these interaction types.

  • data_models – Include only these data models.

  • datasets – Process only these datasets. Note: there are many synonyms and overlaps among datasets. In addition, overlapping datasets might apply slightly different settings to the same resource: e.g. in pathway, interactions must have literature references, while in pathway_noref the same resources may allow interactions without references. It is safest to process only one dataset at a time and load the datasets into the Network object sequentially.

  • license_purpose – Include only resources that are legally compatible with the defined purpose.

  • license_sharing – Include only resources that allow the desired redistribution conditions. E.g. “deriv” means that the resources must allow the sharing of their derivative (altered) versions.

  • license_attrib – Include only resources that allow the desired level of attribution. E.g. “noattrib” means that you can use the resource without even mentioning who created it.

  • kwargs – Custom filters. Names should be attributes of the resource or the NetworkInput object. The special key __resource__ can be used to refer to the whole NetworkResource object. For simple values, the test is equality, for arrays incidence, while custom callables can be provided for more flexible filters.
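The kwargs filter semantics described above (equality for simple values, incidence for arrays, callables for flexible filters) can be sketched with a minimal, self-contained example; the `matches` helper and the `Dummy` class are illustrative stand-ins, not pypath internals:

```python
def matches(resource, **filters):
    # simple values: equality test; lists/tuples/sets: incidence
    # (membership) test; callables: applied to the attribute value
    for attr, expected in filters.items():
        value = getattr(resource, attr, None)
        if callable(expected):
            if not expected(value):
                return False
        elif isinstance(expected, (list, tuple, set)):
            if value not in expected:
                return False
        elif value != expected:
            return False
    return True

class Dummy:
    """Hypothetical stand-in for a NetworkResource / NetworkInput object."""

    def __init__(self, name, curated):
        self.name = name
        self.curated = curated

resources = [Dummy('SIGNOR', True), Dummy('STRING', False)]

curated_only = [r for r in resources if matches(r, curated=True)]
by_name = [r for r in resources if matches(r, name={'SIGNOR', 'HPRD'})]
```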

collect_network(datasets: Iterable[Literal['pathway', 'pathway_noref', 'pathway_all', 'activity_flow', 'mirna_target', 'dorothea', 'tfregulons', 'omnipath', 'reaction_pc', 'enzyme_substrate', 'extra_directions', 'small_molecule_protein', 'tf_mirna', 'pathwaycommons', 'pathwaycommons_transcription', 'interaction', 'interaction_htp', 'interaction_misc', 'ligand_receptor', 'lncrna_target', 'transcription_onebyone', 'transcription_dorothea', 'ptm', 'ptm_noref', 'ptm_all', 'reaction', 'reaction_misc', 'negative']] | None = 'pathway', interaction_types: Iterable[Literal['post_translational', 'transcriptional', 'small_molecule_protein', 'post_transcriptional']] | None = 'post_translational', data_models: Iterable[Literal['activity_flow', 'interaction', 'enzyme_substrate', 'process_description', 'ligand_receptor', 'drug_target']] | None = 'activity_flow', license_purpose: Literal['academic', 'commercial', 'for-profit', 'non-profit', 'ignore'] = 'ignore', license_sharing: Literal['alike', 'free', 'noderiv', 'noshare', 'share', 'deriv', 'ignore'] = 'ignore', license_attrib: Literal['attrib', 'free', 'noattrib', 'composite', 'ignore'] = 'ignore', **kwargs) → dict[source]

Collect network (interaction) resource definitions.

Parameters:
  • interaction_types – Include only these interaction types.

  • data_models – Include only these data models.

  • datasets – Process only these datasets. Note: there are many synonyms and overlaps among datasets. In addition, overlapping datasets might apply slightly different settings to the same resource: e.g. in pathway, interactions must have literature references, while in pathway_noref the same resources may allow interactions without references. It is safest to process only one dataset at a time and load the datasets into the Network object sequentially.

  • license_purpose – Include only resources that are legally compatible with the defined purpose.

  • license_sharing – Include only resources that allow the desired redistribution conditions. E.g. “deriv” means that the resources must allow the sharing of their derivative (altered) versions.

  • license_attrib – Include only resources that allow the desired level of attribution. E.g. “noattrib” means that you can use the resource without even mentioning who created it.

  • kwargs – Custom filters. Names should be attributes of the resource or the NetworkInput object. The special key __resource__ can be used to refer to the whole NetworkResource object. For simple values, the test is equality, for arrays incidence, while custom callables can be provided for more flexible filters.

license_filter(resources: list | dict, purpose: Literal['academic', 'commercial', 'for-profit', 'non-profit', 'ignore'] | None = None, sharing: Literal['alike', 'free', 'noderiv', 'noshare', 'share', 'deriv', 'ignore'] | None = None, attrib: Literal['attrib', 'free', 'noattrib', 'composite', 'ignore'] | None = None) → list | dict[source]

Filters a list of resources by their license.
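The sharing dimension of such a filter can be sketched as an ordering of permissiveness; the `SHARING_LEVELS` ranking and the `allows` helper below are simplified assumptions for illustration, not pypath's actual license model:

```python
# Hypothetical ranking of sharing levels, from most to least restrictive;
# 'free' allows everything, 'noshare' allows no redistribution.
SHARING_LEVELS = {
    'noshare': 0,
    'noderiv': 1,
    'share': 2,
    'alike': 2,   # share-alike: redistribution allowed under the same terms
    'deriv': 3,
    'free': 4,
}

def allows(resource_sharing, required):
    # a resource passes if its sharing level is at least as permissive
    # as the required one; 'ignore' disables the filter
    if required == 'ignore':
        return True
    return SHARING_LEVELS[resource_sharing] >= SHARING_LEVELS[required]

# illustrative license assignments (values are made up)
licenses = {'SIGNOR': 'alike', 'HPRD': 'noshare'}
usable = [name for name, level in licenses.items() if allows(level, 'share')]
```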

secondary_resources(name, postfix=False)[source]
Parameters:
  • name – Name of a composite resource.

  • postfix – Append the name of the primary resource to the secondary, separated by an underscore, e.g. “TFactS_CollecTRI”.
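The postfix naming described above can be sketched as follows; the `with_postfix` helper is a hypothetical illustration of the naming rule, not a pypath function:

```python
def with_postfix(secondary, primary, postfix=False):
    # mimic the documented behaviour: append the name of the primary
    # (composite) resource to the secondary, separated by an underscore
    return f'{secondary}_{primary}' if postfix else secondary

with_postfix('TFactS', 'CollecTRI', postfix=True)  # 'TFactS_CollecTRI'
```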

update(path=None, force=False, remove_old=False)[source]

Reads resource information from a JSON file.

Parameters:
  • path (str, NoneType) – Path to a JSON file with resource information. By default the path in :py:attr:`resource_info_path` is used, which points to the built-in resource information file.

  • force (bool) – Read the file again even if no new path is provided and it has already been read.

  • remove_old (bool) – Remove old data before reading. By default the data is updated with the contents of the new file, potentially overwriting identical keys in the old data.
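The remove_old semantics can be modelled with plain dicts; the real method reads the incoming data from a JSON file, and this sketch only illustrates the merge step with made-up values:

```python
# existing resource data and a freshly read file (illustrative contents)
old = {'SIGNOR': {'license': 'old'}, 'HPRD': {'license': 'academic'}}
new = {'SIGNOR': {'license': 'new'}}

def update(data, incoming, remove_old=False):
    if remove_old:
        data.clear()          # discard the old data before reading
    data.update(incoming)     # otherwise merge, overwriting identical keys
    return data

merged = update(dict(old), new)                     # HPRD kept, SIGNOR overwritten
replaced = update(dict(old), new, remove_old=True)  # only the new contents remain
```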