module documentation

A collection of CLI commands for working with Kedro micro-packages.

Function micropkg Commands for working with micro-packages.
Function micropkg_cli Undocumented
Function package_micropkg Package up a modular pipeline or micro-package as a Python source distribution.
Function pull_package Pull and unpack a modular pipeline and other micro-packages in your project.
Function safe_extract Undocumented
Function _append_package_reqs Appends micro-package requirements to project level requirements.txt
Function _check_module_path Undocumented
Function _create_nested_package Undocumented
Function _find_config_files Undocumented
Function _generate_manifest_file Undocumented
Function _generate_sdist_file Undocumented
Function _generate_setup_file Undocumented
Function _get_default_version Undocumented
Function _get_fsspec_filesystem Undocumented
Function _get_package_artifacts From existing package, returns in order: source_path, tests_path, config_path
Function _get_sdist_name Undocumented
Function _install_files Undocumented
Function _is_within_directory Undocumented
Function _make_install_requires Parses each line of requirements.txt into a version specifier valid to put in install_requires.
Function _move_package Move a Python package, refactoring relevant imports along the way. A target of empty string means moving to the root of the project.
Function _package_micropkg Undocumented
Function _package_micropkgs_from_manifest Undocumented
Function _pull_package Undocumented
Function _pull_packages_from_manifest Undocumented
Function _refactor_code_for_package In order to refactor the imports properly, we need to recreate the same nested structure as in the project...
Function _refactor_code_for_unpacking This is the reverse operation of _refactor_code_for_package...
Function _rename_files Undocumented
Function _rename_package Rename a Python package, refactoring relevant imports along the way, as well as references in comments.
Function _safe_parse_requirements Safely parse a requirement or set of requirements. This effectively replaces pkg_resources.parse_requirements, which blows up with a ValueError as soon as it encounters a requirement it cannot parse (e.g. -r requirements.txt)...
Function _sync_path_list Undocumented
Function _unpack_sdist Undocumented
Function _validate_dir Undocumented
Constant _SETUP_PY_TEMPLATE Undocumented
Type Alias _SourcePathType Undocumented
@micropkg_cli.group()
def micropkg(): (source)

Commands for working with micro-packages.

@click.group(name='Kedro')
def micropkg_cli(): (source)

Undocumented

@micropkg.command('package')
@env_option(help='Environment where the micro-package configuration lives. Defaults to `base`.')
@click.option('--alias', type=str, default='', callback=_check_pipeline_name, help='Alternative name to package under.')
@click.option('-d', '--destination', type=click.Path(resolve_path=True, file_okay=False), help='Location where to create the source distribution file. Defaults to `dist/`.')
@click.option('--all', '-a', 'all_flag', is_flag=True, help='Package all micro-packages in the `pyproject.toml` package manifest section.')
@click.argument('module_path', nargs=1, required=False, callback=_check_module_path)
@click.pass_obj
def package_micropkg(metadata: ProjectMetadata, module_path, env, alias, destination, all_flag): (source)

Package up a modular pipeline or micro-package as a Python source distribution.

@command_with_verbosity(micropkg, 'pull')
@click.argument('package_path', nargs=1, required=False)
@click.option('--all', '-a', 'all_flag', is_flag=True, help='Pull and unpack all micro-packages in the `pyproject.toml` package manifest section.')
@env_option(help='Environment to install the micro-package configuration to. Defaults to `base`.')
@click.option('--alias', type=str, default='', help='Rename the package.')
@click.option('-d', '--destination', type=click.Path(file_okay=False, dir_okay=False), default=None, help='Module location where to unpack under.')
@click.option('--fs-args', type=click.Path(exists=True, file_okay=True, dir_okay=False, readable=True, resolve_path=True), default=None, help='Location of a configuration file for the fsspec filesystem used to pull the package.')
@click.pass_obj
def pull_package(metadata: ProjectMetadata, package_path, env, alias, destination, fs_args, all_flag, **kwargs): (source)

Pull and unpack a modular pipeline and other micro-packages in your project.

def safe_extract(tar, path): (source)

Undocumented

def _append_package_reqs(requirements_txt: Path, package_reqs: List[str], package_name: str): (source)

Appends micro-package requirements to project level requirements.txt

def _check_module_path(ctx, param, value): (source)

Undocumented

def _create_nested_package(project: Project, package_path: Path) -> Path: (source)

Undocumented

def _find_config_files(source_config_dir: Path, glob_patterns: List[str]) -> List[Tuple[Path, str]]: (source)

Undocumented

def _generate_manifest_file(output_dir: Path): (source)

Undocumented

def _generate_sdist_file(micropkg_name: str, destination: Path, source_paths: Tuple[_SourcePathType, ...], version: str, metadata: ProjectMetadata, alias: str = None): (source)

Undocumented

def _generate_setup_file(package_name: str, version: str, install_requires: List[str], output_dir: Path) -> Path: (source)

Undocumented

def _get_default_version(metadata: ProjectMetadata, micropkg_module_path: str) -> str: (source)

Undocumented

def _get_fsspec_filesystem(location: str, fs_args: Optional[str]): (source)

Undocumented

def _get_package_artifacts(source_path: Path, package_name: str) -> Tuple[Path, Path, Path]: (source)

From existing package, returns in order: source_path, tests_path, config_path
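
The docstring only lists the returned paths; below is a minimal sketch of one plausible layout, assuming sources live under the package name, tests under tests, and configuration under <package_name>_config (these directory names are assumptions for illustration, not taken from this page):

from pathlib import Path
from typing import Tuple

def get_package_artifacts(source_path: Path, package_name: str) -> Tuple[Path, Path, Path]:
    # Assumed layout (illustrative only):
    #   <source_path>/<package_name>/         package sources
    #   <source_path>/tests/                  tests
    #   <source_path>/<package_name>_config/  configuration files
    return (
        source_path / package_name,
        source_path / "tests",
        source_path / f"{package_name}_config",
    )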

def _get_sdist_name(name, version): (source)

Undocumented

def _install_files(project_metadata: ProjectMetadata, package_name: str, source_path: Path, env: str = None, alias: str = None, destination: str = None): (source)

Undocumented

def _is_within_directory(directory, target): (source)

Undocumented

def _make_install_requires(requirements_txt: Path) -> List[str]: (source)

Parses each line of requirements.txt into a version specifier valid to put in install_requires.
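
A rough sketch of the idea, assuming each usable line is re-rendered as a requirement string and anything pkg_resources cannot parse is skipped (the function name is illustrative):

from pathlib import Path
from typing import List

import pkg_resources

def make_install_requires(requirements_txt: Path) -> List[str]:
    # Without a requirements file there is nothing to add to install_requires.
    if not requirements_txt.is_file():
        return []
    specifiers = []
    for line in requirements_txt.read_text().splitlines():
        line = line.strip()
        # Skip blank lines and comments.
        if not line or line.startswith("#"):
            continue
        try:
            # Re-render the parsed requirement as a canonical specifier string.
            specifiers.append(str(pkg_resources.Requirement.parse(line)))
        except ValueError:
            # Lines pkg_resources cannot parse (e.g. "-r other.txt") are dropped.
            continue
    return specifiers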

def _move_package(project: Project, source: str, target: str): (source)

Move a Python package, refactoring relevant imports along the way. A target of empty string means moving to the root of the project.

Parameters
project (Project): rope.base.Project holding the scope of the refactoring.
source (str): Name of the Python package to be moved. Can be a fully qualified module path relative to the project root, e.g. "package.pipelines.pipeline" or "package/pipelines/pipeline".
target (str): Destination of the Python package to be moved. Can be a fully qualified module path relative to the project root, e.g. "package.pipelines.pipeline" or "package/pipelines/pipeline".
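
The docstring above does not show the mechanics; below is a minimal sketch of how such a move could be expressed with rope's MoveModule refactoring, assuming the package already sits inside a rope Project rooted at a scratch directory (the paths are illustrative):

from rope.base.project import Project
from rope.refactor.move import MoveModule

project = Project("/tmp/refactor-scratch")  # illustrative project root

# Resource for the package folder to move, addressed relative to the project root.
source = project.get_resource("package/pipelines/pipeline")
# Moving to the project root mirrors the empty-string target described above.
destination = project.root

change = MoveModule(project, source).get_changes(dest=destination)
project.do(change)
project.close()
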
def _package_micropkg(micropkg_module_path: str, metadata: ProjectMetadata, alias: str = None, destination: str = None, env: str = None) -> Path: (source)

Undocumented

def _package_micropkgs_from_manifest(metadata: ProjectMetadata): (source)

Undocumented

def _pull_package(package_path: str, metadata: ProjectMetadata, env: str = None, alias: str = None, destination: str = None, fs_args: str = None): (source)

Undocumented

def _pull_packages_from_manifest(metadata: ProjectMetadata): (source)

Undocumented

def _refactor_code_for_package(project: Project, package_path: Path, tests_path: Path, alias: Optional[str], project_metadata: ProjectMetadata): (source)

In order to refactor the imports properly, we need to recreate the same nested structure as in the project. Therefore, we create:

<temp_dir>  # also the root of the Rope project
|__ <package_name>
    |__ __init__.py
    |__ <path_to_micro_package>
        |__ __init__.py
        |__ <micro_package>
            |__ __init__.py
|__ tests
    |__ __init__.py
    |__ <path_to_micro_package>
        |__ __init__.py
        |__ <micro_package>
            |__ __init__.py

We then move <micro_package> out of the package source to the top level ("") in temp_dir, and rename the folder and its imports if an alias is provided.

For tests, we need to extract all the contents of <micro_package> into a top-level tests folder. This is not possible in one go with the Rope API, so we have to do it in a slightly hacky way: we rename <micro_package> to a tmp_name and move it to the top level ("") in temp_dir, then remove the old tests folder and rename tmp_name to tests.

The final structure should be:

<temp_dir>  # also the root of the Rope project
|__ <micro_package>  # or <alias>
    |__ __init__.py
|__ tests  # only tests for <micro_package>
    |__ __init__.py
    |__ test.py
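
Put together, the sequence described above might look roughly like the following, reusing the _move_package and _rename_package helpers documented further down; the module path in the import and the example arguments are assumptions for illustration, not the actual implementation:

from pathlib import Path
from typing import Optional

from rope.base.project import Project

# Assumption: this page documents kedro.framework.cli.micropkg, so its private
# helpers can be imported from there for the purposes of this sketch.
from kedro.framework.cli.micropkg import _move_package, _rename_package

def refactor_for_packaging(temp_dir: Path, full_module_path: str, micro_package: str, alias: Optional[str]) -> None:
    # temp_dir mirrors the nested project layout described above.
    project = Project(str(temp_dir))
    try:
        # 1. Move <micro_package> from <package_name>/<path_to_micro_package>/
        #    to the top level ("") of the Rope project.
        _move_package(project, full_module_path, "")
        # 2. If an alias was provided, rename the folder and refactor its imports.
        if alias:
            _rename_package(project, micro_package, alias)
        # 3. Tests are extracted with the temporary-rename trick described above
        #    (rename to tmp_name, move to top level, swap with the old tests folder).
    finally:
        project.close()
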
def _refactor_code_for_unpacking(project: Project, package_path: Path, tests_path: Path, alias: Optional[str], destination: Optional[str], project_metadata: ProjectMetadata) -> Tuple[Path, Path]: (source)

This is the reverse operation of _refactor_code_for_package, i.e. we go from:

<temp_dir>  # also the root of the Rope project
|__ <micro_package>  # or <alias>
    |__ __init__.py
|__ tests  # only tests for <micro_package>
    |__ __init__.py
    |__ tests.py

to:

<temp_dir>  # also the root of the Rope project
|__ <package_name>
    |__ __init__.py
    |__ <path_to_micro_package>
        |__ __init__.py
        |__ <micro_package>
            |__ __init__.py
|__ tests
    |__ __init__.py
    |__ <path_to_micro_package>
        |__ __init__.py
        |__ <micro_package>
            |__ __init__.py
def _rename_files(conf_source: Path, old_name: str, new_name: str): (source)

Undocumented

def _rename_package(project: Project, old_name: str, new_name: str): (source)

Rename a Python package, refactoring relevant imports along the way, as well as references in comments.

Parameters
project (Project): rope.base.Project holding the scope of the refactoring.
old_name (str): Old module name. Can be a fully qualified module path, e.g. "package.pipelines.pipeline" or "package/pipelines/pipeline", relative to the project root.
new_name (str): New module name. Can't be a fully qualified module path.
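
A minimal sketch of how such a rename could be expressed with rope's Rename refactoring; docs=True asks rope to also rewrite occurrences in comments and docstrings, which matches the behaviour described above (the paths are illustrative):

from rope.base.project import Project
from rope.refactor.rename import Rename

project = Project("/tmp/refactor-scratch")  # illustrative project root

# Resource for the package folder to rename, relative to the project root.
folder = project.get_resource("data_processing")
# docs=True also updates references inside comments and docstrings.
change = Rename(project, folder).get_changes("new_name", docs=True)
project.do(change)
project.close()
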
def _safe_parse_requirements(requirements: Union[str, Iterable[str]]) -> Set[pkg_resources.Requirement]: (source)

Safely parse a requirement or set of requirements. This effectively replaces pkg_resources.parse_requirements, which blows up with a ValueError as soon as it encounters a requirement it cannot parse (e.g. -r requirements.txt). This way we can still extract all the parseable requirements out of a set containing some unparseable requirements.
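
The idea can be sketched as a per-line parse that skips anything pkg_resources cannot handle; this is a simplified illustration, not necessarily the exact implementation:

from typing import Iterable, Set, Union

import pkg_resources

def safe_parse_requirements(requirements: Union[str, Iterable[str]]) -> Set[pkg_resources.Requirement]:
    if isinstance(requirements, str):
        requirements = requirements.splitlines()
    parsed = set()
    for line in requirements:
        line = line.strip()
        # Blank lines and comments carry no requirement.
        if not line or line.startswith("#"):
            continue
        try:
            parsed.add(pkg_resources.Requirement.parse(line))
        except ValueError:
            # e.g. "-r requirements.txt" or other pip-only syntax: skip instead of failing.
            continue
    return parsed

Returning a set mirrors the declared return type Set[pkg_resources.Requirement] and naturally de-duplicates repeated entries.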

def _sync_path_list(source: List[Tuple[Path, str]], target: Path): (source)

Undocumented

def _unpack_sdist(location: str, destination: Path, fs_args: Optional[str]): (source)

Undocumented

def _validate_dir(path: Path): (source)

Undocumented

_SETUP_PY_TEMPLATE: str = (source)

Undocumented

Value
'''# -*- coding: utf-8 -*-
from setuptools import setup, find_packages

setup(
    name="{name}",
    version="{version}",
    description="Micro-package `{name}`",
...
_SourcePathType = (source)

Undocumented

Value
Union[Path, List[Tuple[Path, str]]]