diff --git a/README.md b/README.md index cb8ee3b..5063c63 100644 --- a/README.md +++ b/README.md @@ -57,7 +57,7 @@ kind: Function metadata: name: function-pythonic spec: - package: xpkg.upbound.io/crossplane-contrib/function-pythonic:v0.4.2 + package: xpkg.crossplane.io/crossplane-contrib/function-pythonic:v0.6.0 ``` ### Crossplane V1 @@ -69,7 +69,7 @@ kind: Function metadata: name: function-pythonic spec: - package: xpkg.upbound.io/crossplane-contrib/function-pythonic:v0.4.2 + package: xpkg.crossplane.io/crossplane-contrib/function-pythonic:v0.6.0 runtimeConfigRef: name: function-pythonic -- @@ -266,6 +266,7 @@ The BaseComposite class provides the following fields for manipulating the Compo | self.metadata | Map | The composite observed metadata | | self.spec | Map | The composite observed spec | | self.status | Map | The composite desired and observed status, read from observed if not in desired | +| self.output | Map | The step output, only used during Operations | | self.conditions | Conditions | The composite desired and observed conditions, read from observed if not in desired | | self.results | Results | Returned results applied to the Composite and optionally on the Claim | | self.connectionSecret | Map | The name, namespace, and resourceName to use when generating the connection secret in Crossplane v2 | @@ -280,17 +281,31 @@ The BaseComposite also provides access to the following Crossplane Function leve | self.request | Message | Low level direct access to the RunFunctionRequest message | | self.response | Message | Low level direct access to the RunFunctionResponse message | | self.logger | Logger | Python logger to log messages to the running function stdout | +| self.capabilities | Capabilities | This Crossplane version's Capabilities | | self.parameters | Map | The configured step parameters | | self.ttl | Integer | Get or set the response TTL, in seconds | | self.credentials | Credentials | The request credentials | | self.context | Map | The 
response context, initialized from the request context | | self.environment | Map | The response environment, initialized from the request context environment | | self.requireds | Requireds | Request and read additional local Kubernetes resources | +| self.schemas | Schemas | Request and read CustomResourceDefinition schemas | | self.resources | Resources | Define and process composed resources | | self.usages| Boolean | Generate Crossplane Usages for resource dependencies, default False | | self.autoReady | Boolean | Perform auto ready processing on all composed resources, default True | | self.unknownsFatal | Boolean | Terminate the composition if already created resources are assigned unknown values, default False | +### Capabilities + +The Capabilities of the Crossplane version calling function-pythonic. + +| Field | Type | Description | +| ----- | ---- | ----------- | +| bool(Capabilities) | Boolean | Whether or not the Crossplane version supports Capabilities | +| Capabilities.requireds | Boolean | Functions can return required resources and Crossplane will fetch the required resources | +| Capabilities.credentials | Boolean | Functions can receive credentials from secrets specified in the Composition | +| Capabilities.conditions | Boolean | Functions can return status conditions to be applied to the XR and optionally its claim | +| Capabilities.schemas | Boolean | Functions can request OpenAPI schemas and Crossplane will return them | + ### Composed Resources Creating and accessing composed resources is performed using the `BaseComposite.resources` field. @@ -324,7 +339,7 @@ Resource class: Creating and accessing required resources is performed using the `BaseComposite.requireds` field. `BaseComposite.requireds` is a dictionary of the required resources whose key is the required -resource name. The value returned when getting a required resource from BaseComposite is the +resource name.
The value returned when getting a required resource from BaseComposite is the following RequiredResources class: | Field | Type | Description | | ----- | ---- | ----------- | @@ -337,9 +352,6 @@ following RequiredResources class: | RequiredResources.matchName | String | The names to match when returning the required resources | | RequiredResources.matchLabels | Map | The labels to match when returning the required resources | -The current version of crossplane-sdk-python used by function-pythonic does not support namespace -selection. For now, use matchLabels and filter the results if required. - RequiredResources acts like a Python list to provide access to the found required resources. Each resource in the list is the following RequiredResource class: @@ -356,6 +368,22 @@ Each resource in the list is the following RequiredResource class: | RequiredResource.conditions | Map | The required resource conditions | | RequiredResource.connection | Map | The required resource connection details | +### Required Schemas + +Creating and accessing required schemas is performed using the `BaseComposite.schemas` field. +`BaseComposite.schemas` is a dictionary of the required schemas whose key is the required +schema name.
The value returned when getting a required schema from BaseComposite is the +following Schema class: + +| Field | Type | Description | +| ----- | ---- | ----------- | +| Schema(apiVersion,kind) | Schema | Reset the required schema and set the optional parameters | +| Schema.name | String | The required schema name | +| Schema.apiVersion | String | The required schema selector apiVersion | +| Schema.kind | String | The required schema selector kind | +| Schema.\_\_getitem\_\_ | Map | The required schema openAPIV3Schema | +| Schema.\_\_getattr\_\_ | Map | The required schema openAPIV3Schema | + ### Conditions The `BaseComposite.conditions`, `Resource.conditions`, and `RequiredResource.conditions` fields @@ -474,7 +502,7 @@ $ function-pythonic render --help usage: Crossplane Function Pythonic render [-h] [--debug] [--log-name-width WIDTH] [--logger-level LOGGER=LEVEL] [--python-path DIRECTORY] [--render-unknowns] [--allow-oversize-protos] [--crossplane-v1] [--kube-context CONTEXT] [--context-files KEY=PATH] [--context-values KEY=VALUE] [--observed-resources PATH] - [--required-resources PATH] [--secret-store PATH] [--include-full-xr] [--include-connection-xr] + [--required-resources PATH] [--required-schemas PATH] [--include-full-xr] [--include-connection-xr] [--include-function-results] [--include-context] COMPOSITE [COMPOSITION] @@ -506,8 +534,8 @@ options: A YAML file or directory of YAML files specifying the observed state of composed resources. --required-resources, -e PATH A YAML file or directory of YAML files specifying required resources to pass to the Function pipeline. - --secret-store, -s PATH - A YAML file or directory of YAML files specifying Secrets to use to resolve connections and credentials. + --required-schemas, -s PATH + A JSON file or directory of JSON files specifying required schemas to pass to the Function pipeline. --include-full-xr, -x Include a direct copy of the input XR's spec and metadata fields in the rendered output.
--include-connection-xr @@ -576,9 +604,15 @@ status: Most of the examples contain a `render.sh` command which uses `function-pythonic render` to render the example. -## ConfigMap Packages +## Shared Python Packages + +Python packages and modules can be added to the function-pythonic runtime +by including the python code in any of the following resources: ConfigMap, +Secret, EnvironmentConfig, or Composition. + +### ConfigMap Packages -ConfigMap based python packages are enable using the `--packages` and +ConfigMap based python packages are enabled using the `--packages-configmaps` and `--packages-namespace` command line options. ConfigMaps with the label `function-pythonic.package` will be incorporated in the python path at the location configured in the label value. For example, the following @@ -640,7 +674,7 @@ data: composite: example.pythonic.features.FeatureOneComposite ... ``` -This requires enabling the the packages support using the `--packages` command +This requires enabling the packages support using the `--packages-configmaps` command line option in the DeploymentRuntimeConfig and configuring the required Kubernetes RBAC permissions. For example: ```yaml @@ -649,7 +683,7 @@ kind: Function metadata: name: function-pythonic spec: - package: xpkg.upbound.io/crossplane-contrib/function-pythonic:v0.4.2 + package: xpkg.crossplane.io/crossplane-contrib/function-pythonic:v0.6.0 runtimeConfigRef: name: function-pythonic --- @@ -711,9 +745,71 @@ ClusterRole permissions. The `--packages-namespace` command line option will res to only using the supplied namespace. This option can be invoked multiple times. The above RBAC permission can then be per namespace RBAC Role permissions. +### Secret Packages + Secrets can also be used in an identical manner as ConfigMaps by enabling the `--packages-secrets` command line option. Secrets permissions need to be -added to the above RBAC configuration. +added to the above RBAC configuration.
Secret based python packages also enable +provisioning files with binary data. + +### EnvironmentConfig Packages + +EnvironmentConfig based provisioning enables an entire package and module +directory structure. Use the `--packages-environmentconfigs` command line option +and configure the ClusterRole RBAC access. +```yaml +apiVersion: apiextensions.crossplane.io/v1beta1 +kind: EnvironmentConfig +metadata: + name: test + labels: + function-pythonic.package: 'true' +data: + arootpackage: + asubpackage: + bmodule.py: | + def hello(where): + return f"Hello, {where}!" + amodule.py: | + def goodby(where): + return f"Goodby, {where}!" +``` +### Composition Packages + +Composition based provisioning works just like EnvironmentConfig where a +directory structure is created. Use the `--packages-compositions` command line option +and configure the ClusterRole RBAC access. The main reason to use Composition +based provisioning is that Compositions can be included in a Crossplane +Configuration Package. +```yaml +apiVersion: apiextensions.crossplane.io/v1 +kind: Composition +metadata: + labels: + function-pythonic.package: 'true' + name: test +spec: + compositeTypeRef: + apiVersion: code.pythoni.com/v1alpha1 + kind: Code + mode: Pipeline + pipeline: + - step: render + functionRef: + name: function-pythonic + input: + apiVersion: pythonic.fn.crossplane.io/v1alpha1 + kind: Composite + packages: + arootpackage: + asubpackage: + bmodule.py: | + def hello(where): + return f"Hello, {where}!" + amodule.py: | + def goodby(where): + return f"Goodby, {where}!"
+``` ## Step Parameters diff --git a/crossplane/pythonic/composite.py b/crossplane/pythonic/composite.py index 7083236..1a0f2f1 100644 --- a/crossplane/pythonic/composite.py +++ b/crossplane/pythonic/composite.py @@ -104,11 +104,13 @@ def __init__(self, crossplane_v1, request, logger): ) self.response = protobuf.Message(None, 'response', response.DESCRIPTOR, response) self.logger = logger + self.capabilities = Capabilities(self.request.meta.capabilities) self.parameters = self.request.input.parameters self.credentials = Credentials(self.request) self.context = self.response.context self.environment = self.context['apiextensions.crossplane.io/environment'] self.requireds = Requireds(self) + self.schemas = Schemas(self) self.resources = Resources(self) self.autoReady = True self.usages = False @@ -123,6 +125,7 @@ def __init__(self, crossplane_v1, request, logger): self.metadata = self.observed.metadata self.spec = self.observed.spec self.status = self.desired.status + self.output = self.response.output self.conditions = Conditions(observed, self.response) self.results = Results(self.response) self.events = Results(self.response) # Deprecated, use self.results @@ -136,6 +139,30 @@ async def compose(self): raise NotImplementedError() +class Capabilities: + def __init__(self, capabilities): + self._capabilities = capabilities + + def __bool__(self): + return fnv1.CAPABILITY_CAPABILITIES in self._capabilities + + @property + def requireds(self): + return fnv1.CAPABILITY_REQUIRED_RESOURCES in self._capabilities if self else None + + @property + def credentials(self): + return fnv1.CAPABILITY_CREDENTIALS in self._capabilities if self else None + + @property + def conditions(self): + return fnv1.CAPABILITY_CONDITIONS in self._capabilities if self else None + + @property + def schemas(self): + return fnv1.CAPABILITY_REQUIRED_SCHEMAS in self._capabilities if self else None + + class Credentials: def __init__(self, request): self.__dict__['_request'] = request @@ -558,6 
+585,131 @@ def __bool__(self): return bool(self.observed) +class Schemas: + def __init__(self, composite): + self._composite = composite + self._cache = {} + + def __getattr__(self, key): + return self[key] + + def __getitem__(self, key): + schema = self._cache.get(key) + if not schema: + schema = Schema(self._composite, key) + self._cache[key] = schema + return schema + + def __bool__(self): + return bool(len(self)) + + def __len__(self): + names = set() + for name, schema in self._composite.request.required_schemas: + names.add(name) + for name, selector in self._composite.response.requirements.schemas: + names.add(name) + return len(names) + + def __contains__(self, key): + if key in self._composite.request.required_schemas: + return True + if key in self._composite.response.requirements.schemas: + return True + return False + + def __iter__(self): + names = set() + for name, schema in self._composite.request.required_schemas: + names.add(name) + for name, selector in self._composite.response.requirements.schemas: + names.add(name) + for name in sorted(names): + yield name, self[name] + + +class Schema: + def __init__(self, composite, name): + self.name = name + self._selector = composite.response.requirements.schemas[name] + self._schema = composite.request.required_schemas[name].openapi_v3 + + def __call__(self, kind=_notset, apiVersion=_notset): + self._selector() + if kind != _notset: + # Allow for apiVersion in the first arg and kind in the second arg + if '/' in kind or kind == 'v1': + if apiVersion != _notset: + self.kind = apiVersion + apiVersion = kind + else: + self.kind = kind + if apiVersion != _notset: + self.apiVersion = apiVersion + return self + + @property + def apiVersion(self): + return self._selector.api_version + + @apiVersion.setter + def apiVersion(self, apiVersion): + self._selector.api_version = apiVersion + + @property + def kind(self): + return self._selector.kind + + @kind.setter + def kind(self, kind): + self._selector.kind = kind + 
+ def __enter__(self): + return self + + def __exit__(self, exc_type, exc_value, traceback): + pass + + async def __aenter__(self): + return self + + async def __aexit__(self, exc_type, exc_value, traceback): + pass + + def __getattr__(self, key): + return self[key] + + def __getitem__(self, key): + return self._schema[key] + + def __bool__(self): + return bool(self._schema) + + def __len__(self): + return len(self._schema) + + def __contains__(self, item): + return item in self._schema + + def __iter__(self): + for key, value in self._schema: + yield key, value + + def __hash__(self): + return hash(self._schema) + + def __eq__(self, other): + if isinstance(other, Schema): + other = other._schema + return self._schema == other + + def __str__(self): + return str(self._schema) + + def __format__(self, spec='yaml'): + return format(self._schema, spec) + + class Conditions: def __init__(self, observed, response=None): self._observed = observed diff --git a/crossplane/pythonic/function.py b/crossplane/pythonic/function.py index a2de616..c1abe38 100644 --- a/crossplane/pythonic/function.py +++ b/crossplane/pythonic/function.py @@ -121,8 +121,13 @@ async def run_function(self, request): except Exception as e: return self.fatal(request, logger, 'Compose', e) - if requireds := self.get_requireds(step, composite): - logger.debug(f"Requireds requested: {','.join(requireds)}") + schemas = self.get_schemas(step, composite) + requireds = self.get_requireds(step, composite) + if schemas or requireds: + if schemas: + logger.debug(f"Required schemas: {','.join(schemas)}") + if requireds: + logger.debug(f"Required resources: {','.join(requireds)}") else: self.process_usages(composite) self.process_unknowns(composite) @@ -155,11 +160,21 @@ def fatal(self, request, logger, message, exception=None): ] ) + def get_schemas(self, step, composite): + schemas = [] + for name, schema in composite.schemas: + if len(schema.kind) and len(schema.apiVersion): + s = pythonic.Map(kind=schema.kind,
apiVersion=schema.apiVersion) + if s != step.schemas[name]: + step.schemas[name] = s + schemas.append(name) + return schemas + def get_requireds(self, step, composite): requireds = [] for name, required in composite.requireds: - if len(required.apiVersion) and len(required.kind): - r = pythonic.Map(apiVersion=required.apiVersion, kind=required.kind) + if len(required.kind) and len(required.apiVersion): + r = pythonic.Map(kind=required.kind, apiVersion=required.apiVersion) if len(required.namespace): r.namespace = required.namespace if len(required.matchName): diff --git a/crossplane/pythonic/grpc.py b/crossplane/pythonic/grpc.py index 2439d59..04551a3 100644 --- a/crossplane/pythonic/grpc.py +++ b/crossplane/pythonic/grpc.py @@ -45,6 +45,12 @@ def add_parser_arguments(cls, parser): parser.add_argument( '--packages', action='store_true', + dest='packages_configmaps', + help='Discover python packages from function-pythonic ConfigMaps, deprecated, use --packages-configmaps' + ) + parser.add_argument( + '--packages-configmaps', + action='store_true', help='Discover python packages from function-pythonic ConfigMaps.' ) parser.add_argument( @@ -57,7 +63,17 @@ def add_parser_arguments(cls, parser): action='append', default=[], metavar='NAMESPACE', - help='Namespaces to discover function-pythonic ConfigMaps in, default is cluster wide.', + help='Namespaces to discover function-pythonic ConfigMaps and Secrets in, default is cluster wide.', + ) + parser.add_argument( + '--packages-environmentconfigs', + action='store_true', + help='Also discover python packages from function-pythonic EnvironmentConfigs.' + ) + parser.add_argument( + '--packages-compositions', + action='store_true', + help='Also discover python packages from function-pythonic Compositions.'
) parser.add_argument( '--packages-dir', @@ -75,7 +91,9 @@ def initialize(self): if not self.args.tls_certs_dir and not self.args.insecure: print('Either --tls-certs-dir or --insecure must be specified', file=sys.stderr) sys.exit(1) - + if (self.args.packages_environmentconfigs or self.args.packages_compositions) and self.args.packages_namespace: + print('--packages-namespace cannot be used with --packages-environmentconfigs or --packages-compositions', file=sys.stderr) + sys.exit(1) if self.args.pip_install: import pip._internal.cli.main pip._internal.cli.main.main(['install', '--user', *shlex.split(self.args.pip_install)]) @@ -108,15 +126,18 @@ async def run(self): ) await grpc_server.start() - if self.args.packages: + if self.args.packages_configmaps or self.args.packages_secrets or self.args.packages_environmentconfigs or self.args.packages_compositions: from . import packages async with asyncio.TaskGroup() as tasks: tasks.create_task(grpc_server.wait_for_termination()) tasks.create_task(packages.operator( grpc_server, grpc_runner, + self.args.packages_configmaps, self.args.packages_secrets, self.args.packages_namespace, + self.args.packages_environmentconfigs, + self.args.packages_compositions, self.args.packages_dir, )) else: diff --git a/crossplane/pythonic/packages.py b/crossplane/pythonic/packages.py index e55e4a7..67993c1 100644 --- a/crossplane/pythonic/packages.py +++ b/crossplane/pythonic/packages.py @@ -10,27 +10,38 @@ GRPC_SERVER = None GRPC_RUNNER = None PACKAGES_DIR = None -PACKAGE_LABEL = {'function-pythonic.package': kopf.PRESENT} +PACKAGE_LABEL = 'function-pythonic.package' +PACKAGE_LABELS = {PACKAGE_LABEL: kopf.PRESENT} -def operator(grpc_server, grpc_runner, packages_secrets, packages_namespaces, packages_dir): +def operator(grpc_server, grpc_runner, packages_configmaps, packages_secrets, packages_namespaces, packages_environmentconfigs, packages_compositions, packages_dir): logging.getLogger('kopf.objects').setLevel(logging.INFO) global
GRPC_SERVER, GRPC_RUNNER, PACKAGES_DIR GRPC_SERVER = grpc_server GRPC_RUNNER = grpc_runner PACKAGES_DIR = pathlib.Path(packages_dir).expanduser().resolve() sys.path.insert(0, str(PACKAGES_DIR)) + if packages_configmaps: + on_resource('', 'v1', 'configmaps') if packages_secrets: - kopf.on.create('', 'v1', 'secrets', labels=PACKAGE_LABEL)(create) - kopf.on.resume('', 'v1', 'secrets', labels=PACKAGE_LABEL)(create) - kopf.on.update('', 'v1', 'secrets', labels=PACKAGE_LABEL)(update) - kopf.on.delete('', 'v1', 'secrets', labels=PACKAGE_LABEL)(delete) + on_resource('', 'v1', 'secrets') + if not packages_namespaces: + if packages_environmentconfigs: + on_resource('apiextensions.crossplane.io', 'v1beta1', 'environmentconfigs') + if packages_compositions: + on_resource('apiextensions.crossplane.io', 'v1', 'compositions') return kopf.operator( standalone=True, clusterwide=not packages_namespaces, namespaces=packages_namespaces, ) +def on_resource(group, version, plural): + kopf.on.create(group, version, plural, labels=PACKAGE_LABELS)(create) + kopf.on.resume(group, version, plural, labels=PACKAGE_LABELS)(create) + kopf.on.update(group, version, plural, labels=PACKAGE_LABELS)(update) + kopf.on.delete(group, version, plural, labels=PACKAGE_LABELS)(delete) + @kopf.on.startup() async def startup(settings, **_): @@ -42,107 +53,128 @@ async def cleanup(**_): await GRPC_SERVER.stop(5) -@kopf.on.create('', 'v1', 'configmaps', labels=PACKAGE_LABEL) -@kopf.on.resume('', 'v1', 'configmaps', labels=PACKAGE_LABEL) -async def create(body, logger, **_): - package_dir = get_package_dir(body, logger) - if package_dir: - secret = body['kind'] == 'Secret' - for name, text in body.get('data', {}).items(): - package_file_write(package_dir, name, secret, text, 'Created', logger) - - -@kopf.on.update('', 'v1', 'configmaps', labels=PACKAGE_LABEL) -async def update(body, old, logger, **_): - old_package_dir = get_package_dir(old) - if old_package_dir: - old_data = old.get('data', {}) - else: - 
old_data = {} - old_names = set(old_data.keys()) - package_dir = get_package_dir(body, logger) - if package_dir: - secret = body['kind'] == 'Secret' - for name, text in body.get('data', {}).items(): - if package_dir == old_package_dir and text == old_data.get(name, None): - action = 'Unchanged' - else: - action = 'Updated' if package_dir == old_package_dir and name in old_names else 'Created' - package_file_write(package_dir, name, secret, text, action, logger) - if package_dir == old_package_dir: - old_names.discard(name) - if old_package_dir: - for name in old_names: - package_file_unlink(old_package_dir, name, 'Removed', logger) - - -@kopf.on.delete('', 'v1', 'configmaps', labels=PACKAGE_LABEL) -async def delete(old, logger, **_): - package_dir = get_package_dir(old) - if package_dir: - for name in old.get('data', {}).keys(): - package_file_unlink(package_dir, name, 'Deleted', logger) - - -def get_package_dir(body, logger=None): - package = body.get('metadata', {}).get('labels', {}).get('function-pythonic.package', None) +async def create(resource, labels, body, logger, **_): + resource_create(resource, labels, 'Created', body, logger) + + +async def update(resource, labels, body, old, logger, **_): + resource_delete(resource, labels, 'Removed', old, logger) + resource_create(resource, labels, 'Added', body, logger) + + +async def delete(resource, labels, body, logger, **_): + resource_delete(resource, labels, 'Deleted', body, logger) + + +def resource_create(resource, labels, action, body, logger): + package_dir = resource_package_dir(resource, labels, logger) + if not package_dir: + return + if resource.plural in ('configmaps', 'secrets', 'environmentconfigs'): + package_create(resource, action, package_dir, body.get('data', {}), logger) + elif resource.plural == 'compositions': + for step in body.get('spec', {}).get('pipeline', []): + input = step.get('input') + if input and input.get('apiVersion') == 'pythonic.fn.crossplane.io/v1alpha1': + 
package_create(resource, action, package_dir, input.get('packages', {}), logger) + + +def resource_delete(resource, labels, action, body, logger): + package_dir = resource_package_dir(resource, labels, logger) + if not package_dir: + return + if resource.plural in ('configmaps', 'secrets', 'environmentconfigs'): + package_delete(action, package_dir, body.get('data', {}), logger) + elif resource.plural == 'compositions': + for step in body.get('spec', {}).get('pipeline', []): + input = step.get('input') + if input and input.get('apiVersion') == 'pythonic.fn.crossplane.io/v1alpha1': + package_delete(action, package_dir, step.get('input', {}).get('packages', {}), logger) + + +def resource_package_dir(resource, labels, logger): + package = labels.get(PACKAGE_LABEL) if package is None: if logger: - logger.error('function-pythonic.package label is missing') + logger.error(f"{PACKAGE_LABEL} label is missing") return None package_dir = PACKAGES_DIR - if package: + if resource.plural in ('configmaps', 'secrets') and package: for segment in package.split('.'): if not segment.isidentifier(): - if logger: - logger.error('Package has invalid package name: %s', package) + logger.error('Package has invalid package name: %s', package) return None package_dir = package_dir / segment return package_dir -def package_file_write(package_dir, name, secret, text, action, logger): - package_file = package_dir / name - if action != 'Unchanged': - package_file.parent.mkdir(parents=True, exist_ok=True) - if secret: - package_file.write_bytes(base64.b64decode(text.encode('utf-8'))) - else: - package_file.write_text(text) - module, name = package_file_name(package_file) - if module: - if action != 'Unchanged': - GRPC_RUNNER.invalidate_module(name) - logger.info(f"{action} module: {name}") - else: - logger.info(f"{action} file: {name}") - - -def package_file_unlink(package_dir, name, action, logger): - package_file = package_dir / name - package_file.unlink(missing_ok=True) - module, name = 
package_file_name(package_file) - if module: - GRPC_RUNNER.invalidate_module(name) - logger.info(f"{action} module: {name}") - else: - logger.info(f"{action} file: {name}") - package_dir = package_file.parent - while ( - package_dir.is_relative_to(PACKAGES_DIR) - and package_dir.is_dir() - and not list(package_dir.iterdir()) - ): - package_dir.rmdir() - module = str(package_dir.relative_to(PACKAGES_DIR)).replace('/', '.') - if module != '.': - GRPC_RUNNER.invalidate_module(module) - logger.info(f"{action} package: {module}") - package_dir = package_dir.parent - - -def package_file_name(package_file): - name = str(package_file.relative_to(PACKAGES_DIR)) +def package_create(resource, action, package_dir, package, logger): + for name, value in package.items(): + if validate_entry(name, value, logger): + package_name = package_dir / name + if isinstance(value, str): + package_name.parent.mkdir(parents=True, exist_ok=True) + if resource.plural == 'secrets': + package_name.write_bytes(base64.b64decode(value.encode('utf-8'))) + else: + package_name.write_text(value) + module, name = package_file_name(package_name) + if module: + GRPC_RUNNER.invalidate_module(name) + logger.info(f"{action} module: {name}") + else: + logger.info(f"{action} file: {name}") + elif isinstance(value, dict): + package_create(resource, action, package_name, value, logger) + + +def package_delete(action, package_dir, package, logger): + for name, value in package.items(): + if validate_entry(name, value, logger): + package_name = package_dir / name + if isinstance(value, str): + package_name.unlink(missing_ok=True) + module, name = package_file_name(package_name) + if module: + GRPC_RUNNER.invalidate_module(name) + logger.info(f"{action} module: {name}") + else: + logger.info(f"{action} file: {name}") + parent = package_name.parent + while ( + parent.is_relative_to(PACKAGES_DIR) + and parent.is_dir() + and not list(parent.iterdir()) + ): + parent.rmdir() + module = 
str(parent.relative_to(PACKAGES_DIR)).replace('/', '.') + if module != '.': + GRPC_RUNNER.invalidate_module(module) + logger.info(f"{action} package: {module}") + parent = parent.parent + elif isinstance(value, dict): + package_delete(action, package_name, value, logger) + + +def validate_entry(name, value, logger): + if isinstance(value, str): + if not name.endswith('.py'): + if '.' in name or '/' in name: + logger.error(f"Python package file name is not valid: {name}") + return False + return True + name = name[:-3] + elif not isinstance(value, dict): + logger.error(f"Python package \"{name}\" value is not a valid type: {value.__class__}") + return False + if name.isidentifier(): + return True + logger.error(f"Python package name is not an identifier: {name}") + return False + + +def package_file_name(package_name): + name = str(package_name.relative_to(PACKAGES_DIR)) if name.endswith('.py'): return True, name[:-3].replace('/', '.') return False, name diff --git a/crossplane/pythonic/protobuf.py b/crossplane/pythonic/protobuf.py index 520d93a..85862ca 100644 --- a/crossplane/pythonic/protobuf.py +++ b/crossplane/pythonic/protobuf.py @@ -986,6 +986,8 @@ def __call__(self, *args, **kwargs): elif len(args): for key in range(len(args)): self[key] = args[key] + else: + self._ensure_map() return self def __setattr__(self, key, value): diff --git a/crossplane/pythonic/render.py b/crossplane/pythonic/render.py index a1c31db..fedb8dd 100644 --- a/crossplane/pythonic/render.py +++ b/crossplane/pythonic/render.py @@ -3,6 +3,7 @@ import importlib import inflect import inspect +import json import kr8s import logging import pathlib @@ -12,10 +13,10 @@ from . 
import ( command, - composite as composite_module, function, protobuf, ) +from .composite import BaseComposite INFLECT = inflect.engine() INFLECT.classical(all=False) @@ -58,7 +59,7 @@ def add_parser_arguments(cls, parser): action='append', default=[], metavar='KEY=VALUE', - help='Context key-value pairs to pass to the Function pipeline. Values must be sYAML/JSON. Keys take precedence over --context-files.', + help='Context key-value pairs to pass to the Function pipeline. Values must be YAML/JSON. Keys take precedence over --context-files.', ) parser.add_argument( '--observed-resources', '-o', @@ -77,12 +78,12 @@ def add_parser_arguments(cls, parser): help='A YAML file or directory of YAML files specifying required resources to pass to the Function pipeline.', ) parser.add_argument( - '--secret-store', '-s', + '--required-schemas', '-s', action='append', type=pathlib.Path, default=[], metavar='PATH', - help='A YAML file or directory of YAML files specifying Secrets to use to resolve connections and credentials.', + help='A JSON file or directory of JSON files specifying required schemas to pass to the Function pipeline.', ) parser.add_argument( '--include-full-xr', '-x', @@ -119,11 +120,10 @@ async def run(self): observed = self.collect_resources(self.args.observed_resources) composition = await self.setup_composition(composite, api) resources = self.collect_resources(self.args.required_resources) - resources += self.collect_resources(self.args.secret_store) - resources.sort(key=lambda resource: str(resource.metadata.name)) + schemas = self.collect_schemas(self.args.required_schemas) context = self.setup_context() - render = await self.render(composite, observed, composition, resources, context, api, self.args.render_unknowns, self.args.crossplane_v1) + render = await self.render(composite, observed, composition, resources, schemas, context, api, self.args.render_unknowns, self.args.crossplane_v1) if not render: sys.exit(1) @@ -212,7 +212,7 @@ async def 
setup_composition(self, composite, api=None): if not inspect.isclass(clazz): print(f"Composition class {self.args.composition} is not a class", file=sys.stderr) sys.exit(1) - if not issubclass(clazz, composite_module.BaseComposite): + if not issubclass(clazz, BaseComposite): print(f"Composition class {self.args.composition} is not a subclass of BaseComposite", file=sys.stderr) sys.exit(1) return self.create_composition(composite, self.args.composition) @@ -246,8 +246,25 @@ def collect_resources(self, entries): else: print(f"Specified resource is not a file or a directory: {entry}", file=sys.stderr) sys.exit(1) + resources.sort(key=lambda resource: str(resource.metadata.name)) return resources + def collect_schemas(self, entries): + schemas = [] + for entry in entries: + if entry.is_file(): + document = json.loads(entry.read_text()) + schemas.append(protobuf.Value(None, None, document)) + elif entry.is_dir(): + for file in entry.iterdir(): + if file.suffix == '.json': + document = json.loads(file.read_text()) + schemas.append(protobuf.Value(None, None, document)) + else: + print(f"Specified resource is not a file or a directory: {entry}", file=sys.stderr) + sys.exit(1) + return schemas + def setup_context(self): # Load the request context with any specified command line options. context = protobuf.Map() @@ -269,7 +286,7 @@ def setup_context(self): context[key_value[0]] = protobuf.Yaml(key_value[1]) return context - async def render(self, composite, observed=[], composition=None, resources=[], context=None, api=None, render_unknowns=False, crossplane_v1=False, composite_observeds=True): + async def render(self, composite, observed=[], composition=None, resources=[], schemas=[], context=None, api=None, render_unknowns=False, crossplane_v1=False, composite_observeds=True): # Create the request used when running Composition steps. 
request = protobuf.Message(None, 'request', fnv1.RunFunctionRequest.DESCRIPTOR, fnv1.RunFunctionRequest()) if context is not None: @@ -362,6 +379,10 @@ async def render(self, composite, observed=[], composition=None, resources=[], c request.extra_resources() for name, selector in requirements.extra_resources: await self.set_required(name, selector, request.extra_resources, resources, api) + # Fetch the schemas requested. + request.required_schemas() + for name, selector in requirements.schemas: + await self.set_schema(name, selector, request.required_schemas, schemas, api) # Run the step using the function-pythonic function runner. response = protobuf.Message( None, @@ -546,6 +567,81 @@ async def set_resource(self, source, destination, resources=[], api=None): for key, value in connection.data: destination.connection_details[key] = protobuf.B64Decode(value) + async def set_schema(self, name, selector, schemas, documents=[], api=None): + if not name: + return + name = str(name) + schema = schemas[name].openapi_v3 + schema() # Force this to get created + gvk = protobuf.Map(kind=selector.kind) + version = str(selector.api_version) + if '/' in version: + gvk.group, gvk.version = version.split('/', 1) + else: + gvk.group = '' + gvk.version = version + for document in documents: + if self.find_schema(gvk, document, schema): + return + if api: + if gvk.group == '': + url = f"api/{gvk.version}" + else: + url = f"apis/{gvk.group}/{gvk.version}" + try: + async with api.call_api(base='/openapi/v3', version='', url=url) as response: + document = protobuf.Value(None, None, response.json()) + except kr8s.NotFoundError: + return + self.find_schema(gvk, document, schema) + + def find_schema(self, gvk, document, schema): + for name, s in document.components.schemas: + gvks = s['x-kubernetes-group-version-kind'] + if len(gvks) == 1 and gvks[0] == gvk: + self.resolve_ref(document, set(), f"#/components/schemas/{name}", schema) + return True + return False + + def resolve_ref(self, 
document, visiting, ref, schema): + if not ref: + return + ref = str(ref) + if ref in visiting: + return + d = None + for segment in ref.split('/'): + if segment == '#': + d = document + else: + d = d[segment] + if not d: + return + visiting.add(ref) + try: + for name, value in d: + self.copy_schema(document, visiting, name, value, schema) + finally: + visiting.remove(ref) + + def copy_schema(self, document, visiting, key, value, schema): + if key == '$ref': + self.resolve_ref(document, visiting, value, schema) + elif key == 'allOf': + if value._isList and len(value) == 1: + self.resolve_ref(document, visiting, value[0]['$ref'], schema) + else: + if value._isMap: + s = schema[key] + for n, v in value: + self.copy_schema(document, visiting, n, v, s) + elif value._isList: + s = schema[key] + for ix, v in enumerate(value): + self.copy_schema(document, visiting, ix, v, s) + else: + schema[key] = value + def copy_resource(self, source, destination): destination.resource = source.resource destination.connection_details() diff --git a/examples/aks-cluster/cluster-function-pythonic.yaml b/examples/aks-cluster/cluster-function-pythonic.yaml index f04a106..17dc736 100644 --- a/examples/aks-cluster/cluster-function-pythonic.yaml +++ b/examples/aks-cluster/cluster-function-pythonic.yaml @@ -3,7 +3,7 @@ kind: Function metadata: name: function-pythonic spec: - package: xpkg.upbound.io/crossplane-contrib/function-pythonic:v0.4.2 + package: xpkg.crossplane.io/crossplane-contrib/function-pythonic:v0.6.0 runtimeConfigRef: name: function-pythonic --- diff --git a/examples/function-go-templating/context/composition.yaml b/examples/function-go-templating/context/composition.yaml index ff189b5..bf8f237 100644 --- a/examples/function-go-templating/context/composition.yaml +++ b/examples/function-go-templating/context/composition.yaml @@ -20,7 +20,7 @@ spec: def compose(self): self.environment.complex = self.requireds['example-config']( 'EnvironmentConfig', - 
'apiextensions.crossplane.io/v1alpha1', + 'apiextensions.crossplane.io/v1beta1', name='example-config', )[0].data.complex self.environment.update = 'environment' diff --git a/examples/function-go-templating/context/resources.yaml b/examples/function-go-templating/context/resources.yaml index 1604f2a..ba9deb9 100644 --- a/examples/function-go-templating/context/resources.yaml +++ b/examples/function-go-templating/context/resources.yaml @@ -1,4 +1,4 @@ -apiVersion: apiextensions.crossplane.io/v1alpha1 +apiVersion: apiextensions.crossplane.io/v1beta1 kind: EnvironmentConfig metadata: name: example-config diff --git a/package/input-definition.yaml b/package/input-definition.yaml index f89044e..12f7acf 100644 --- a/package/input-definition.yaml +++ b/package/input-definition.yaml @@ -46,6 +46,7 @@ spec: type: string parameters: type: object + description: Parameters available to the Composite implementation x-kubernetes-preserve-unknown-fields: true composite: type: string @@ -53,5 +54,9 @@ spec: inlined: type: string description: Inlined Composition, the python module is retrieved from the Composite's spec field specified + packages: + type: object + description: Packages added to the python path if enabled + x-kubernetes-preserve-unknown-fields: true served: true storage: true diff --git a/pyproject.toml b/pyproject.toml index 1c204b8..9e061e5 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -56,7 +56,7 @@ only-include = [ features = ["packages"] type = "virtual" path = ".venv-default" -dependencies = ["ipython==9.1.0"] +dependencies = ["ipython==9.11.0"] packages = ["crossplane"] [tool.hatch.envs.default.scripts] production = "python -m crossplane.pythonic.main grpc --insecure" @@ -68,7 +68,7 @@ packages = "python -m crossplane.pythonic.main grpc --insecure --debug --render- type = "virtual" detached = true path = ".venv-lint" -dependencies = ["ruff==0.11.5"] +dependencies = ["ruff==0.15.7"] packages = ["crossplane"] [tool.hatch.envs.lint.scripts] check = "ruff 
check crossplane tests && ruff format --diff crossplane tests" @@ -78,9 +78,9 @@ type = "virtual" path = ".venv-test" dependencies = [ "grpcio-tools==1.76.0", - "pytest==8.4.1", - "pytest-asyncio==1.1.0", - "pytest-cov==6.2.1", + "pytest==9.0.2", + "pytest-asyncio==1.3.0", + "pytest-cov==7.1.0", ] packages = ["crossplane"] [tool.hatch.envs.test.scripts] @@ -89,7 +89,7 @@ protobuf = "python -m pytest tests/test_protobuf_*.py -x --verbose --verbose --c ci = "python -m pytest tests --verbose --verbose --junitxml=reports/pytest-junit.xml --cov --cov-report=term --cov-report=xml:reports/pytest-coverage.xml" [tool.ruff] -target-version = "py311" +target-version = "py314" exclude = ["crossplane/pythonic/proto/*"] [tool.ruff.lint] diff --git a/tests/examples/function-go-templating/context.yaml b/tests/examples/function-go-templating/context.yaml index 2667e66..4e88afd 100644 --- a/tests/examples/function-go-templating/context.yaml +++ b/tests/examples/function-go-templating/context.yaml @@ -26,7 +26,7 @@ context: iteration: 2 requireds: example-config: - apiVersion: apiextensions.crossplane.io/v1alpha1 + apiVersion: apiextensions.crossplane.io/v1beta1 kind: EnvironmentConfig matchName: example-config apiextensions.crossplane.io/environment: diff --git a/tests/test_packages.py b/tests/test_packages.py new file mode 100644 index 0000000..fca74cc --- /dev/null +++ b/tests/test_packages.py @@ -0,0 +1,489 @@ +import base64 +import importlib +import logging +import sys +import types +from types import SimpleNamespace + +import pytest + + +class DummyRunner: + def __init__(self): + self.invalidated = [] + + def invalidate_module(self, name): + self.invalidated.append(name) + + +def resource(plural): + return SimpleNamespace(plural=plural) + + +class FakeKopfOn: + def __init__(self): + self.registrations = { + 'cleanup': [], + 'create': [], + 'delete': [], + 'resume': [], + 'startup': [], + 'update': [], + } + + def _register(self, event, *args, **kwargs): + def decorator(fn): + 
self.registrations[event].append((args, kwargs, fn)) + return fn + + return decorator + + def cleanup(self, *args, **kwargs): + return self._register('cleanup', *args, **kwargs) + + def create(self, *args, **kwargs): + return self._register('create', *args, **kwargs) + + def delete(self, *args, **kwargs): + return self._register('delete', *args, **kwargs) + + def resume(self, *args, **kwargs): + return self._register('resume', *args, **kwargs) + + def startup(self, *args, **kwargs): + return self._register('startup', *args, **kwargs) + + def update(self, *args, **kwargs): + return self._register('update', *args, **kwargs) + + +class FakeKopf(types.ModuleType): + def __init__(self): + super().__init__('kopf') + self.PRESENT = object() + self.on = FakeKopfOn() + self.operator_calls = [] + + def operator(self, *args, **kwargs): + self.operator_calls.append((args, kwargs)) + return {'args': args, 'kwargs': kwargs} + + +@pytest.fixture +def packages_module(monkeypatch, tmp_path): + fake_kopf = FakeKopf() + monkeypatch.setitem(sys.modules, 'kopf', fake_kopf) + sys.modules.pop('crossplane.pythonic.packages', None) + packages = importlib.import_module('crossplane.pythonic.packages') + + monkeypatch.setattr(packages, 'PACKAGES_DIR', tmp_path.resolve()) + monkeypatch.setattr(packages, 'GRPC_RUNNER', DummyRunner()) + monkeypatch.setattr(packages, 'GRPC_SERVER', None) + monkeypatch.setattr(sys, 'path', sys.path.copy()) + + yield packages, fake_kopf + + sys.modules.pop('crossplane.pythonic.packages', None) + + +def test_operator_registers_resources_and_sets_global_state( + packages_module, + tmp_path, + monkeypatch, +): + packages, fake_kopf = packages_module + on_resource_calls = [] + server = object() + runner = DummyRunner() + + monkeypatch.setattr( + packages, + 'on_resource', + lambda *args: on_resource_calls.append(args), + ) + + result = packages.operator( + server, + runner, + True, + True, + None, + True, + True, + str(tmp_path), + ) + + assert packages.GRPC_SERVER is 
server + assert packages.GRPC_RUNNER is runner + assert packages.PACKAGES_DIR == tmp_path.resolve() + assert sys.path[0] == str(tmp_path.resolve()) + assert on_resource_calls == [ + ('', 'v1', 'configmaps'), + ('', 'v1', 'secrets'), + ('apiextensions.crossplane.io', 'v1beta1', 'environmentconfigs'), + ('apiextensions.crossplane.io', 'v1', 'compositions'), + ] + assert fake_kopf.operator_calls == [ + ( + (), + { + 'standalone': True, + 'clusterwide': True, + 'namespaces': None, + }, + ) + ] + assert result['kwargs']['clusterwide'] is True + + +def test_operator_uses_namespaces_and_skips_cluster_resources( + packages_module, + tmp_path, + monkeypatch, +): + packages, fake_kopf = packages_module + on_resource_calls = [] + + monkeypatch.setattr( + packages, + 'on_resource', + lambda *args: on_resource_calls.append(args), + ) + + packages.operator( + object(), + DummyRunner(), + True, + False, + ['team-a'], + True, + True, + str(tmp_path), + ) + + assert on_resource_calls == [('', 'v1', 'configmaps')] + assert fake_kopf.operator_calls == [ + ( + (), + { + 'standalone': True, + 'clusterwide': False, + 'namespaces': ['team-a'], + }, + ) + ] + + +def test_on_resource_registers_handlers(packages_module): + packages, fake_kopf = packages_module + + packages.on_resource('group.example.io', 'v1alpha1', 'widgets') + + for event, handler in ( + ('create', packages.create), + ('resume', packages.create), + ('update', packages.update), + ('delete', packages.delete), + ): + args, kwargs, fn = fake_kopf.on.registrations[event][-1] + assert args == ('group.example.io', 'v1alpha1', 'widgets') + assert kwargs == {'labels': packages.PACKAGE_LABELS} + assert fn is handler + + +@pytest.mark.asyncio +async def test_startup_and_cleanup(packages_module, monkeypatch): + packages, _ = packages_module + settings = SimpleNamespace(scanning=SimpleNamespace(disabled=False)) + server = SimpleNamespace(stop=None, stop_calls=[]) + + async def stop(grace): + server.stop_calls.append(grace) + + 
monkeypatch.setattr(server, 'stop', stop) + monkeypatch.setattr(packages, 'GRPC_SERVER', server) + + await packages.startup(settings) + await packages.cleanup() + + assert settings.scanning.disabled is True + assert server.stop_calls == [5] + + +@pytest.mark.parametrize( + ('name', 'value', 'expected'), + [ + ('module.py', 'print(1)\n', True), + ('data', 'payload', True), + ('package', {}, True), + ('bad-name.py', 'print(1)\n', False), + ('bad.name', 'payload', False), + ('bad/name', 'payload', False), + ('package', 1, False), + ], +) +def test_validate_entry(packages_module, caplog, name, value, expected): + packages, _ = packages_module + logger = logging.getLogger(__name__) + + with caplog.at_level(logging.ERROR): + assert packages.validate_entry(name, value, logger) is expected + + if expected: + assert not caplog.messages + else: + assert caplog.messages + + +def test_resource_package_dir_uses_label_for_configmaps_and_secrets( + packages_module, + tmp_path, +): + packages, _ = packages_module + logger = logging.getLogger(__name__) + labels = { + 'function-pythonic.package': 'pkg.subpackage', + } + + package_dir = packages.resource_package_dir( + resource('configmaps'), + labels, + logger, + ) + + assert package_dir == tmp_path / 'pkg' / 'subpackage' + + +def test_resource_package_dir_validates_labels(packages_module, caplog, tmp_path): + packages, _ = packages_module + logger = logging.getLogger(__name__) + + with caplog.at_level(logging.ERROR): + assert packages.resource_package_dir(resource('configmaps'), {}, logger) is None + assert ( + packages.resource_package_dir( + resource('secrets'), + { + 'function-pythonic.package': 'not-valid.segment', + }, + logger, + ) + is None + ) + + assert caplog.messages == [ + 'function-pythonic.package label is missing', + 'Package has invalid package name: not-valid.segment', + ] + assert packages.resource_package_dir( + resource('compositions'), + { + 'function-pythonic.package': 'ignored.for.compositions', + }, + 
logger, + ) == tmp_path + + +def test_package_file_name_maps_python_modules(packages_module, tmp_path): + packages, _ = packages_module + + assert packages.package_file_name(tmp_path / 'pkg' / 'mod.py') == ( + True, + 'pkg.mod', + ) + assert packages.package_file_name(tmp_path / 'pkg' / 'data.txt') == ( + False, + 'pkg/data.txt', + ) + + +def test_package_create_writes_files_and_invalidates_modules( + packages_module, + tmp_path, +): + packages, _ = packages_module + logger = logging.getLogger(__name__) + runner = DummyRunner() + + packages.GRPC_RUNNER = runner + + packages.package_create( + resource('configmaps'), + 'Created', + tmp_path, + { + 'pkg': { + '__init__.py': '', + 'module.py': 'value = 1\n', + 'data': 'payload', + }, + 'bad-name.py': 'ignore me', + }, + logger, + ) + + assert (tmp_path / 'pkg' / '__init__.py').read_text() == '' + assert (tmp_path / 'pkg' / 'module.py').read_text() == 'value = 1\n' + assert (tmp_path / 'pkg' / 'data').read_text() == 'payload' + assert not (tmp_path / 'bad-name.py').exists() + assert runner.invalidated == ['pkg.__init__', 'pkg.module'] + + +def test_package_create_decodes_secrets(packages_module, tmp_path): + packages, _ = packages_module + logger = logging.getLogger(__name__) + runner = DummyRunner() + + packages.GRPC_RUNNER = runner + + packages.package_create( + resource('secrets'), + 'Created', + tmp_path, + { + 'secret_module.py': base64.b64encode(b'print("secret")\n').decode( + 'utf-8' + ), + }, + logger, + ) + + assert (tmp_path / 'secret_module.py').read_bytes() == b'print("secret")\n' + assert runner.invalidated == ['secret_module'] + + +def test_package_delete_removes_files_and_empty_directories( + packages_module, + tmp_path, +): + packages, _ = packages_module + logger = logging.getLogger(__name__) + runner = DummyRunner() + + packages.GRPC_RUNNER = runner + package_dir = tmp_path / 'pkg' / 'sub' + package_dir.mkdir(parents=True) + (package_dir / 'module.py').write_text('value = 1\n') + (package_dir / 
'data').write_text('payload') + + packages.package_delete( + 'Deleted', + tmp_path, + { + 'pkg': { + 'sub': { + 'module.py': 'value = 1\n', + 'data': 'payload', + }, + }, + }, + logger, + ) + + assert not (package_dir / 'module.py').exists() + assert not (package_dir / 'data').exists() + assert not package_dir.exists() + assert not (tmp_path / 'pkg').exists() + assert runner.invalidated == ['pkg.sub.module', 'pkg.sub', 'pkg'] + + +@pytest.mark.asyncio +async def test_create_and_delete_handle_composition_pipeline( + packages_module, + tmp_path, + monkeypatch, +): + packages, _ = packages_module + create_calls = [] + delete_calls = [] + logger = logging.getLogger(__name__) + composition_resource = resource('compositions') + labels = {'function-pythonic.package': 'pkg'} + body = { + 'kind': 'Composition', + 'spec': { + 'pipeline': [ + { + 'input': { + 'apiVersion': 'pythonic.fn.crossplane.io/v1alpha1', + 'packages': { + 'module.py': 'value = 1\n', + }, + }, + }, + { + 'input': { + 'apiVersion': 'other.example.io/v1alpha1', + 'packages': { + 'ignored.py': 'value = 2\n', + }, + }, + }, + ], + }, + } + + monkeypatch.setattr( + packages, + 'resource_package_dir', + lambda resource, labels, logger: tmp_path, + ) + monkeypatch.setattr( + packages, + 'package_create', + lambda *args: create_calls.append(args), + ) + monkeypatch.setattr( + packages, + 'package_delete', + lambda *args: delete_calls.append(args), + ) + + await packages.create(composition_resource, labels, body, logger) + await packages.delete(composition_resource, labels, body, logger) + + assert create_calls == [ + ( + composition_resource, + 'Created', + tmp_path, + {'module.py': 'value = 1\n'}, + logger, + ), + ] + assert delete_calls == [ + ('Deleted', tmp_path, {'module.py': 'value = 1\n'}, logger), + ] + + +@pytest.mark.asyncio +async def test_update_deletes_old_before_creating_new(packages_module, monkeypatch): + packages, _ = packages_module + calls = [] + logger = logging.getLogger(__name__) + 
configmaps_resource = resource('configmaps') + labels = {'function-pythonic.package': 'pkg'} + body = {'kind': 'ConfigMap'} + old = {'kind': 'ConfigMap'} + + def resource_delete(resource, labels, action, value, logger): + calls.append(('delete', resource, labels, value, action, logger)) + + def resource_create(resource, labels, action, value, logger): + calls.append(('create', resource, labels, value, action, logger)) + + monkeypatch.setattr(packages, 'resource_delete', resource_delete) + monkeypatch.setattr(packages, 'resource_create', resource_create) + + await packages.update( + resource=configmaps_resource, + labels=labels, + body=body, + old=old, + logger=logger, + ) + + assert calls == [ + ('delete', configmaps_resource, labels, old, 'Removed', logger), + ('create', configmaps_resource, labels, body, 'Added', logger), + ]
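The `resolve_ref`/`copy_schema` additions above flatten an OpenAPI v3 document's `$ref` graph into a single inlined schema, with cycle protection and single-element `allOf` collapsing. A minimal standalone sketch of the same traversal on plain dicts (the helper names `resolve_ref`/`merge_schema` here are illustrative, not the project's `protobuf.Map`-based API):

```python
def resolve_ref(document, ref, out, visiting=None):
    """Inline the schema at a '#/...' JSON pointer into out, following $ref chains."""
    visiting = visiting if visiting is not None else set()
    if not ref or ref in visiting:
        return  # missing pointer or a $ref cycle: stop descending
    node = document
    for segment in ref.split('/'):
        if segment == '#':
            node = document
        else:
            node = node.get(segment)
            if node is None:
                return  # dangling reference
    visiting.add(ref)
    try:
        for key, value in node.items():
            merge_schema(document, visiting, key, value, out)
    finally:
        visiting.remove(ref)

def merge_schema(document, visiting, key, value, out):
    if key == '$ref':
        # Follow the reference and merge its fields in place of the $ref.
        resolve_ref(document, value, out, visiting)
    elif key == 'allOf' and isinstance(value, list) and len(value) == 1:
        # Collapse the common single-element allOf-wrapping-a-$ref pattern.
        resolve_ref(document, value[0].get('$ref'), out, visiting)
    elif isinstance(value, dict):
        sub = out.setdefault(key, {})
        for name, nested in value.items():
            merge_schema(document, visiting, name, nested, sub)
    else:
        out[key] = value

document = {
    'components': {'schemas': {
        'Widget': {
            'type': 'object',
            'properties': {
                'spec': {'allOf': [{'$ref': '#/components/schemas/WidgetSpec'}]},
            },
        },
        'WidgetSpec': {'type': 'object', 'properties': {'size': {'type': 'integer'}}},
    }},
}
flat = {}
resolve_ref(document, '#/components/schemas/Widget', flat)
# flat now inlines WidgetSpec under properties.spec:
# {'type': 'object', 'properties': {'spec': {'type': 'object',
#  'properties': {'size': {'type': 'integer'}}}}}
```

The `visiting` set mirrors the cycle guard in the diff: self-referential schemas (common in Kubernetes CRDs) would otherwise recurse forever.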