dml_util.funk

Functions

aws_fndag()

funkify([fn, uri, data, adapter, extra_fns, ...])

Decorator to funkify a function into a DML Resource.

dml_util.funk.aws_fndag()
dml_util.funk.funkify(fn=None, uri='script', data=None, adapter='local', extra_fns=(), extra_lines=())

Decorator to funkify a function into a DML Resource.

Parameters:
  • fn (callable, Resource, optional) – The function to funkify.

  • uri (str, optional) – The URI for the resource. Defaults to “script”.

  • data (dict, optional) – Additional data to include in the resource. Defaults to None.

  • adapter (str | Resource, optional) – The adapter to use for the resource. Defaults to “local”.

  • extra_fns (tuple, optional) – Additional functions to include in the script. Defaults to an empty tuple.

  • extra_lines (tuple, optional) – Additional lines to include in the script. Defaults to an empty tuple.

Returns:

A DML Resource representing the funkified function.

Return type:

Resource

Notes

This is meant to be used as a decorator.

Examples

>>> @funkify
... def my_function(dag):
...     dag.result = "Hello, DML!"

They’re composable, so you can stack them:

>>> @funkify(uri="docker", data={"image": "my-python:3.8"})
... @funkify(uri="hatch", data={"name": "example"})
... @funkify
... def another_function(dag):
...     return "This function runs in the named hatch env, inside a docker image."

And later, if you have a batch cluster set up, you can run it in batch:

>>> funkify(another_function, data={"image": "my-python:3.8"},
...         adapter=dag.batch.value())  # doctest: +SKIP
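The examples above use funkify both bare (@funkify) and with arguments (@funkify(uri=..., data=...)). A minimal, self-contained sketch of that dual-mode decorator pattern (hypothetical names only — this is not dml_util source, and the dict stands in for a real DML Resource):

```python
import functools


def funkify_sketch(fn=None, uri="script", data=None):
    """Sketch of a decorator usable bare or with keyword arguments."""
    if fn is None:
        # Called with arguments: return a decorator awaiting the function.
        return functools.partial(funkify_sketch, uri=uri, data=data)
    # Called bare (or via the partial above): wrap fn into a
    # resource-like record, as funkify wraps into a DML Resource.
    return {"uri": uri, "data": data or {}, "fn": fn}


@funkify_sketch
def bare(dag):
    return "bare"


@funkify_sketch(uri="docker", data={"image": "my-python:3.8"})
def parameterized(dag):
    return "parameterized"
```

When called without a function, the decorator returns a partial of itself; Python then applies that partial to the decorated function, so both spellings end at the same wrapping code path.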