Simple executions

Calling a Daisi

The pattern to call a Daisi is Daisi("username/daisiname"). If the Daisi is public and its name is unique, you can omit "username".
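
For example (the user and Daisi names below are hypothetical):

import pydaisi as pyd

# Fully qualified name: always works
titanic = pyd.Daisi("john/Titanic Statistics")

# Short name: works only if the Daisi is public and its name is unique
titanic = pyd.Daisi("Titanic Statistics")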

The Call it tab on each Daisi page at app.daisi.io features a code snippet that you can copy/paste to call the Daisi immediately, provided that you are authenticated (only public Daisies don't require authentication).

Simple invocations

Synchronous calls

When you invoke a Daisi endpoint synchronously, it runs until completion, and the result is available in the value attribute once it returns.

import pydaisi as pyd

# Instantiate a Daisi object
classify = pyd.Daisi("Zero Shot Text Classification")

# Synchronous call
result = classify.compute(text="Let's go to the moon",
                          candidate_labels="astronomy, travel").value
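
You can also keep a reference to the DaisiExecution object returned by the endpoint and fetch its value separately:

de = classify.compute(text="Let's go to the moon",
                      candidate_labels="astronomy, travel")
result = de.value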

Execution Status

A Daisi's execution status can be accessed with the status property:

with pyd.Daisi("Add Two Numbers") as my_daisi:
    de = my_daisi.compute(firstNumber=5, secondNumber=6)
    print(de.status)

Status is one of the following:

  • NOT STARTED
  • RUNNING
  • FINISHED
  • FAILED

When the execution status value is FINISHED, you can fetch the result:

result = de.value
print(result)
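
For long running executions (see the asynchronous calls below), you can poll the status before fetching the value. A minimal sketch, assuming the endpoint is invoked asynchronously with the _ suffix and that status returns the strings listed above:

import time
import pydaisi as pyd

with pyd.Daisi("Add Two Numbers") as my_daisi:
    # Asynchronous call (note the "_" suffix), started explicitly
    de = my_daisi.compute_(firstNumber=5, secondNumber=6)
    de.start()
    # Poll the status until the execution terminates
    while de.status in ("NOT STARTED", "RUNNING"):
        time.sleep(1)
    if de.status == "FINISHED":
        print(de.value)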

Execution Logs

A Daisi's logs can be accessed with the logs property:

import pydaisi as pyd
import time

with pyd.Daisi("Live Logging") as my_daisi:
    de = my_daisi.live_log_test_(firstNumber=5, secondNumber=6, delay=3)
    de.start()
    time.sleep(2)
    print(de.logs)
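
Continuing the snippet above, you can keep polling the logs until the execution terminates, for example:

# Poll the logs periodically while the execution is running
while de.status == "RUNNING":
    print(de.logs)
    time.sleep(1)

# Print the final logs once the execution has terminated
print(de.logs)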

Tracebacks

When your code fails and produces a traceback, the traceback is reported in the .value attribute.
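
For example, once an execution has terminated, you can check for a failure before using the result:

if de.status == "FAILED":
    # .value contains the traceback text instead of a result
    print(de.value)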

Passing and receiving arbitrary objects

Daisies can receive and send back arbitrary Python objects, as if they were simply functions in a Python module. There is nothing special to do to pass arguments to a Daisi. However, as with any Python module, if you pass custom objects, their definition needs to be available to the Daisi.

Consider this example with the "Daisi Serialize" Daisi.

The code of "Daisi Serialize" is the following:

import numpy as np
import tensorflow

class MapStack:
    def __init__(self, nx, ny):
        self.nx = nx
        self.ny = ny
        self.nb_layers = None
        self.maps = []

    def add_layer(self, map):
        if len(map.shape) == 2 and map.shape[0] == self.ny and map.shape[1] == self.nx:
            self.maps.append(map)
            self.nb_layers = len(self.maps)
            return "Map sucessfully added."
        else:
            return "Could not add map. Incompatible dimensions."

def compute(map_stack, map):
    status = map_stack.add_layer(map)
    print(status)

    return map_stack

The compute function of this Daisi receives and returns a MapStack object.

The code below illustrates how we can interact with this Daisi:

  • Instantiate a MapStack object
  • Pass it to the Daisi
  • Receive a new MapStack object

import pydaisi as pyd
import numpy as np

# Connect to the "Daisi Serialize" Daisi
daisi_serialize = pyd.Daisi("exampledaisies/Daisi Serialize")

# Class definition
class MapStack:
    def __init__(self, nx, ny):
        self.nx = nx
        self.ny = ny
        self.nb_layers = None
        self.maps = []

    def add_layer(self, map):
        if len(map.shape) == 2 and map.shape[0] == self.ny and map.shape[1] == self.nx:
            self.maps.append(map)
            self.nb_layers = len(self.maps)
            return "Map successfully added."
        else:
            return "Could not add map. Incompatible dimensions."

# Initialize a new MapStack object with 10 layers
nx = 200
ny = 200
ms = MapStack(nx, ny)
for i in range(10):
    ms.add_layer(np.random.rand(ny, nx))

# Invoke the "compute()" endpoint, add a new layer to the stack 
# and receive back the updated Stack object
d_execution = daisi_serialize.compute(map_stack=s, map=np.random.rand(nx,)).value

assert d_execution.nb_layers == 11

Asynchronous calls and long running executions

The Daisi platform supports long running executions. To trigger a long running execution, invoke the endpoint asynchronously. The call immediately returns a DaisiExecution object with an execution_id. You can monitor the execution's progress with the .status and .logs properties.

To call an endpoint asynchronously, simply add a _ suffix to the endpoint name.

Example:

import pydaisi as pyd

greeting = pyd.Daisi("Print Hello")

# Asynchronous call: note the "_" suffix on the endpoint name
async_exec = greeting.hello_(name="World")
async_exec.start()

print(async_exec.status)

# Then fetch the `DaisiExecution` result with the `.value` attribute when ready

print(async_exec.value)

Remote Results

You don't need to fetch the full data of a DaisiExecution in order to chain it into the computation of another Daisi! Consider this example with "Daisi Serialize":

import pydaisi as pyd
import numpy as np

# Connect to the "Daisi Serialize" Daisi
d = pyd.Daisi("Daisi Serialize")

# Define the MapStack class that we will
# use as an example of custom serialization. A MapStack
# object will be passed as an argument to the Daisi.
# The "Daisi Serialize" Daisi also contains
# a definition of the class.

class MapStack:
    def __init__(self, nx, ny):
        self.nx = nx
        self.ny = ny
        self.nb_layers = None
        self.maps = []

    def add_layer(self, map):
        if map.shape == (self.ny, self.nx):
            self.maps.append(map)
            self.nb_layers = len(self.maps)
            return "Map successfully added."
        else:
            return "Could not add map. Incompatible dimensions."

# Initialize a new MapStack object with 10 layers
nx, ny = 200, 200
ms = MapStack(nx, ny)
for i in range(10):
    ms.add_layer(np.random.rand(ny, nx))

# Invoke the Daisi endpoint asynchronously, adding a new layer to the map stack
d_execution = d.compute_(map_stack=ms, map=np.random.rand(ny, nx))

The DaisiExecution object d_execution simply contains a reference to the remote result until it is explicitly fetched. There is no need to fetch the result in order to pass it as an input to the next Daisi!

# Invoke the Daisi again, adding another new layer, and fetch the result
result = d.compute_(map_stack=d_execution, map=np.random.rand(ny, nx)).value

assert result.nb_layers == 12
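
You can chain as many executions as needed in this way; data is fetched to the client only when .value is finally accessed. For example:

# Chain several asynchronous executions; intermediate results stay remote
e1 = d.compute_(map_stack=ms, map=np.random.rand(ny, nx))
e2 = d.compute_(map_stack=e1, map=np.random.rand(ny, nx))
final = d.compute_(map_stack=e2, map=np.random.rand(ny, nx)).value

# ms had 10 layers; each of the three calls added one
assert final.nb_layers == 13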

Map parallel executions