
Giving a microservices backend to your Streamlit app

Why it is a good idea

The Daisi platform deploys Python functions as serverless microservices. Serverless means that hardware resources are allocated dynamically at runtime: when a request is made to execute a function, a service starts automatically and stays alive for up to 60 minutes after the last execution. Special hardware, such as GPUs, can be allocated for the execution of a Daisi.

Daisi can also host Streamlit apps. Streamlit apps are not deployed in the same way as Daisi functions: they have a much longer lifecycle, since it isn't straightforward to determine when a user has finished a session in the app. Moving forward, Streamlit apps will not have access to special hardware. Instead, they will delegate resource-intensive computations to Daisi functions, accessed with the pydaisi Python package.

So if a function requires significant resources or special hardware to run (a large ML model, for instance), it is a good idea to deploy it as an individual Daisi and put the Streamlit app in a separate Daisi that calls the first one. This ensures optimal execution for the resource-intensive Daisi, and optimal use of the community Daisi platform's resources.

In addition, it allows others to reuse the function in their own Streamlit app.

Decoupling Streamlit apps from resource-intensive functions

Let's consider the following example, which can be deployed as a Daisi with a Streamlit app. Let's assume that function1 is resource intensive and needs access to GPUs.

import streamlit as st

def function1(param):
    # Do something resource intensive with param
    value = ...  # replace with the actual computation

    return value

def st_ui():

    st.title("My Streamlit app")
    param = st.text_input("Enter param")
    value = function1(param)

    st.write(value)

if __name__ == "__main__":
    st_ui()

A good practice in this case is to separate the app from the function and create two separate Daisies:

Daisi 1 (deploys resource intensive function):

def function1(param):
    # Do something resource intensive with param
    value = ...  # replace with the actual computation

    return value

And Daisi 2 (deploys only the Streamlit app):

import streamlit as st
import pydaisi as pyd

daisi_1 = pyd.Daisi("username/Daisi_1")  # Initialize a Daisi object, linking to the "Daisi 1" deployment

def st_ui():

    st.title("My Streamlit app")
    param = st.text_input("Enter param")
    value = daisi_1.function1(param).value  # Trigger a Daisi 1 execution through the function1 endpoint

    st.write(value)

if __name__ == "__main__":
    st_ui()
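A side benefit of this decoupling is that the app now depends on the heavy code through a single call site, which makes it easy to swap in a local stub for offline development or testing. The sketch below is one illustrative way to do that; the fallback helper and the "username/Daisi_1" name are assumptions for the example, not part of the pydaisi API.

```python
def local_function1(param):
    # Lightweight local stand-in for the resource-intensive computation,
    # used when the remote Daisi is unavailable
    return param.upper()

def get_function1():
    """Return a callable: the remote Daisi endpoint if reachable, else the local stub."""
    try:
        import pydaisi as pyd
        daisi_1 = pyd.Daisi("username/Daisi_1")  # hypothetical deployment name
        return lambda p: daisi_1.function1(p).value
    except Exception:
        # pydaisi not installed, or the Daisi could not be reached
        return local_function1
```

The Streamlit app can then call `get_function1()(param)` and behave the same whether it is running against the deployed Daisi or the local stub.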

In this case, Daisi 1 is a separate serverless microservice which can be allocated specific hardware or configuration (GPU, parallel compute, etc.).

An additional benefit of this approach is that it lets you closely monitor the performance of the resource-intensive Daisi through the metrics tab of its Daisi page on the platform: number of calls, latency, execution time, and proportion of successful executions.