At this stage, Genie is probably familiar to everyone on a general level. With Genie, you can discuss and analyze data as if you were chatting with an old friend. To get the most out of Genie, each space needs to be optimized and focused on a single entity, such as customer data. Previously, this created limitations because you had to jump from one Genie Space to another depending on your needs – but not anymore.
Databricks recently released the long-awaited API interface for Genie in Public Preview. This allows you to chain Genie Spaces together using an Executor Agent, so you can manage everything effortlessly from a single interface. The Executor Agent automatically routes your queries to the appropriate Genie Space, making the solution scalable and easy to manage. For example, you can first inquire about sales data from the "Sales Genie", then ask about related data pipelines from the "Metadata Genie", and finally create a DevOps ticket – all through one interface. In this article, I will explain and share the code for building a simple multi-Genie-agent solution on top of Databricks Apps. A link to the code repository can be found here.
All good things begin with great architecture
Heh, I'm starting to repeat this mantra a lot, and it applies here as well. First we draw up a battle plan, and then we move on to building the implementation. In this case, the high-level architecture was quite clear from the beginning, but as with every data project, variables along the way brought slight changes to the final practical implementation. Here is the final solution architecture diagram of the implementation. The beauty of a multi-agent solution is that you can keep each Genie Space totally focused on a specific topic and treat it as an isolated entity. Different teams and subject matter experts can keep developing their own spaces without external dependencies, and then you piece the puzzle together from the different Genie parts into an effortless, comprehensive solution.

The first step is setting up Genie Spaces
First, I started by creating and optimizing the Genie Spaces. I decided to create two different spaces. The first was for fabricated customer data: for this, I created three separate tables containing customers, purchase history, and feedback, as well as maintenance costs and sales revenues, letting AI generate the data. The second space was optimized for Databricks system tables, specifically workflow history, costs, and job metadata. Since the purpose is to demonstrate the use of multiple Genies through the same agent solution, I decided to stick with two at this stage; more Genies can be added using the same logic.
Configuring Genie REST API
Next, I proceeded to test the Genie REST API. Here, I decided to use pure REST API calls instead of the Databricks SDK. Databricks offers excellent REST API documentation, and especially with new features I prefer the REST API: it helps me understand the logic better and build even more advanced automation on top of it. The logic works as follows: first you send a prompt to Genie, then you poll for the response based on the message ID. I set polling to occur every second (if the serverless SQL warehouse used by Genie is inactive, it takes a few seconds to start up) with a timeout of 20 seconds. Within this timeframe, I expect Genie to process the query, perform the data retrieval, and return the response. If Genie runs an SQL query, the results must be fetched with a third call using the attachment ID. It is important to note the different variations in the result output, depending on whether it is a description or text output. You can see this part of the code in the genie_functions.py file and use it for your own purposes.
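To make the flow concrete, here is a minimal sketch of that send-poll-fetch loop. It is not the exact code from genie_functions.py: the endpoint paths follow the Genie Conversation API as documented during the Public Preview (verify against the current REST reference), and the host, token, and space ID are placeholders you supply yourself.

```python
import time
import requests

HOST = "https://<your-workspace-host>"   # placeholder
TOKEN = "<databricks-access-token>"      # placeholder; keep real tokens in secrets
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def ask_genie(space_id: str, prompt: str, timeout_s: int = 20) -> dict:
    # 1) Send the prompt - this starts a new Genie conversation.
    start = requests.post(
        f"{HOST}/api/2.0/genie/spaces/{space_id}/start-conversation",
        headers=HEADERS,
        json={"content": prompt},
    ).json()
    msg_url = (
        f"{HOST}/api/2.0/genie/spaces/{space_id}"
        f"/conversations/{start['conversation_id']}/messages/{start['message_id']}"
    )

    # 2) Poll once per second; a cold serverless SQL warehouse can eat
    #    a few seconds of this budget before Genie even starts.
    for _ in range(timeout_s):
        message = requests.get(msg_url, headers=HEADERS).json()
        if message.get("status") == "COMPLETED":
            break
        time.sleep(1)
    else:
        raise TimeoutError("Genie did not answer within the timeout")

    # 3) If Genie ran SQL, the rows come from a third call via the
    #    attachment ID; plain text answers live in the message itself.
    for attachment in message.get("attachments", []):
        if "query" in attachment:
            return requests.get(
                f"{msg_url}/attachments/{attachment['attachment_id']}/query-result",
                headers=HEADERS,
            ).json()
    return message
```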
Creating Databricks Apps and adding the executor agent
Databricks offers excellent pre-built templates for creating Apps, so I naturally utilized them for this project as well. I chose the Dash chatbot template, which provides an easy-to-use interface and a solid foundation for further development. I added functions as separate Python files for clarity but kept visual changes minimal. Remember that AI is your best friend for creating and updating nice visual effects.
Since this is a demo project, I deliberately limited the functionality. For example, for DevOps ticket creation, only one epic can be created, and the agent can modify its body. This is done with a unique, static epic subject that is fetched on every call: if the epic exists, it is updated; if not, it is created again. However, the connections created in Unity Catalog are dynamic, making it very easy to add all the DevOps REST API functionalities. The executor agent itself is also limited: it is a function-calling agent with temporary short-term memory. The Genie Spaces are added as tools for the executor agent, which makes for a very user-friendly and efficient solution.
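To illustrate that tool wiring, here is a rough sketch of a function-calling loop, assuming the OpenAI client is pointed at a Databricks model serving endpoint and reusing the ask_genie() helper sketched earlier. The tool schema, endpoint name, and space ID are illustrative placeholders, not the repo's exact code.

```python
import json
from openai import OpenAI

client = OpenAI(base_url=f"{HOST}/serving-endpoints", api_key=TOKEN)
SALES_SPACE_ID = "<sales-genie-space-id>"  # placeholder

# One tool per Genie Space (plus one for DevOps tickets, omitted here).
TOOLS = [{
    "type": "function",
    "function": {
        "name": "ask_sales_genie",
        "description": "Answer questions about customers and sales data.",
        "parameters": {
            "type": "object",
            "properties": {"prompt": {"type": "string"}},
            "required": ["prompt"],
        },
    },
}]

def run_agent(history: list) -> str:
    # The model decides per query whether to call a Genie tool
    # or answer directly from the conversation history.
    response = client.chat.completions.create(
        model="<your-serving-endpoint>",  # placeholder endpoint name
        messages=history,
        tools=TOOLS,
    )
    message = response.choices[0].message
    if not message.tool_calls:
        return message.content

    # Execute the chosen tool (simplified to the first call) and let
    # the model phrase the final answer on the next round.
    call = message.tool_calls[0]
    args = json.loads(call.function.arguments)
    result = ask_genie(SALES_SPACE_ID, args["prompt"])
    history += [message, {
        "role": "tool",
        "tool_call_id": call.id,
        "content": json.dumps(result),
    }]
    return run_agent(history)
```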
Setting up proper authentication - the fun begins here
Initially, I planned to create the Genies as individual subagents behind Unity Catalog functions, but the problem was that SQL warehouses do not yet support ExternalFunctionRequestHttpMethod from the Databricks SDK (understandable, as it is a very new functionality). So I would not have been able to easily wrap the Genie subagents into user-friendly Unity Catalog functions. Next, I tried to create a dedicated agent for each Genie subagent and deploy them via Mosaic AI. The problem here was that Databricks hosts these through a general system service principal, which cannot be granted permissions at all in my Databricks environment. And the SDK does not support the use of tools yet, so direct model endpoint usage through it was also out of the question.
At this point, I decided to completely change my approach. When you create a Databricks App, a dedicated service principal is created automatically, and it can be granted the permissions needed to manage access rights and connections. I encountered a few hiccups with header-level authentication, and since this was a demo project, I didn't want to spend too much time on it. Initially, I created custom access to dbutils using the SDK documentation as a guide. However, I later remembered that Databricks Apps support Key Vault secrets as resource dependencies, which offered a cleaner solution. Keep in mind that if you create an App in the UI, secrets are locked at first and need to be added manually afterward. Also remember to modify your code after updates – files are not automatically repopulated when using pre-built templates.
In short, I created external connections in Unity Catalog for the Genie Spaces and the DevOps REST API, granted the App's service principal the necessary permissions on these connections and on a dedicated secret scope, and used my Databricks access token stored in secrets to authenticate the OpenAI client. It worked like a charm! Although I used an access token here, OAuth is highly recommended for production solutions.
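For reference, the client bootstrap inside the App boils down to a few lines, assuming the bound secret resource is exposed to the App as an environment variable in app.yaml; the variable names here are placeholders.

```python
import os
from openai import OpenAI

# Assumption: app.yaml maps the secret resource to DATABRICKS_TOKEN,
# and DATABRICKS_HOST is available in the App's environment.
token = os.environ["DATABRICKS_TOKEN"]
host = os.environ["DATABRICKS_HOST"].rstrip("/")

# The same token authenticates the OpenAI client against model serving.
client = OpenAI(base_url=f"{host}/serving-endpoints", api_key=token)
```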

And finally, time to use it
Now that the Genie Spaces are ready, the Databricks App is created, and authentication is taken care of, it is time to start using the solution! Here you can see a quick demo of how everything works together smoothly. The Executor Agent determines for each query whether to trigger a dedicated Genie Space, create a DevOps ticket, or enrich the prompt with document data.
Link to the code & repo
Here you can find a link to the repo! It contains a deployment notebook to help with the process, and all the code is located in the Apps folder, making it convenient to deploy using your chosen Infrastructure as Code (IaC) method.
Deployment possibilities
The repo contains a deployment notebook that handles the process via the REST API for you. However, keep in mind that there are other deployment options, such as Databricks Asset Bundles (DAB). DAB is easy to set up, integrates well with CI/CD pipelines, and lets you add dynamic variables (such as global tagging).
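For the REST route, the notebook's flow boils down to roughly two calls, sketched below. The Apps API paths reflect the Public Preview documentation, and the app name and source path are placeholders; double-check against the current API reference before relying on them.

```python
import requests

app_name = "genie-multi-agent-demo"  # placeholder

# 1) Create the App (skip if it already exists in your flow).
requests.post(f"{HOST}/api/2.0/apps", headers=HEADERS,
              json={"name": app_name})

# 2) Deploy the source code from a workspace folder.
requests.post(
    f"{HOST}/api/2.0/apps/{app_name}/deployments",
    headers=HEADERS,
    json={"source_code_path": "/Workspace/Users/<you>/apps/genie-demo"},
)
```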
More good readings on Genie optimization
It is advisable to allocate time for optimizing Genie Spaces and to make it an iterative process. Fortunately, there is already plenty of good material available; here is a small selection of what you can explore.
On the article side, here are a few approachable reads:
On the video side, one example is the excellent video series by Advancing Analytics:

Written by Aarni Sillanpää
More is better - especially with Genies!