The Functional API
Welcome to your first Synalinks lesson! In this tutorial, you will learn how to build AI applications using the Functional API - the most intuitive and recommended approach for creating programs.
What is Synalinks?
Synalinks is a framework for building AI applications powered by Large Language Models (LLMs). Think of it like building with LEGO blocks - you connect different pieces (called "modules") together to create something useful.
Core Concepts
1. Programs and Modules
A Program in Synalinks is like a recipe - it defines the steps your AI application will follow. Each step is performed by a Module, which is a reusable building block.
```mermaid
graph LR
    Input --> Module1[Module 1] --> Module2[Module 2] --> Output
```
2. Data Models
Data flows through your program in structured formats called DataModels. Think of them as blueprints that define what information looks like:
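For example, the Query and AnswerWithThinking models used later in this tutorial are plain classes with typed, described fields:

```python
import synalinks

class Query(synalinks.DataModel):
    query: str = synalinks.Field(description="The user query")

class AnswerWithThinking(synalinks.DataModel):
    thinking: str = synalinks.Field(description="Your step by step thinking")
    answer: str = synalinks.Field(description="The correct answer")
```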
3. The Functional API
The Functional API lets you build programs by:
- Creating an Input placeholder
- Passing it through modules (like connecting pipes)
- Wrapping everything in a Program
```python
# Run inside an async function; `Query`, `Answer`, and `language_model`
# are assumed to be defined, as in the complete example below.

# Step 1: Define where data enters
inputs = synalinks.Input(data_model=Query)

# Step 2: Pass it through a module (Generator uses an LLM to create the output)
outputs = await synalinks.Generator(
    data_model=Answer,
    language_model=language_model,
)(inputs)

# Step 3: Create the program
program = synalinks.Program(inputs=inputs, outputs=outputs)
```
Complete Example
Here's a complete Chain of Thought program that shows reasoning before answering:
```python
import asyncio
from dotenv import load_dotenv
import synalinks


# Define input and output data models
class Query(synalinks.DataModel):
    query: str = synalinks.Field(description="The user query")


class AnswerWithThinking(synalinks.DataModel):
    thinking: str = synalinks.Field(description="Your step by step thinking")
    answer: str = synalinks.Field(description="The correct answer")


async def main():
    load_dotenv()
    language_model = synalinks.LanguageModel(model="openai/gpt-4.1")

    # Build with the Functional API
    inputs = synalinks.Input(data_model=Query)
    outputs = await synalinks.Generator(
        data_model=AnswerWithThinking,
        language_model=language_model,
    )(inputs)

    program = synalinks.Program(
        inputs=inputs,
        outputs=outputs,
        name="chain_of_thought",
        description="Useful to answer in a step by step manner.",
    )

    # Run the program
    result = await program(Query(query="What are the key aspects of human cognition?"))
    print(f"Thinking: {result['thinking']}")
    print(f"Answer: {result['answer']}")


asyncio.run(main())
```
Key Takeaways
- Functional API: Build programs by connecting modules like pipes; data flows from Input through modules to create outputs.
- Three Steps: (1) Create an Input, (2) pass it through modules, (3) wrap everything in a Program.
- Generator Module: The core module that uses an LLM to transform input data into structured output matching your data model.
- Reusable Programs: Once built, programs can be called like functions with your input data model.
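As a quick illustration of that last point, the program built in the complete example can be awaited again with a new Query, with no rebuilding required:

```python
# Inside an async function, reusing `program` and `Query` from the
# complete example above; the query text is just an illustration.
followup = await program(Query(query="How does working memory differ from long-term memory?"))
print(followup["answer"])
```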
Program Visualization
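If your Synalinks version ships a Keras-style plotting helper, you can render the program graph to an image. The function name below is an assumption, not confirmed by this tutorial; check the API reference of your installed version before relying on it.

```python
# Assumed Keras-style helper; verify that synalinks.utils.plot_program
# exists in your installed version.
synalinks.utils.plot_program(program)
```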
API References
AnswerWithThinking
Bases: DataModel
The output data model from our program: the step-by-step reasoning followed by the final answer.
By asking the LLM to show its thinking before answering, we get better answers. This technique is called "Chain of Thought" prompting.
