
Floom, the AI Gateway

Elevate apps with AI
Open source ❤️ By devs, for devs.

Data Ingestion
Security
Cost Control
Caching
New AI Pipeline

kind: 'floom/pipeline/1.2'

pipeline:
  name: my-first-pipeline
  
  # Use 🧠 OpenAI GPT-3.5
  model:
    - package: floom/model/connectors/openai
      model: gpt-3.5
      api-key: [Your-OpenAI-API-KEY]
      
  prompt:
    # ❌ Filter Personal Details
    validation:
      - package: floom/plugins/privacy
        disallow: ['credit-card', 'ssn', 'email', 'phone']
        
  response:
    # ❌ Filter Profanity
    validation:
     - package: floom/plugins/bad-words-filter 
       disallow: ['profanity']
  
  global:
    # 💵 Cost limit
    - package: floom/plugins/cost-management 
      limits:
        user:
         - day: 1000 # max tokens per day
        pipeline:
         - month: 600000 # max tokens per month

    # ⚡ Caching
    - package: floom/plugins/cache 
      cache-type: memory

kind: 'floom/pipeline/1.2'

pipeline:
  name: my-first-pipeline
  
  # Use 🧠 OpenAI DALL-E
  model:
    - package: floom/model/connectors/openai
      model: dall-e
      api-key: [Your-OpenAI-API-KEY]
      
  prompt:  
    template: 
      package: floom/prompt/templates/default
      prompt: "Create a beatiful 19th century sktech style image of {content}"
        
  response:
    width: 400
    height: 600
    format: jpeg
    quality: 0.7
  
    # ❌ Filter Nudity
    validation:
     - package: floom/plugins/nudity-filter
  
  global:
    # 💵 Cost limit
    - package: floom/plugins/cost-management 
      limits:
        user:
         - day: 200 # max tokens per day
        pipeline:
         - month: 300000 # max tokens per month

    # ⚡ Caching
    - package: floom/plugins/cache 

kind: 'floom/pipeline/1.2'

pipeline:
  name: my-first-pipeline
  
  # Use 🧠 OpenAI GPT-3.5
  model:
    - package: floom/model/connectors/openai
      model: gpt-3.5
      api-key: [Your-OpenAI-API-KEY]
      
  prompt:  
    template: 
      package: floom/prompt/templates/default
      system: "You are helpful assistant" 
    
    # 📄 tutorial.pdf as context
    context:
      - package: floom/prompt/context/pdf
        path: /etc/tutorial.pdf
        
    # ❌ Filter Sensitive Information   
    validation:
      - package: floom/plugins/sensitive-info
        disallow: ['pii', 'credit-cards', 'phone-numbers']
        
  response:
    # ✔️ Response should be 3 sentences max, in English (US) 
    format:
     - package: floom/response/formatter 
       type: text
       language: en-us
       max-sentences: 3
        
    # ❌ Filter Profanity
    validation:
     - package: floom/plugins/bad-words-filter 
       disallow: ['profanity']
       language: en-us
  
  global:
    # 💵 Cost limit
    - package: floom/plugins/cost-management 
      limits:
        user:
         - day: 1000 # max tokens per day
        pipeline:
         - month: 600000 # max tokens per month

    # ⚡ Caching
    - package: floom/plugins/cache 
      cache-type: memory

kind: 'floom/pipeline/1.2'

pipeline:
  name: my-first-pipeline
  
  # Use 🧠 Mistral 7B via Ollama on AWS
  model:
    - package: floom/model/connectors/ollama
      model: mistral
      endpoint: 172.91.58.1:8090
      
  prompt:  
    template: 
      package: floom/prompt/templates/default
      system: "When you classify, be concise, determined" 
      prompt: "Is the person writing this sad or happy? he wrote: '{userContent}'"
        
  response:
    # ✔️ Response should be 1 word, in English (US) 
    format:
     - package: floom/response/formatter 
       type: text
       language: en-us
       max-words: 1
       allow: ['sad', 'happy']
        
  global:
    # 💵 Cost limit
    - package: floom/plugins/cost-management 
      limits:
        user:
         - day: 1000 # max tokens per day
        pipeline:
         - month: 600000 # max tokens per month

    # ⚡ Caching
    - package: floom/plugins/cache 
      cache-type: memory

kind: 'floom/pipeline/1.2'

pipeline:
  name: my-first-pipeline
  
  # Use 🧠 Llama 2 via Ollama on Azure
  model:
    - package: floom/model/connectors/ollama
      model: llama-2
      endpoint: 92.15.19.6:6060
      
  prompt:  
    template: 
      package: floom/prompt/templates/default
      system: "Summarize concisely, do not use political correctness" 
        
    # ❌ Filter Profanity   
    validation:
      - package: floom/plugins/bad-words-filter
        disallow: ['profanity']
        
  response:
    # ✔️ Response should be 2 sentences max, in German 
    format:
     - package: floom/response/formatter 
       type: text
       language: de
       max-sentences: 2
        
    # ❌ Filter Profanity & Negativity
    validation:
     - package: floom/plugins/bad-words-filter 
       disallow: ['profanity', 'negativity']
  
  global:
    # 💵 Cost limit
    - package: floom/plugins/cost-management 
      limits:
        user:
         - day: 1000 # max tokens per day
        pipeline:
         - month: 600000 # max tokens per month

    # ⚡ Caching
    - package: floom/plugins/cache 
      cache-type: memory

kind: 'floom/pipeline/1.2'

pipeline:
  name: my-first-pipeline
  
  # Use 🧠 Whisper (speech-to-text) on AWS
  model:
    - package: floom/model/connectors/openai
      model: whisper
      endpoint: 31.61.22.2:6060
      
  response:
    # ✔️ Response should be in German (text)
    format:
     - package: floom/response/formatter 
       type: text
       language: de
        
    # ❌ Filter Profanity & Negativity
    validation:
     - package: floom/plugins/bad-words-filter 
       disallow: ['profanity', 'negativity']
  
  global:
    # 💵 Cost limit
    - package: floom/plugins/cost-management 
      limits:
        user:
         - day: 1000 # max tokens per day
        pipeline:
         - month: 600000 # max tokens per month

    # ⚡ Caching
    - package: floom/plugins/cache 
      cache-type: memory
Deploy

# Install ⌨️ Floom CLI
brew install floom

# ☁️ Floom Cloud
floom deploy cloud my-first-pipeline.yaml 

# 🐳 Floom Docker (Local)
floom deploy local my-first-pipeline.yaml

# 🐳 Floom Docker (Remote)
floom deploy -endpoint=[YOUR-FLOOM-GATEWAY-ADDRESS] my-first-pipeline.yaml

# Install ⌨️ Floom CLI
winget install floom

# ☁️ Floom Cloud
floom deploy cloud my-first-pipeline.yaml 

# 🐳 Floom Docker (Local)
floom deploy local my-first-pipeline.yaml

# 🐳 Floom Docker (Remote)
floom deploy -endpoint=[YOUR-FLOOM-GATEWAY-ADDRESS] my-first-pipeline.yaml

# Install ⌨️ Floom CLI
sh -c "$(curl -fsSL https://get.floom.ai/install.sh)"

# ☁️ Floom Cloud
floom deploy cloud my-first-pipeline.yaml 

# 🐳 Floom Docker (Local)
floom deploy local my-first-pipeline.yaml

# 🐳 Floom Docker (Remote)
floom deploy -endpoint=[YOUR-FLOOM-GATEWAY-ADDRESS] my-first-pipeline.yaml
Run

from FloomClient import FloomClient

# Connect to the deployed pipeline's endpoint
client = FloomClient("my-first-pipeline.floom.ai")

# Send a prompt through the pipeline
client.run(
   prompt="How do I reset the oil alert in my dashboard?"
)

import { FloomClient } from '../FloomClient';

const floomClient = new FloomClient();

await floomClient.run(
      "my-first-pipeline",
      "How do I reset the oil alert in my dashboard?"
    );

var floomClient = new FloomClient();

await floomClient.Run(
       pipeline: "my-first-pipeline",
       prompt: "How do I reset the oil alert in my dashboard?"
   );

from FloomClient import FloomClient

# Run a pipeline by name and capture the response
response = FloomClient().run(
   pipeline="my-first-pipeline",
   prompt="How do I reset the oil alert in my dashboard?"
)

from FloomClient import FloomClient

# Point the client at the pipeline endpoint and capture the response
response = FloomClient(
   endpoint="my-first-pipeline.floom.ai",
).run(
   prompt="How do I reset the oil alert in my dashboard?"
)
Text + Code

Generate text or code; configure safeguards, prompts, and responses.

Images

Generate images; control quality, resolution, format, and more (a client-side sketch follows below).

Audio

Generate audio using cutting-edge AI engines.

Speech

Generate or transcribe speech.
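
Each pipeline type is invoked with the same client call shown in the Run section above. Below is a minimal sketch, assuming the image (DALL-E) pipeline defined earlier has been deployed; the endpoint mirrors the earlier Python example, and the exact shape of the returned response is an assumption, not something documented here.

from FloomClient import FloomClient

# Minimal sketch: invoke the DALL-E image pipeline defined above.
# The endpoint mirrors the earlier Run examples; the structure of the
# returned response (e.g. image URL vs. raw bytes) is an assumption.
client = FloomClient("my-first-pipeline.floom.ai")

response = client.run(
   prompt="a lighthouse on a rocky coast at dawn"
)

# Inspect whatever the gateway returns for the generated image
print(response)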

Features

Deploy AI with Confidence

Grow Reliably & Limitlessly

Robust and predictable AI processing, featuring failsafe mechanisms and caching.

Keep Your Data

Floom runs as a Docker container in your cloud, ensuring your data remains exclusively under your control.

Build Beyond Tomorrow

Keep pace with AI innovation through Floom's self-updating, decoupled API architecture.

Secure Your AI

Built-in security: DLP, PII Masking, Anonymization, DDoS Defense, RCE Prevention and more.

Control Costs

Optimize cost per session, user, or pipeline; switch to cost-effective models; and leverage caching.

Collaborate to Succeed

Tailored for developers, managed by DevOps, and fortified by security teams for seamless, cross-functional collaboration.

Got questions?

Frequently Asked Questions

What infrastructure do I need?
How do I start using Floom?
Does Floom support CI/CD?
Does Floom support IaC?
What about pricing?
How does Floom keep my data safe?
Can Floom train/query my data?
What about compliance & regulation (GDPR/PCI DSS/HIPAA)?
How do I integrate Floom in my code?
Are you ready?
Get Started