Requirements

The Situational Awareness features and LLM integration in pjsProfileBars require an OpenAI-compatible endpoint to talk to. This can be a local Ollama installation or any other compatible endpoint, local or remote, including those offered by third-party providers such as OpenRouter, OpenAI, Groq, etc. Using a remote provider also lets you run models of a complexity that would not be practical for most users to host locally.

Configuration in pjsProfileBars requires three values: the endpoint URL for the API (defaults to local Ollama), a model name, and an API key if your provider requires one (Ollama does not).
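
In practice those three settings map directly onto a standard OpenAI-style chat-completions request. The sketch below illustrates the shape of that request; it is a generic example using placeholder values, not pjsProfileBars internals, and assumes a Node 18+ runtime where fetch is built in.

  // Minimal sketch: how the API URL, API key, and model name combine into
  // an OpenAI-compatible chat-completions call. All values are placeholders.
  const API_URL = "http://127.0.0.1:11434/v1/chat/completions"; // local Ollama default
  const API_KEY = "";          // leave blank for Ollama; set for hosted providers
  const MODEL = "llama3.1";    // any model name your endpoint serves

  async function chat(prompt: string): Promise<string> {
    const headers: Record<string, string> = { "Content-Type": "application/json" };
    if (API_KEY) headers["Authorization"] = `Bearer ${API_KEY}`; // only when required

    const res = await fetch(API_URL, {
      method: "POST",
      headers,
      body: JSON.stringify({
        model: MODEL,
        messages: [{ role: "user", content: prompt }],
      }),
    });
    if (!res.ok) throw new Error(`HTTP ${res.status}: ${await res.text()}`);

    const data = await res.json();
    return data.choices[0].message.content; // standard OpenAI-style response shape
  }

  chat("Hello!").then(console.log).catch(console.error);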

Here are some example configurations:

  Local Ollama (default):

  – API URL: http://127.0.0.1:11434/v1/chat/completions

  – API Key: (leave blank)

  – Model: llama3.1, gemma3:4b-it-qat, etc.

  OpenRouter:

  – API URL: https://openrouter.ai/api/v1/chat/completions

  – API Key: sk-or-v1-your-key-here

  – Model: anthropic/claude-3.5-sonnet, meta-llama/llama-3.1-70b-instruct, etc.

  OpenAI:

  – API URL: https://api.openai.com/v1/chat/completions

  – API Key: sk-your-openai-key

  – Model: gpt-4o, gpt-4-turbo, etc.

  Groq:

  – API URL: https://api.groq.com/openai/v1/chat/completions

  – API Key: your Groq API key

  – Model: llama-3.1-70b-versatile, etc.
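
If you are unsure what to enter in the Model field, most OpenAI-compatible servers (including recent Ollama versions and the hosted providers above) also expose a model listing at GET /v1/models. That endpoint is an assumption about your provider, not a pjsProfileBars feature; here is a minimal sketch for checking connectivity and listing the model names an endpoint serves:

  // Connectivity check: fetch the model list from an OpenAI-compatible server.
  // BASE_URL is the configured API URL minus the "/chat/completions" suffix.
  const BASE_URL = "http://127.0.0.1:11434/v1"; // placeholder: local Ollama
  const API_KEY = "";                           // set only if your provider needs one

  async function listModels(): Promise<void> {
    const res = await fetch(`${BASE_URL}/models`, {
      headers: API_KEY ? { Authorization: `Bearer ${API_KEY}` } : {},
    });
    if (!res.ok) throw new Error(`HTTP ${res.status}: ${await res.text()}`);
    const data = await res.json();
    for (const m of data.data) console.log(m.id); // names to copy into the Model field
  }

  listModels().catch(console.error);

If the script prints model names, the endpoint URL and key are good to paste into pjsProfileBars.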
