{"id":7469,"date":"2026-02-02T11:02:03","date_gmt":"2026-02-02T11:02:03","guid":{"rendered":"https:\/\/pjsmith.me.uk\/?post_type=docs&#038;p=7469"},"modified":"2026-02-02T11:02:10","modified_gmt":"2026-02-02T11:02:10","slug":"requirements","status":"publish","type":"docs","link":"https:\/\/pjsmith.me.uk\/index.php\/docs\/pjsprofilebars\/general-feature-notes\/situational-awareness\/requirements\/","title":{"rendered":"Requirements"},"content":{"rendered":"\n<p>The Situational Awareness features and LLM integration in pjsProfilebars require an OpenAI-compatible endpoint to talk to. This can be provided by a locally installed <a href=\"https:\/\/pjsmith.me.uk\/index.php\/docs\/pjsprofilebars\/general-feature-notes\/situational-awareness\/setting-up-ollama\/\" data-type=\"docs\" data-id=\"7429\">Ollama<\/a> instance, or by any other compatible endpoint, local or remote, including those offered by third-party providers such as OpenRouter, OpenAI, Groq, etc. Using a remote provider also lets you use models of a complexity that would not be practical for most users to run locally.<\/p>\n\n\n\n<p>Configuration in pjsProfilebars requires an endpoint URL for the API (this defaults to a local Ollama instance), a model name, and an API key if your provider requires one (Ollama does not).<\/p>\n\n\n\n<p>Here are some example configurations:<\/p>\n\n\n\n<p>Local Ollama (default):<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>API URL: http:\/\/127.0.0.1:11434\/v1\/chat\/completions<\/li>\n\n\n\n<li>API Key: (leave blank)<\/li>\n\n\n\n<li>Model: llama3.1, gemma3:4b-it-qat, etc.<\/li>\n<\/ul>\n\n\n\n<p>OpenRouter:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>API URL: https:\/\/openrouter.ai\/api\/v1\/chat\/completions<\/li>\n\n\n\n<li>API Key: sk-or-v1-your-key-here<\/li>\n\n\n\n<li>Model: anthropic\/claude-3.5-sonnet, meta-llama\/llama-3.1-70b-instruct, etc.<\/li>\n<\/ul>\n\n\n\n<p>OpenAI:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>API URL: https:\/\/api.openai.com\/v1\/chat\/completions<\/li>\n\n\n\n<li>API Key: sk-your-openai-key<\/li>\n\n\n\n<li>Model: gpt-4o, gpt-4-turbo, etc.<\/li>\n<\/ul>\n\n\n\n<p>Groq:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>API URL: https:\/\/api.groq.com\/openai\/v1\/chat\/completions<\/li>\n\n\n\n<li>API Key: your Groq API key<\/li>\n\n\n\n<li>Model: llama-3.1-70b-versatile, etc.<\/li>\n<\/ul>\n","protected":false},"featured_media":0,"parent":7427,"menu_order":1,"comment_status":"open","ping_status":"closed","template":"","doc_tag":[],"class_list":["post-7469","docs","type-docs","status-publish"],"jetpack_sharing_enabled":true,"jetpack_likes_enabled":true,"comment_count":0,"_links":{"self":[{"href":"https:\/\/pjsmith.me.uk\/index.php\/wp-json\/wp\/v2\/docs\/7469","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/pjsmith.me.uk\/index.php\/wp-json\/wp\/v2\/docs"}],"about":[{"href":"https:\/\/pjsmith.me.uk\/index.php\/wp-json\/wp\/v2\/types\/docs"}],"replies":[{"embeddable":true,"href":"https:\/\/pjsmith.me.uk\/index.php\/wp-json\/wp\/v2\/comments?post=7469"}],"version-history":[{"count":1,"href":"https:\/\/pjsmith.me.uk\/index.php\/wp-json\/wp\/v2\/docs\/7469\/revisions"}],"predecessor-version":[{"id":7470,"href":"https:\/\/pjsmith.me.uk\/index.php\/wp-json\/wp\/v2\/docs\/7469\/revisions\/7470"}],"up":[{"embeddable":true,"href":"https:\/\/pjsmith.me.uk\/index.php\/wp-json\/wp\/v2\/docs\/7427"}],"prev":[{"title":"Setting up 
Ollama","link":"https:\/\/pjsmith.me.uk\/index.php\/docs\/pjsprofilebars\/general-feature-notes\/situational-awareness\/setting-up-ollama\/","href":"https:\/\/pjsmith.me.uk\/index.php\/wp-json\/wp\/v2\/docs\/7429"}],"wp:attachment":[{"href":"https:\/\/pjsmith.me.uk\/index.php\/wp-json\/wp\/v2\/media?parent=7469"}],"wp:term":[{"taxonomy":"doc_tag","embeddable":true,"href":"https:\/\/pjsmith.me.uk\/index.php\/wp-json\/wp\/v2\/doc_tag?post=7469"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}