
M.I.M.I.R - Multi-agent Intelligent Memory & Insight Repository

by orneryd
ggml-webgpu.h (328 B)
#pragma once

#include "ggml.h"
#include "ggml-backend.h"

#ifdef __cplusplus
extern "C" {
#endif

#define GGML_WEBGPU_NAME "WebGPU"

// Needed for examples in ggml
GGML_BACKEND_API ggml_backend_t ggml_backend_webgpu_init(void);
GGML_BACKEND_API ggml_backend_reg_t ggml_backend_webgpu_reg(void);

#ifdef __cplusplus
}
#endif
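The header declares two entry points for the WebGPU backend: ggml_backend_webgpu_init() to create a backend instance and ggml_backend_webgpu_reg() to obtain its backend registration. Below is a minimal sketch of how the init function might be used, assuming ggml has been built with the WebGPU backend enabled; it relies only on the standard ggml-backend helpers ggml_backend_name() and ggml_backend_free().

// Minimal usage sketch (assumes a ggml build with WebGPU support):
// initialize the backend, print its name, then release it.
#include <stdio.h>
#include "ggml-backend.h"
#include "ggml-webgpu.h"

int main(void) {
    ggml_backend_t backend = ggml_backend_webgpu_init();
    if (backend == NULL) {
        fprintf(stderr, "failed to initialize the WebGPU backend\n");
        return 1;
    }
    printf("backend: %s\n", ggml_backend_name(backend));
    ggml_backend_free(backend);
    return 0;
}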


MCP directory API

All information about MCP servers is available through our MCP directory API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/orneryd/Mimir'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.