
M.I.M.I.R - Multi-agent Intelligent Memory & Insight Repository

by orneryd
ggml-opencl.h (581 B)
#ifndef GGML_OPENCL_H
#define GGML_OPENCL_H

#include "ggml.h"
#include "ggml-backend.h"

#ifdef __cplusplus
extern "C" {
#endif

//
// backend API
//

GGML_BACKEND_API ggml_backend_t ggml_backend_opencl_init(void);

GGML_BACKEND_API bool ggml_backend_is_opencl(ggml_backend_t backend);

GGML_BACKEND_API ggml_backend_buffer_type_t ggml_backend_opencl_buffer_type(void);

GGML_BACKEND_API ggml_backend_buffer_type_t ggml_backend_opencl_host_buffer_type(void);

GGML_BACKEND_API ggml_backend_reg_t ggml_backend_opencl_reg(void);

#ifdef __cplusplus
}
#endif

#endif // GGML_OPENCL_H
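A minimal sketch of how the entry points declared in this header might be used, assuming the ggml sources and an OpenCL-enabled build are available (this example is illustrative and not part of the repository; only functions declared above, plus the standard ggml_backend_free from ggml-backend.h, are called):

```c
#include <stdio.h>
#include "ggml-opencl.h"

int main(void) {
    /* Initialize the OpenCL backend; returns NULL if no usable device. */
    ggml_backend_t backend = ggml_backend_opencl_init();
    if (backend == NULL) {
        fprintf(stderr, "OpenCL backend unavailable\n");
        return 1;
    }

    /* Verify the handle really refers to the OpenCL backend. */
    if (ggml_backend_is_opencl(backend)) {
        printf("OpenCL backend initialized\n");
    }

    /* Release backend resources (declared in ggml-backend.h). */
    ggml_backend_free(backend);
    return 0;
}
```

The buffer-type getters (ggml_backend_opencl_buffer_type and the host variant) would typically be passed to ggml's allocation APIs when placing tensors on the device or in pinned host memory.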

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/orneryd/Mimir'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.