Stars
Fine-grained parallelism with sub-nanosecond overhead in Zig
A massively parallel, high-level programming language
BlackHole is a modern macOS audio loopback driver that allows applications to pass audio to other applications with zero additional latency.
A macOS application to run local models like Mistral, with the option to enhance conversations by switching to GPT-4.
Official Code for Stable Cascade
A Zig and Swift PoC for lightweight, resource-efficient native macOS apps.
All-in-one desktop app for running LLMs locally.
DeepSeekMath: Pushing the Limits of Mathematical Reasoning in Open Language Models
A simple vector store written in TypeScript with a WASM backend (a brute-force sketch of the idea follows after this list).
Letta (formerly MemGPT) is a framework for creating LLM services with memory.
A fast inference library for running LLMs locally on modern consumer-class GPUs
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Visualizer for neural network, deep learning and machine learning models
Take 3D stereoscopic screenshots in the visionOS emulator.
A reactive state machine implementation for OpenAI Chat Completions API.
CSS trick/bug to display a brighter white by exploiting browsers' HDR capability and Apple's EDR system
Display a very bright white color on HDR-enabled displays.
Run any open-source LLM, such as Llama or Mistral, as an OpenAI-compatible API endpoint in the cloud (a client-side request sketch follows after this list).
An esbuild plugin that generates an HTML file (a minimal plugin sketch follows after this list).
TypeScript for Tiny IoT Devices (ESP32, RP2040, ...)
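The vector-store entry above mentions a TypeScript store with a WASM backend. As a rough illustration only (the project's real API and its WASM-accelerated math are not shown here), a brute-force in-memory store with cosine-similarity search can look like this:

```ts
// Brute-force in-memory vector store in plain TypeScript.
// The real project offloads the similarity math to a WASM backend;
// here everything runs in JS purely for illustration.
type Entry = { id: string; vector: number[]; metadata?: Record<string, unknown> };

function cosine(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

class VectorStore {
  private entries: Entry[] = [];

  add(entry: Entry): void {
    this.entries.push(entry);
  }

  // Return the k stored entries most similar to the query vector.
  query(vector: number[], k = 5): Entry[] {
    return [...this.entries]
      .sort((a, b) => cosine(b.vector, vector) - cosine(a.vector, vector))
      .slice(0, k);
  }
}

// Usage: index a few embeddings, then search.
const store = new VectorStore();
store.add({ id: "a", vector: [0.1, 0.9] });
store.add({ id: "b", vector: [0.8, 0.2] });
console.log(store.query([0.9, 0.1], 1)); // closest entry: "b"
```

A production store would delegate the inner loop to the WASM module and add persistence; the sketch only shows the query shape.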
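For the entry about serving open-source LLMs behind an OpenAI-compatible API, the practical upshot is that any client speaking the Chat Completions wire format can talk to the deployment. A hedged sketch using plain `fetch`, where the base URL and model name are placeholders rather than anything prescribed by that project:

```ts
// Calling a self-hosted, OpenAI-compatible server with plain fetch.
// BASE_URL and the model id are placeholders for whatever the deployment exposes.
const BASE_URL = "http://localhost:3000/v1";

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "mistral-7b-instruct", // assumed model id
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

chat("Say hello.").then(console.log).catch(console.error);
```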
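The esbuild-plugin entry names a plugin that emits an HTML file; the idea is easy to see with esbuild's public plugin API (`setup`/`onEnd`). This is a from-scratch sketch under assumed paths and markup, not that plugin's actual options or output:

```ts
// A minimal sketch of an HTML-emitting esbuild plugin.
// Entry point, output paths, and the generated markup are placeholders.
import * as esbuild from "esbuild";
import { mkdir, writeFile } from "node:fs/promises";

const htmlPlugin: esbuild.Plugin = {
  name: "emit-html",
  setup(build) {
    // onEnd runs after every build, so the HTML stays in sync in watch mode too.
    build.onEnd(async () => {
      const html = [
        "<!DOCTYPE html>",
        '<html><head><meta charset="utf-8"></head>',
        '<body><script type="module" src="./main.js"></script></body></html>',
      ].join("\n");
      await mkdir("dist", { recursive: true });
      await writeFile("dist/index.html", html);
    });
  },
};

esbuild
  .build({
    entryPoints: ["src/main.ts"], // hypothetical entry point
    bundle: true,
    outdir: "dist",
    plugins: [htmlPlugin],
  })
  .catch(() => process.exit(1));
```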