Calculate token/s & GPU memory requirement for any LLM. Supports llama.cpp/ggml/bnb/QLoRA quantization
Updated Nov 4, 2023 - JavaScript
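The memory side of such a calculator usually comes down to a simple rule of thumb: parameter count times bytes per parameter, plus some overhead for activations and the KV cache. A minimal sketch of that heuristic (the exact formula and the 20% overhead factor are assumptions, not taken from this project):

```javascript
// Rough GPU memory estimate for LLM inference: params * bytes-per-param,
// plus ~20% overhead (assumed) for activations and KV cache.
function estimateGpuMemoryGB(paramsBillions, bitsPerParam, overhead = 1.2) {
  const bytes = paramsBillions * 1e9 * (bitsPerParam / 8);
  return (bytes * overhead) / 2 ** 30; // convert bytes to GiB
}

// e.g. a 7B model: ~15.6 GB at fp16, ~3.9 GB with 4-bit quantization
// (which is why QLoRA/llama.cpp quantization fits 7B models on consumer GPUs)
const fp16 = estimateGpuMemoryGB(7, 16);
const q4 = estimateGpuMemoryGB(7, 4);
```

The same parameter-count arithmetic underlies most token/s estimates too, since decoding is typically memory-bandwidth bound.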
Browser-compatible JS library for running language models
A browser extension that lets you chat with YouTube videos using Llama2-7b. Built using 🤗 Inference Endpoints and Vercel's AI SDK.
A Stable Diffusion Demo with React and FastAPI
AGI using nodeJS and ECS
AI Makerspace: Blueprints for developing machine learning applications with state-of-the-art technologies.
Hugging Face's Zapier Integration 🤗⚡️
Application for playing Quickdraw powered by a Vision Transformer
Detecting Fake News using AI
A universal Qdrant table frontend based on transformers.js
A novel approach for security and user experience of Graphical Password Authentication.
A game where you need to guess whether a tweet comes from a human, or from a neural network language model trained on a category of tweets.
A simple NPM interface for seamlessly interacting with 36 Large Language Model (LLM) providers, including OpenAI, Anthropic, Google Gemini, Cohere, Hugging Face Inference, NVIDIA AI, Mistral AI, AI21 Studio, LLaMA.CPP, and Ollama, and hundreds of models.
Simply input any YouTube video URL, and watch in awe as AI analyzes the content and provides answers to your questions in real-time. 🤯 Study smarter, save time, and unlock a whole new level of video interaction!
Experimental tl;dr summaries for datasets on the Hugging Face Hub!
Generate emojis from an image.
Out-of-the-box text emotion / sentiment analysis application, supports Chinese and English.
Document chat using Web LLM and Transformers.js embeddings
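Document chat of this kind typically embeds document chunks with Transformers.js, then retrieves the chunks most similar to the user's question before handing them to the LLM. A hedged sketch, assuming the `@xenova/transformers` package and the `Xenova/all-MiniLM-L6-v2` model (both assumptions, not confirmed by this project):

```javascript
// Cosine similarity between two embedding vectors of equal length.
function cosineSim(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Embed texts with Transformers.js (dynamic import so the helper above
// stays usable without the package installed). Model id is an assumption.
async function embed(texts) {
  const { pipeline } = await import("@xenova/transformers");
  const extractor = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");
  const out = await extractor(texts, { pooling: "mean", normalize: true });
  return out.tolist(); // one embedding array per input text
}
```

Retrieval then reduces to ranking chunk embeddings by `cosineSim` against the query embedding.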
A simple node.js example that generates an image using StableDiffusion via Hugging Face Inference API.
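The Inference API pattern such an example follows is a single POST to the model endpoint with a bearer token, receiving raw image bytes back. A minimal sketch, assuming a Stable Diffusion model id and an `HF_TOKEN` environment variable (both assumptions):

```javascript
// Build the request for a hosted text-to-image model on the
// Hugging Face Inference API.
function buildRequest(model, prompt, token) {
  return {
    url: `https://api-inference.huggingface.co/models/${model}`,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ inputs: prompt }),
    },
  };
}

// Fetch the generated image and return it as a Buffer (Node 18+ global fetch).
async function generateImage(prompt) {
  const { url, options } = buildRequest(
    "stabilityai/stable-diffusion-2", // assumed model id
    prompt,
    process.env.HF_TOKEN // API token read from the environment
  );
  const res = await fetch(url, options);
  if (!res.ok) throw new Error(`Inference API error: ${res.status}`);
  // Text-to-image endpoints return raw image bytes, not JSON.
  return Buffer.from(await res.arrayBuffer());
}
```

The caller can write the returned Buffer to disk, e.g. `fs.writeFileSync("out.png", buf)`.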
A tweet sentiment analysis tool.