A curated list for Efficient Large Language Models
AutoAWQ implements the AWQ algorithm for 4-bit quantization, with a 2x speedup during inference.
[MLSys 2024 Best Paper Award] AWQ: Activation-aware Weight Quantization for LLM Compression and Acceleration
4-bit quantization of LLaMA using GPTQ
QLoRA: Efficient Finetuning of Quantized LLMs
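The entries above (AWQ, GPTQ, QLoRA) all revolve around low-bit weight quantization. As a hedged illustration of the per-group symmetric 4-bit scheme at their core — the function names and group size here are illustrative, not any of these repos' actual APIs:

```python
import numpy as np

def quantize_4bit(w, group_size=128):
    """Per-group symmetric 4-bit quantization: each group of weights
    shares one floating-point scale; values map to integers in [-8, 7]."""
    groups = w.reshape(-1, group_size)
    scale = np.abs(groups).max(axis=1, keepdims=True) / 7.0  # one scale per group
    q = np.clip(np.round(groups / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate fp32 weights from 4-bit codes and scales."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(1024,)).astype(np.float32)
q, s = quantize_4bit(w)
w_hat = dequantize(q, s).reshape(-1)
err = np.abs(w - w_hat).max()  # bounded by half a quantization step per group
```

AWQ's contribution on top of this baseline is choosing per-channel scales so that activation-salient weights lose less precision; GPTQ instead minimizes layer output error when rounding.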
"Practical PyTorch Tutorial" (2nd edition): covers everything from beginner fundamentals to CV, NLP, and LLM applications, through to advanced engineering and production deployment. With this book's help, readers can readily master PyTorch and become capable deep learning engineers.
We jailbreak GPT-3.5 Turbo’s safety guardrails by fine-tuning it on only 10 adversarially designed examples, at a cost of less than $0.20 via OpenAI’s APIs.
A framework to evaluate the generalization capability of safety alignment for LLMs
A curation of awesome tools, documents and projects about LLM Security.
A reading list for large models safety, security, and privacy (including Awesome LLM Security, Safety, etc.).
[Arxiv] Aligning Modalities in Vision Large Language Models via Preference Fine-tuning
i. A practical application of a Transformer (ViT) to 2-D physiological-signal (EEG) classification tasks; also applicable to EMG, EOG, ECG, etc. ii. Includes attention over the spatial dimension…
ECG classification using transformers
Recent LLM-based CV and related works. Welcome to comment/contribute!
A ViT based transformer applied on multi-channel time-series EEG data for motor imagery classification
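A common tokenization step behind ViT-on-EEG repos like the one above is slicing the multi-channel recording into temporal patches, one flattened token per patch. A minimal NumPy sketch — the shapes, patch length, and function name are assumptions for illustration, not this repo's code:

```python
import numpy as np

def patchify(eeg, patch_len=50):
    """Split a (channels, time) EEG recording into non-overlapping
    temporal patches and flatten each across channels into one token."""
    c, t = eeg.shape
    n = t // patch_len                       # number of patches (extra samples dropped)
    patches = eeg[:, : n * patch_len].reshape(c, n, patch_len)
    # token i = all channels' samples in patch i, flattened: (n, c * patch_len)
    return patches.transpose(1, 0, 2).reshape(n, c * patch_len)

# e.g. a 22-channel motor-imagery trial, 1000 time samples
eeg = np.random.default_rng(1).normal(size=(22, 1000))
tokens = patchify(eeg)                       # (20, 1100)
```

Each token would then be linearly projected to the model dimension and fed, with positional embeddings, into a standard Transformer encoder.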
Official implementation of paper "Cumulative Reasoning With Large Language Models" (https://arxiv.org/abs/2308.04371)
MIT-BIH ECG recognition using 1d CNN with TensorFlow2 and PyTorch
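A 1-D CNN for MIT-BIH beats, as in the entry above, reduces to convolution, ReLU, pooling, and a linear head. A hedged NumPy sketch with random (untrained) weights — filter counts and the 5-class head are illustrative choices, not the repo's architecture:

```python
import numpy as np

def conv1d(x, kernels, stride=1):
    """Valid 1-D convolution of a single-channel signal with a kernel bank."""
    k = kernels.shape[1]
    out_len = (len(x) - k) // stride + 1
    windows = np.stack([x[i * stride : i * stride + k] for i in range(out_len)])
    return windows @ kernels.T               # (out_len, n_kernels)

rng = np.random.default_rng(0)
beat = rng.normal(size=360)                  # one beat-centered window at MIT-BIH's 360 Hz
kernels = rng.normal(size=(8, 7))            # 8 filters of width 7 (random, untrained)
feats = np.maximum(conv1d(beat, kernels), 0.0)   # convolution + ReLU
pooled = feats.max(axis=0)                   # global max-pool over time -> (8,)
w_out = rng.normal(size=(8, 5))
logits = pooled @ w_out                      # scores for 5 beat classes (e.g. AAMI groups)
```

In practice the repos above stack several such conv/pool layers and train end-to-end with cross-entropy in TensorFlow 2 or PyTorch.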
Application of deep learning and convolutional networks for ECG classification
Deep learning ECG models implemented using PyTorch
Code for the AAAI 2024 Oral paper "OWQ: Outlier-Aware Weight Quantization for Efficient Fine-Tuning and Inference of Large Language Models".
This project shares the technical principles behind large models together with hands-on experience (large-model engineering and real-world application deployment).
Learning Large Language Models (LLM)
A professionally curated list of awesome resources (paper, code, data, etc.) on transformers in time series.