Tianlong Li's Projects
A curated collection of interview questions for algorithm engineers.
🧑🏫 59 implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
An experimental open-source attempt to make GPT-4 fully autonomous.
A ticket-grabbing script for Damai (damai.cn).
Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014-2021).
A curated list of security-related papers, articles, and resources focused on Large Language Models (LLMs). This repository aims to provide researchers, practitioners, and enthusiasts with insights into the security implications, challenges, and advancements surrounding these powerful models.
A summary of existing representative text datasets for LLMs.
TensorFlow code and pre-trained models for BERT
A Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding".
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
Challenging BIG-Bench Tasks and Whether Chain-of-Thought Can Solve Them
Chat凉宫春日 (Chat Haruhi Suzumiya), a chatbot that imitates anime-style dialogue, developed by 李鲁鲁, 冷子昂, and other students.
A Chinese version of CLIP that enables Chinese cross-modal retrieval and representation generation.
Contrastive Language-Image Pretraining
Implementation of CoCa, Contrastive Captioners are Image-Text Foundation Models, in Pytorch
Code for CRATE (Coding RAte reduction TransformEr).
A reading roadmap of deep learning papers for anyone eager to learn this amazing tech!
A simple implementation of fine-tuning BERT and other models on the GLUE datasets using the PyTorch and Transformers libraries.
The Sanlian edition of Jin Yong's novels.
Unify Efficient Fine-tuning of 100+ LLMs
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
A lightly trimmed fork of the NanoDet project that keeps only the PyTorch code; it works out of the box after download and supports real-time object detection on images, video files, and webcams.
A pipeline for improving the skills of large language models.
👑 Easy-to-use and powerful NLP library with 🤗 Awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications: 🗂 Text Classification, 🔍 Neural Search, ❓ Question Answering, ℹ️ Information Extraction, 📄 Document Intelligence, 💌 Sentiment Analysis, 🖼 Diffusion AIGC systems, and more.
PyTorch implementation of the U-Net for semantic segmentation of high-quality images
Representation Engineering: A Top-Down Approach to AI Transparency
RWKV is an RNN with transformer-level LLM performance that can be trained directly like a GPT (parallelizable). It combines the best of the RNN and the transformer: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
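The "trains like a GPT, runs like an RNN" claim rests on RWKV's linear time-mixing: at inference, each token only updates a fixed-size state rather than attending over the whole history. Below is a minimal NumPy sketch of that recurrence, assuming a simplified form of the WKV update; the variable names, the per-channel decay `w`, and the current-token bonus `u` are illustrative, and the real implementation uses a numerically stable variant.

```python
import numpy as np

def wkv_step(a, b, k_t, v_t, w, u):
    """One recurrent step of a simplified WKV time-mixing (toy sketch).

    a, b : running numerator/denominator state, shape [d]
    k_t, v_t : key/value vectors for the current token, shape [d]
    w : per-channel decay rate (> 0); u : bonus applied to the current token
    """
    # Output mixes the decayed history with the current token's contribution.
    out = (a + np.exp(u + k_t) * v_t) / (b + np.exp(u + k_t))
    # Update the state: decay the past, then absorb the current token.
    a = np.exp(-w) * a + np.exp(k_t) * v_t
    b = np.exp(-w) * b + np.exp(k_t)
    return out, a, b

# Toy usage: memory per step is O(d), independent of sequence length.
d, T = 4, 8
rng = np.random.default_rng(0)
a, b = np.zeros(d), np.zeros(d)
w, u = np.full(d, 0.5), np.full(d, 0.1)
for t in range(T):
    k_t, v_t = rng.normal(size=d), rng.normal(size=d)
    out, a, b = wkv_step(a, b, k_t, v_t, w, u)
```

Because the same computation can be unrolled across all positions at once during training, the model parallelizes like a GPT while keeping RNN-style constant-memory inference.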