Name: Fuqiang Gu
Type: User
Company: National University of Singapore
Bio: I am a researcher at the National University of Singapore, working on mobile sensing, activity recognition, machine learning, and robotics.
Location: Singapore
Fuqiang Gu's Projects
3D Densely Connected Convolutional Network (3D-DenseNet for action recognition)
Activity recognition using mobile/wearable devices such as smartphones.
Chainer implementation of an adversarial autoencoder (AAE)
Open source simulator for autonomous vehicles built on Unreal Engine / Unity, from Microsoft AI & Research
Automatic video description generation with GPU training
A PyTorch implementation of the Transformer model in "Attention is All You Need" (see the attention sketch after this list).
"Attention in Convolutional LSTM for Gesture Recognition" in NIPS 2018
Pervasive Attention: 2D Convolutional Networks for Sequence-to-Sequence Prediction
A comprehensive list of PyTorch-related content on GitHub, such as different models, implementations, helper libraries, tutorials, etc.
BACKpropagation PACKage - A backpack for PyTorch that extends the backward pass of feedforward networks to compute quantities beyond the gradient
Notebooks related to Bayesian methods for machine learning
PyTorch implementation of β-VAE (see the loss sketch after this list)
Spatio-temporal backpropagation for spiking neural networks (SNNs)
Disentangled Variational Auto-Encoder in TensorFlow / Keras (Beta-VAE)
C3D is a modified version of BVLC Caffe to support 3D ConvNets.
PyTorch port of the C3D network, with Sports1M weights
92.45% on CIFAR-10 in Torch
Public-facing notes page
PyTorch code for CSI-Net, including data and pre-trained models
Code for the 2018 EMNLP Interpretability Workshop Paper "Interpreting Neural Networks with Nearest Neighbors"
Mobile sensing
An indoor fingerprinting localization algorithm that uses CSI from WiFi
Implementation of some deep learning algorithms.
A PyTorch implementation of DenseNet.
The DESPOT online POMDP solver
Tutorials for using DESPOT with ROS
Variational autoencoder for unsupervised and disentangled representation learning of content and motion features in sequential data (Mandt et al.).
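For reference, a minimal sketch of the scaled dot-product attention at the heart of the Transformer project listed above. This is an illustrative PyTorch snippet, not code from that repository; the tensor layout and mask convention are assumptions.

import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k); mask broadcasts over the score shape (assumed layout)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float('-inf'))
    weights = F.softmax(scores, dim=-1)  # attention weights over key positions
    return weights @ v                   # weighted sum of the value vectors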
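The two β-VAE entries above share the same objective: the standard VAE evidence lower bound with the KL term scaled by a factor β > 1 to encourage disentangled latents. Below is a minimal loss sketch assuming a Gaussian encoder and a Bernoulli (sigmoid-output) decoder; the function name and default β are illustrative, not taken from either repository.

import torch
import torch.nn.functional as F

def beta_vae_loss(recon_x, x, mu, logvar, beta=4.0):
    # Reconstruction term: Bernoulli negative log-likelihood of the input
    recon = F.binary_cross_entropy(recon_x, x, reduction='sum')
    # Closed-form KL divergence between N(mu, sigma^2) and the standard normal prior
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    # beta > 1 upweights the KL penalty, trading reconstruction for disentanglement
    return recon + beta * kl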