
wolfhu's Projects

abcnn

Implementation of ABCNN (Attention-Based Convolutional Neural Network) in TensorFlow

adda_mnist64

Adversarial Discriminative Domain Adaptation with MNIST 64x64 in Lasagne-Theano

adgat

Modeling the Momentum Spillover Effect for Stock Prediction via Attribute-Driven Graph Attention Networks

adv-alstm

Code for the paper "Enhancing Stock Movement Prediction with Adversarial Training" (IJCAI 2019)

adversarial

Code and hyperparameters for the paper "Generative Adversarial Networks"

agriculture_knowledgegraph

Agricultural knowledge graph (KG): information retrieval, named entity recognition, relation extraction, taxonomy construction, and data mining for the agriculture domain

allrank

allRank is a framework for training learning-to-rank neural models based on PyTorch.

anago

Bidirectional LSTM-CRF for Sequence Labeling. Easy-to-use and state-of-the-art performance.
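The CRF half of that architecture scores whole tag sequences rather than independent per-token labels, and decoding reduces to the Viterbi algorithm over emission and transition scores. A standalone NumPy sketch of that decode (illustrative only, not anago's actual API):

```python
import numpy as np

def viterbi(emissions, transitions):
    """Find the highest-scoring tag sequence for a CRF layer.

    emissions:   (T, K) per-token scores for K tags
    transitions: (K, K) score of moving from tag i to tag j
    """
    T, K = emissions.shape
    score = emissions[0].copy()          # best score ending in each tag
    back = np.zeros((T, K), dtype=int)   # backpointers for path recovery
    for t in range(1, T):
        total = score[:, None] + transitions + emissions[t]
        back[t] = total.argmax(axis=0)
        score = total.max(axis=0)
    # Walk backpointers from the best final tag.
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# With zero transition scores, decoding falls back to per-token argmax.
tags = viterbi(np.array([[1.0, 0.0], [0.0, 1.0]]), np.zeros((2, 2)))
```

In the full model, the emission scores come from the BiLSTM's per-token outputs and the transition matrix is learned jointly with it.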

anyq

FAQ-based Question Answering System

app-dl

Deep Learning and applications in Startups, CV, Text Mining, NLP

arena-baselines

Arena: A General Evaluation Platform and Building Toolkit for Single/Multi-Agent Intelligence. AAAI 2020.

attentionxml

Implementation for "AttentionXML: Label Tree-based Attention-Aware Deep Model for High-Performance Extreme Multi-Label Text Classification"

audiolm-pytorch

Implementation of AudioLM, a SOTA language modeling approach to audio generation out of Google Research, in PyTorch

baselines

OpenAI Baselines: high-quality implementations of reinforcement learning algorithms

began

Boundary Equilibrium Generative Adversarial Networks implementation in TensorFlow

bert-as-service

Mapping a variable-length sentence to a fixed-length vector using BERT model
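The key step behind that mapping is pooling: collapsing a (sequence length × hidden size) matrix of per-token BERT activations into one fixed-length vector, so sentences of any length become comparable. A minimal NumPy sketch of mean pooling (which, to my understanding, corresponds to bert-as-service's REDUCE_MEAN strategy), with random vectors standing in for real BERT activations:

```python
import numpy as np

def pool_to_fixed_vector(token_embeddings):
    """Mean-pool a (seq_len, hidden) matrix of token embeddings
    into a single fixed-length sentence vector."""
    return token_embeddings.mean(axis=0)

# Two "sentences" of different lengths map to same-sized vectors.
rng = np.random.default_rng(0)
short = pool_to_fixed_vector(rng.normal(size=(5, 768)))   # 5 tokens
long = pool_to_fixed_vector(rng.normal(size=(12, 768)))   # 12 tokens
```

The fixed dimensionality (768 for BERT-base) is what makes the output usable directly as features for downstream similarity search or classification.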

bert-kbqa-nlpcc2017

A trial of KBQA based on BERT for NLPCC2016/2017 Task 5 (a BERT-based Chinese knowledge-base question-answering exercise; the code runs end to end)

bertviz

Tool for visualizing attention in the Transformer model (BERT, GPT-2, XLNet, RoBERTa, CTRL, etc.)

bi-att-flow

Bi-directional Attention Flow (BiDAF) network is a multi-stage hierarchical process that represents context at different levels of granularity and uses a bi-directional attention flow mechanism to achieve a query-aware context representation without early summarization.
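The context-to-query half of that attention flow can be sketched in a few lines: each context position attends over the query words to build a query-aware summary. This simplified version uses a plain dot-product similarity where BiDAF proper uses a trainable trilinear function:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def context_to_query_attention(C, Q):
    """For each of T context vectors, compute an attended summary
    of the J query vectors (BiDAF's C2Q direction).

    C: (T, d) context representations
    Q: (J, d) query representations
    Similarity is a plain dot product here; BiDAF itself scores
    pairs with a learned trilinear function.
    """
    S = C @ Q.T                 # (T, J) similarity matrix
    a = softmax(S, axis=1)      # attention over query words per context word
    return a @ Q                # (T, d) query-aware context summaries

rng = np.random.default_rng(1)
C = rng.normal(size=(6, 8))    # 6 context words, dim 8
Q = rng.normal(size=(3, 8))    # 3 query words
U = context_to_query_attention(C, Q)
```

The query-to-context direction runs analogously over the columns of the same similarity matrix, and the two attended representations are concatenated with the raw context before the modeling layers.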

biblosa

Bi-Directional Block Self-Attention
