JERRYLSU

NLP Resources


Papers

  • Effective Approaches to Attention-based Neural Machine Translation
  • Attention Is All You Need
  • BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
  • RASA: Dialogue Transformers
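The attention papers listed above build on the scaled dot-product attention formulated in Attention Is All You Need, softmax(QKᵀ/√d_k)·V. A minimal pure-Python sketch (an illustration added here, not part of the original resource list):

```python
import math

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V for lists of vectors (no batching)."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of the query with every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        # Numerically stable softmax over the scores
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # Attention output: weights-weighted sum of the value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# One query attending over two key/value pairs
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
result = scaled_dot_product_attention(Q, K, V)
```

The query aligns more with the first key, so the output leans toward the first value vector while still mixing in the second.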

Projects

  • Transformers
  • Microsoft NLP Best Practices: best practices and examples for NLP.
  • RASA




Published

Jul 26, 2020

Category

NLP

Tags

  • NLP

