About me

I am a fourth-year PhD student under the supervision of Maarten de Rijke and Christof Monz at ILPS, University of Amsterdam. I am interested in machine learning and NLP, especially Natural Language Generation and Open-domain Dialogue Systems (chatbots 🤖). I also take photos as a hobby 📸.


Interests

  • Artificial Intelligence
  • Natural Language Processing
  • Computational Linguistics
  • Information Retrieval
  • Photography


Education

  • PhD Student, 2017-2021 (expected)

    University of Amsterdam, The Netherlands

  • Master of Engineering, 2014-2017

    Northwest A&F University, Yangling, China

  • Bachelor of Engineering, 2010-2014

    Northwest A&F University, Yangling, China

Experience

Applied Science Intern


Jul 2020 – Sep 2020 Amsterdam, the Netherlands
This was a remote internship, as my relocation to the originally agreed location, Berlin, was prevented by the COVID-19 pandemic. During this period, I worked on a review summarization task with the Subjective NLP team (based mostly in Barcelona and partly in Berlin).

NLP Summit


Jun 2019 – Jun 2019 Zurich, Switzerland
  • Thanks to Google for covering all expenses
  • Great opportunity to communicate with peers and Googlers
  • Presented my WWW ‘19 poster
  • Nice crystal clear water everywhere!

Poster presentation at TheWebConf

TheWebConf 2019

May 2019 – May 2019 San Francisco, California
  • Presented my poster for our paper accepted at TheWebConf ‘19
  • Thanks to all the coauthors!
  • Superb coastal views!
  • The Golden Gate Bridge is glorious!

Poster presentation at SCAI Workshop

EMNLP 2018

Oct 2018 – Nov 2018 Brussels, Belgium
  • Presented my poster for our paper accepted at SCAI Workshop
  • Awarded student travel grant (€400)
  • A nice city where Dutch-style, French-style, and other architectural styles blend together

PhD student

University of Amsterdam

Oct 2017 – Present Amsterdam, The Netherlands

Master of Engineering

Northwest A&F University

Sep 2014 – Jun 2017 Yangling, China
  • Thesis title: Research on Feature Representation and Optimization Methods in Structured Object Tracking
  • Supervisor: Jifeng Ning

Bachelor of Engineering

Northwest A&F University

Sep 2010 – Jun 2014 Yangling, China
  • Thesis title: Implementation of Single Image Haze Removal Using Dark Channel Prior
  • Advisors: Dr. Yaojun Geng and Prof. Jifeng Ning

Recent Posts

Transformer Align Model

Jointly Learning to Align and Translate with Transformer Models

Compressive Transformers

Built on top of Transformer-XL, the Compressive Transformer condenses old memories (hidden states) and stores them in a compressed memory buffer before discarding them entirely. This model is well suited to long-range sequence learning, but may add unnecessary computational overhead for tasks with only short sequences.
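The compression step can be illustrated with one of the simplest compression functions (mean pooling over groups of consecutive states); this is a minimal sketch under that assumption, not the model's actual learned compression network, and the function name `compress` is my own:

```python
import numpy as np

def compress(old_memories, rate=3):
    """Mean-pool groups of `rate` consecutive hidden states.

    `old_memories` has shape (n_states, d_model); states that would
    otherwise be discarded are condensed by a factor of `rate`.
    """
    mem = np.asarray(old_memories, dtype=float)
    n = (len(mem) // rate) * rate            # drop a ragged tail, if any
    return mem[:n].reshape(-1, rate, mem.shape[-1]).mean(axis=1)

# 6 old hidden states of size 4 become 2 compressed memories
compressed = compress(np.ones((6, 4)), rate=3)
print(compressed.shape)  # (2, 4)
```

The compression rate trades memory span against fidelity: a larger `rate` keeps a longer history in the same buffer, at the cost of coarser summaries.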

Visualizing the Loss Landscape of Neural Nets

What characterizes a neural model that is easier to train and easier to generalize?

Adaptive Computation Time

My notes for the paper Adaptive Computation Time for Recurrent Neural Networks. Additive vs. multiplicative halting probability: in the paper (footnote 1), the authors thoroughly discuss their considerations for deciding the computation time.
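In ACT's additive scheme, the network keeps computing until the running sum of halting probabilities would reach 1 - ε, and the leftover mass (the remainder) weights the final state. A toy sketch of just that accumulation rule, assuming the halting probabilities are given (in the paper they come from a learned sigmoid unit):

```python
def act_halt(halt_probs, eps=0.01):
    """Return (steps taken, remainder) under the ACT halting rule:
    stop at the first step where the running sum would reach 1 - eps."""
    total = 0.0
    for n, p in enumerate(halt_probs, start=1):
        if total + p >= 1 - eps:
            return n, 1 - total   # remainder weights the final state
        total += p
    return len(halt_probs), 1 - total

print(act_halt([0.3, 0.3, 0.5]))  # (3, 0.4)
```

The small ε guarantees halting is possible after a single step even when the first halting probability is close to 1.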

A Hub for Transformer Blogs and Papers

This is a growing list of pointers to useful blog posts and papers related to transformers.
  • Transformers explained. Blog: The Illustrated Transformer has many intuitive animations of how transformer models work. Blog: Universal Transformers introduces the idea of recurrence among layers. Blog: Transformer vs RNN and CNN for Translation Task.
  • GNNs: similarities and differences. Blog: Transformers are Graph Neural Networks bridges transformer models and Graph Neural Networks.
  • Transformer improvements. Blog: DeepMind Releases a New Architecture and a New Dataset to Improve Long-Term Memory in Deep Learning Systems. Neural Turing Machine + transformer?


  • s.jiang AT uva DOT nl
  • Science Park 608B, Amsterdam, North Holland 1098 XH
  • Mint green building, ground floor, room 0.07.
  • Mon 10:00 to 18:00
    Tue 10:00 to 18:00
    Wed 10:00 to 18:00
    Thu 10:00 to 18:00
    Fri 10:00 to 18:00
  • Twitter