Linyi Yang

Postdoctoral Associate

Westlake NLP Group, Westlake University

Biography

I’m a Postdoctoral Associate in the Westlake NLP group, working with Yue Zhang. Previously, I graduated with a PhD from the Insight Centre, University College Dublin, where I worked with Barry Smyth and Ruihai Dong.

I am broadly interested in eXplainable Artificial Intelligence (XAI), improving the generalization of neural networks for real-world natural language understanding, and their applications in the financial domain.

Interests

  • Natural Language Processing
  • Financial Forecasting
  • Causal Inference
  • Explainable AI (XAI)

Education

  • PhD in Artificial Intelligence, 2017-2021

    University College Dublin

  • MSc in Artificial Intelligence, 2017

    University College Dublin

  • BSc in Computer Science, 2016

    Harbin Engineering University

News

  • 2022-Mar: One co-first-author long paper has been accepted by the ACL 2022 main conference. This is my third CCF-A paper published at Westlake University, with the great help of Yue Zhang. (core: A*, CCF: A)
  • 2021-Dec: One first-author long paper has been accepted by AAAI 2022 (15% acceptance rate). (core: A*, CCF: A)
  • It’s my great honor to serve as an Area Chair (AC) at EMNLP-22!
  • PC Member/Reviewer: CIKM-20; COLING-20; ACL-21; SIGIR-21; CIKM-21; EMNLP-21; IEEE Access
  • We have started tracking progress on the topic of FinNLP. Feel free to add any relevant items to Project Link
  • 2021-May: One first-author long paper has been accepted by ACL 2021 (21% acceptance rate). (core: A*, CCF: A)
  • 2020-Oct: One first-author long paper has been accepted by COLING 2020 (Top 5% submissions). (core: A, CCF: B)
  • 2020-Sep: One co-first author resource paper has been accepted by CIKM 2020 (20% acceptance rate). (core: A, CCF: B)
  • 2019-Dec: One first-author long paper has been accepted by WWW 2020 (19% acceptance rate). (core: A*, CCF: A)
  • 2019-Aug: One first-author long paper has been accepted by FinNLP Workshop@IJCAI-19 (Oral).
  • 2018-Nov: Our paper won a best paper nomination at CCIS 2018 (Best Paper Candidate).
  • 2017-Dec: My first paper was published at AICS 2017.

Recent Publications


NumHTML: Numeric-Oriented Hierarchical Transformer Model for Multi-task Financial Forecasting

This paper describes a numeric-oriented hierarchical transformer model (NumHTML) for predicting stock returns and financial risk using multi-modal aligned earnings call data, taking advantage of the different categories of numbers (monetary, temporal, percentages, etc.) and their magnitudes.

Deep Neural Approach for Financial Analysis

We demonstrate the success of deep learning methods in modeling unstructured data for financial applications, including explainable deep learning models, multi-modal multi-task learning frameworks, and counterfactual generation systems for explanations and data augmentation.

Exploring the Efficacy of Automatically Generated Counterfactuals for Sentiment Analysis

While state-of-the-art NLP models have achieved excellent performance on a wide range of tasks in recent years, important questions are being raised about their robustness and their underlying sensitivity to systematic biases that may exist in their training and test data.

Generating Plausible Counterfactual Explanations for Deep Transformers in Financial Text Classification

This paper proposes a novel methodology for producing plausible counterfactual explanations, whilst exploring the regularization benefits of adversarial training on language models in the domain of FinTech. Exhaustive quantitative experiments demonstrate that not only does this approach improve the model accuracy when compared to the current state-of-the-art and human performance, but it also generates counterfactual explanations which are significantly more plausible based on human trials.

MAEC: A Multimodal Aligned Earnings Conference Call Dataset for Financial Risk Prediction

We present the approach used in this work as providing a suitable framework for processing similar forms of data in the future. The resulting dataset is more than six times larger than those currently available to the research community and we discuss its potential in terms of current and future research challenges and opportunities.

HTML: Hierarchical Transformer-based Multi-task Learning for Volatility Prediction

This paper proposes a novel hierarchical transformer-based multi-task architecture designed to harness the text and audio data from quarterly earnings conference calls to predict future price volatility in the short and long term. This includes a comprehensive comparison against a variety of baselines, which demonstrates very significant improvements in prediction accuracy, in the range of 17%–49% compared to the current state of the art.

Leveraging BERT to Improve the FEARS Index for Stock Forecasting

In this paper, we take into account the semantics of the FEARS search terms by leveraging Bidirectional Encoder Representations from Transformers (BERT), and further apply a self-attention deep learning model to our refined FEARS index for stock return prediction.

Explainable Text-Driven Neural Network for Stock Prediction (*Best Paper Nominated)

We use an output attention mechanism to allocate different weights to different days according to their contribution to stock price movement. Thorough empirical studies based upon the historical prices of several individual stocks demonstrate the superiority of our proposed method in stock price prediction compared to state-of-the-art methods.

Multi-level Attention-Based Neural Networks for Distant Supervised Relation Extraction

We propose a dual-level attention mechanism for the relation extraction problem.
