Identify and enhance an existing state-of-the-art deep learning model for time series data to provide increased prediction accuracy and performance for analysis of irregular multivariate time series data. The model shall accept multiple input variables and identify their interactions and co-movements to predict an output variable, while handling characteristics of data irregularity including different sampling mechanisms (i.e. event-driven and periodic) and sampling rates. The enhanced model shall better utilise temporal information in the time series data (e.g. sampling intervals) to identify short- and long-term pattern occurrence, resulting in increased prediction accuracy of the target output variable.
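To make the notion of "different sampling mechanisms" concrete, the toy sketch below (hypothetical data, not from the project) aligns a periodically sampled variable with an event-driven one and derives an elapsed-time (delta-t) feature, one common way of exposing sampling intervals to a model:

```python
import pandas as pd

# Hypothetical example: two input series with different sampling mechanisms.
# 'sensor' is sampled periodically; 'event' is event-driven (irregular).
periodic = pd.DataFrame(
    {"t": [0.0, 1.0, 2.0, 3.0, 4.0], "sensor": [0.1, 0.2, 0.15, 0.3, 0.25]}
)
events = pd.DataFrame({"t": [0.4, 2.7], "event": [1.0, 2.0]})

# Align both series on a common timeline; event values are NaN
# at timestamps where no event occurred.
merged = pd.merge_ordered(periodic, events, on="t")

# Expose the irregularity as a feature: time elapsed since the most
# recent event observation (NaN before the first event).
last_event_t = merged["t"].where(merged["event"].notna()).ffill()
merged["dt_event"] = merged["t"] - last_event_t
```

The `dt_event` column is exactly the kind of temporal information (sampling intervals) the enhanced model is intended to exploit.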
Principal investigator: Philip Weerakody (email@example.com)
Area of science: Computer Science
Applications used: Python Machine Learning
In recent times, deep learning models have become one of the leading methods for modelling time series data, and network types such as LSTM and CNN have demonstrated exceptional forecasting performance compared with many widely used conventional time series models. Yet despite the significant evolution of time series modelling, from simple linear regressions to present-day cutting-edge deep learning networks, most models still struggle to accurately forecast values for irregular data, and the amount of research focused on optimising models to handle irregular data remains relatively low given the prevalence of this type of data.
This research intends to address this gap by taking the LSTM model, one of the leading deep learning architectures for handling irregular multivariate time series data, and building upon recent research by modifying its internal gate structure and associated update functions to improve predictive accuracy across several forms of data irregularity. The modified LSTM shall be compared not only with classical time series models and the base LSTM, but also with other state-of-the-art deep learning architectures.
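As an illustration of the kind of gate modification involved, the sketch below implements a time-decayed LSTM step in NumPy, where the cell memory is discounted as a function of the elapsed sampling interval. This is one published style of modification for irregular data, shown here under assumed shapes and weights; it is not necessarily the variant developed in this project:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def time_aware_lstm_step(x, dt, h, c, params):
    """One step of a time-decayed LSTM cell (a common gate modification
    for irregular sampling; a sketch, not this project's exact design).

    x  : input vector at this step
    dt : time elapsed since the previous observation
    h,c: previous hidden state and cell state
    """
    W, U, b = params["W"], params["U"], params["b"]  # stacked gate weights
    # Decay the carried cell memory by the sampling interval:
    # the longer the gap, the more past information is discounted.
    c = c * np.exp(-max(dt, 0.0))

    z = W @ x + U @ h + b
    H = h.size
    i = sigmoid(z[:H])          # input gate
    f = sigmoid(z[H:2 * H])     # forget gate
    g = np.tanh(z[2 * H:3 * H]) # candidate cell update
    o = sigmoid(z[3 * H:])      # output gate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

# Toy usage on an irregular sequence (hypothetical dimensions and weights).
rng = np.random.default_rng(0)
D, H = 3, 4
params = {"W": rng.normal(size=(4 * H, D)) * 0.1,
          "U": rng.normal(size=(4 * H, H)) * 0.1,
          "b": np.zeros(4 * H)}
h, c = np.zeros(H), np.zeros(H)
for x, dt in [(rng.normal(size=D), 0.5), (rng.normal(size=D), 3.2)]:
    h, c = time_aware_lstm_step(x, dt, h, c, params)
```

The decay term is the only change relative to a standard LSTM step; comparing such a cell against the unmodified base LSTM is the kind of evaluation the paragraph above describes.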
Model testing shall be performed on large data sets using the Pawsey Supercomputing Centre's Nimbus cloud service to evaluate the performance of the enhanced LSTM model.