
Analysis: Feature Importance by Time Steps Before Prediction

Updated: Apr 19, 2023

The team was able to extract information on feature importance by time steps before a prediction by examining the raw prediction output of the Temporal Fusion Transformer (TFT) model from the PyTorch Forecasting library. A visualization of this information is presented in the interactive graph below. Note that the x-axis, from left to right, runs from the time step furthest from the prediction (i.e. the oldest time step) to the most recent time step, which in this study is one hour before the predicted time t.
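As a rough illustration of the post-processing involved, the sketch below averages per-time-step variable-selection weights over a batch of samples to get one importance value per feature per time step. The array shapes and the `importance_by_time_step` helper are illustrative assumptions standing in for the TFT's raw prediction output, not the PyTorch Forecasting API itself (real weights would come from something like `model.predict(dataloader, mode="raw")`).

```python
import numpy as np

def importance_by_time_step(weights: np.ndarray) -> np.ndarray:
    """Average variable-selection weights over the batch dimension.

    `weights` is assumed to have shape (batch, encoder_length, n_features),
    mimicking the per-time-step encoder variable-selection weights in the
    TFT's raw prediction output. Returns shape (encoder_length, n_features):
    one importance value per feature for each time step before prediction.
    """
    return weights.mean(axis=0)

# Synthetic stand-in for raw TFT output: 8 samples, a 14-day (336-hour)
# hourly encoder window, and 5 features (all numbers are assumptions).
rng = np.random.default_rng(0)
raw = rng.random((8, 336, 5))
raw /= raw.sum(axis=-1, keepdims=True)  # selection weights sum to 1 per step

imp = importance_by_time_step(raw)
print(imp.shape)  # one row per time step, one column per feature
```

Plotting each column of `imp` against the time-step index (oldest on the left, most recent on the right) reproduces the kind of graph shown below.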


Below is an example using a prediction from January in Australia, made with a GPP-TFT model with a 14-day encoder length.

Double-click an item in the legend to isolate that feature from the rest of the graph. Adjust the bars in the timeline below to zoom in or out of the data.


