DEEP GENERATIVE MODELS FOR TIME SERIES COUNTERFACTUAL INFERENCE

by

Guangyu Li

A Dissertation Presented to the
FACULTY OF THE USC GRADUATE SCHOOL
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF PHILOSOPHY
(ELECTRICAL ENGINEERING)

December 2021

Copyright 2021 Guangyu Li

Acknowledgments

First and foremost, I am very grateful to my advisor, Yan Liu. A discussion with Yan in August 2015 brought me onto this journey. Despite a busy schedule, you have always had time to discuss thoughts and ideas, and I feel very fortunate to have been able to learn from you. A special thanks for always encouraging me to pursue my interests, even when they were outside the scope of our original plan.

I have spent most of my time during this project at the Melady Lab at the University of Southern California and the AI Labs at DiDi, both amazing places to meet and learn from great people. Many thanks to my lab mates for many great discussions and, in general, for providing an enjoyable environment. I have been fortunate enough to collaborate with too many great people to mention them all here; a special thanks to Rose Yu for working together on my first research paper at USC, and to Zhengping Che and Bo Jiang for collaborations at DiDi. I hope that we can keep collaborating on new projects in the future.

Lastly, I am grateful for my friends and family. Even though I thoroughly enjoyed my life in Los Angeles, I have always looked forward to coming home and seeing you all in Beijing. To my wife Junyi: you have always been incredibly patient and supportive during our Ph.D. journey. I am looking forward to what the future will bring to us and our new life in Beijing.

Contents

Acknowledgments
List of Tables
List of Figures
Abstract

Part I  Introduction

1 Introduction
   1.1 Background and Motivations
   1.2 Research Problems
   1.3 Mathematical Notations
2 Counterfactual Inference
   2.1 Potential Outcome Framework
   2.2 Counterfactual Inference
      2.2.1 Propensity Score Methods
      2.2.2 Counterfactual Inference Models
   2.3 Counterfactual Inference for Time Series
3 Deep Generative Models
   3.1 Latent Variable Models
      3.1.1 Generative Models
      3.1.2 Inference Models
      3.1.3 Parameter Estimation
   3.2 State Space Models
      3.2.1 Definitions
      3.2.2 Posterior Inference in the Sequential Modeling
         3.2.2.1 Filtering
         3.2.2.2 Smoothing
         3.2.2.3 Prediction
      3.2.3 Types of State-space Models
         3.2.3.1 Discrete-time Models
         3.2.3.2 Continuous-time Models
      3.2.4 Summary and Discussion
   3.3 Deep State Space Models
      3.3.1 Motivations
      3.3.2 Deep State-Space Models
      3.3.3 Connection to System Identification
      3.3.4 Summary and Discussion

Part II  Research Work

4 Research Statement
5 Counterfactual Inference for Time Series with Multi-agent Interactions
   5.1 A Motivating Example
   5.2 Problem Formulation
   5.3 Background and Related Work
   5.4 Generative Attentional Multi-Agent Network
      5.4.1 Generation Network
      5.4.2 Inference Network
      5.4.3 Parameter Learning
   5.5 Experiments
      5.5.1 Datasets
      5.5.2 Experimental Design
      5.5.3 Quantitative Results
      5.5.4 Interaction Analysis
   5.6 Summary
6 Counterfactual Inference for Time Series with Mixed Sampling Rates
   6.1 A Motivating Example
   6.2 Related Work
   6.3 Multi-rate Hierarchical Deep Markov Model
      6.3.1 Generation Model
      6.3.2 Inference Network
      6.3.3 Learning the Parameters
   6.4 Experiments
      6.4.1 Datasets and Experimental Design
      6.4.2 Quantitative Results
      6.4.3 Qualitative Results
      6.4.4 Discussion
   6.5 Summary
7 Continuous-time Time Series Counterfactual Inference with Augmented Differential Equations
   7.1 A Motivating Example
   7.2 Related Work
   7.3 Problem Formulation
   7.4 Augmented Counterfactual Ordinary Differential Equations
      7.4.1 Augmented Time Series
      7.4.2 Latent Counterfactual Differential Equations
   7.5 Experiments
      7.5.1 Tumor Growth Simulation
      7.5.2 Intensive Care of Patients with Sepsis
   7.6 Summary
8 Conclusion
Reference List

List of Tables

4.1 Features supported by existing methods for counterfactual inference
5.1 Statistics of the processed Stanford Drone dataset
5.2 Behavior prediction results on 3 datasets
5.3 Log-likelihood of generative models on 3 datasets
6.1 Forecasting results (MSE) on MIMIC-III
6.2 Forecasting results (MSE) on USHCN
6.3 Interpolation results (MSE) on MIMIC-III and USHCN
7.1 Average RMSE ×10^2 and standard error over 10 runs for the inference of sepsis patients on white blood cell count (WBC), blood pressure (BP), and oxygen saturation (OS)

List of Figures

1.1 A diagram of the data-driven time series counterfactual inference task
2.1 Position of counterfactual inference in the causal inference family
3.1 Data generating process via neural networks (picture credit: OpenAI Inc.)
3.2 A graphical representation of a state-space model
4.1 A diagram of the data-driven time series counterfactual inference task
5.1 Illustration of three multi-agent systems, including agent groups and interaction types
5.2 Generation and inference network of GAMAN for the k-th agent in a K-agent system. Note that we only consider the forward setting for the sake of clarity
5.3 Visualization of learned agent vectors e_t (left column) and interaction vectors s_t (right column) on three datasets, indicating different agent groups and interaction types
6.1 Generation model and structured inference network (with the filtering setting) of our proposed MR-HDMM for MR-MTS. The switches on incoming edges to a node z_t^l are the same, shown as s_t^l in Figure 6.2
6.2 The switch mechanism for updating the latent states z_t^l in MR-HDMM. Left: the switch structure; middle: switch on (s_t^l = 1); right: switch off (s_t^l = 0)
6.3 Interpretable latent space learned by the MR-HDMM model. Top: hierarchical structure captured by the switch states of MR-HDMM in the first 48 hours of an admission in the MIMIC-III dataset. Bottom: hierarchical structure (red and blue blocks) captured along with the precipitation time series (green curve) over a one-year observation in the USHCN dataset
7.1 The ACODE model with CDE-RNN encoder and CDE decoder. The CDE-RNN encoder first runs backwards in time to produce an approximate posterior over the initial latent state q(z_0 | {x_i, t_i}_{i=1}^N, a_{≤t}). Given a sample of z_0 and the intervention process a(t), we can generate the latent state at any point of interest, and further generate augmented time series observations
7.2 Normalized RMSE curve of counterfactual inference for treatment response on tumor growth. Dashed lines represent three baselines, while solid lines represent ACODEs with different numbers of auxiliary variables. Although ACODEs support continuous-time prediction, we evaluate all methods at discrete timestamps for a fair comparison
7.3 t-SNE visualization of the learned auxiliary variable sequence z_t for all three types of patients. As shown in the figure, the learned auxiliary variables cluster into three groups corresponding to the three patient types

Abstract

Decision-makers want to know how to produce desired outcomes and act accordingly, which requires a causal understanding of cause and effect. The last decades have seen an explosion in the availability of time series data and computational resources, which has spurred great interest in data-driven algorithms able to provide counterfactual inference. As Richard Feynman said, "What I cannot create, I do not understand." A classical route to counterfactual inference is therefore to understand the underlying working mechanism of the system. Recently, deep neural networks have been fundamental in pushing the machine learning field forward, with remarkable results in image classification, speech analysis, and machine translation. They also offer great potential for learning the data-generating mechanisms behind time series and thus supporting counterfactual inference.
This thesis explores deep generative models of time series data, with a focus on time series counterfactual inference. Given a system of interest and its time series observations, our goal is to learn the underlying data generating mechanism with deep generative models and predict potential outcomes under various circumstances. One of the most interesting developments in generative modeling is the introduction of models merging the powerful function approximators provided by deep neural networks with the principled probabilistic approach of graphical models. Here, the Variational Autoencoder (VAE) is a seminal contribution to this emerging field. In this thesis, we develop extensions and improvements to the VAE framework to handle more complex time series and perform effective counterfactual inference.

Part I
Introduction

Chapter 1
Introduction

1.1 Background and Motivations

Decision-makers face the challenge of estimating what is likely to happen when they take an action. Such problems are often framed in terms of counterfactual questions, such as "Would this patient have lower blood sugar had she received a different medication?", "Would the user have clicked on this ad had it been in a different color?", or "How would other players react if one football player passed the ball into a certain area?" [40, 69, 78]

As we have entered the era of big data over the past decades, observational time series of system behavior have become increasingly available in many fields. In medicine, electronic health records (EHR) contain data tracking a patient's disease progression over time, together with treatment and other medical interventions. In sports, players' behavior, including their interactions, has been widely recorded for analysis. On mobile phones, it is common for apps to track usage behavior over time.
These data can be valuable resources for learning about the efficacy of interventions, which raises the need for data-driven methods for counterfactual inference. For example, using electronic health record data to study the effects of different drugs and treatments can accelerate drug development and improve treatment plans. Similarly, on web platforms, counterfactual inference based on user behavior data, e.g., visit and click patterns, can provide insight into web design and recommendation systems, leading to better user experiences. Also, in football games, we can leverage rivals' behavior from previous games to make counterfactual inferences about their response to various strategies.

In past years, several studies have sought to use observational data to build generative models for counterfactual inference. Most previous studies focus on the effect of a sequence of discrete-time or continuous-time interventions on a single outcome. In this proposal, however, we are interested in counterfactual inference under temporal settings. Namely, we would like to know how a time series would evolve under the effect of another intervention time series. For any time series, only one potential outcome is observable and the others are counterfactual. Therefore, commonly used supervised learning algorithms and discriminative models are no longer suitable or reliable for counterfactual inference [84]. Instead, the key to reliable time series counterfactual inference is to learn the underlying data generating mechanism of the time series and build generative models of the system. However, approximating underlying data-generating mechanisms from observational time series is quite challenging due to the intrinsic complexity of real-world systems, such as disease progression, web user behavior, football player trajectories, etc.
The challenges are twofold: (1) highly non-linear temporal dynamics, which require flexible and expressive representations; and (2) complicated structures in the time series, e.g., multi-variate time series generated by multiple agents with strong interactions, non-stationary time series with switching dynamics, irregularly-spaced time series, etc.

On the other hand, deep neural networks (DNNs), a family of flexible and expressive mappings, have shown great success in a variety of applications including computer vision, natural language processing, speech recognition, and time series analysis [8, 18]. For generative modeling of time series, deep neural networks provide the potential to tackle the above challenges. First, DNNs can fit non-linear dynamics effectively with their expressive representation power. Second, DNN-based models are usually optimized via back-propagation and gradient-based optimization, which makes it flexible to design network architectures for structured time series. In recent years, several deep generative models have been proposed for time series analysis, such as the variational recurrent neural network (VRNN), the deep Markov model (DMM), and neural ordinary differential equations (Neural ODEs) [8, 12, 18], to name a few.

In this proposal, we explore deep generative models for time series counterfactual inference. Specifically, we build deep state space models (DSSMs) to approximate the underlying data generating mechanism and thus answer the counterfactual question: how would a certain time series evolve under the effect of another intervention time series? Based on intrinsic properties of the intervention time series, we further divide the task into two settings, with (1) continuous-time interventions and (2) discrete-time interventions, respectively. For example, a basketball player's behavior in a game could be viewed as a continuous-time intervention on other players' behavior.
On the other hand, drugs and other treatments could be viewed as discrete-time interventions on patients' disease progression. Note that the sequence of discrete-time interventions could even be irregularly spaced, which brings extra challenges to counterfactual inference.

1.2 Research Problems

In this section, we formally define the problem of time series counterfactual inference. Consider a system with observational time series {X_t : t ∈ [0, τ]} which is affected by another time series of interventions {A_t : t ∈ [0, τ]}. To formalize counterfactual time series, we adopt the potential outcomes framework [79, 84, 90], which uses a collection of random variables X[a] to represent the potential outcome after intervention a. To make counterfactual inference, we are essentially learning the distribution p(X_s[a] : s > t | H_t), where t is the current time at inference and H_t is all the previous information up to time t. Basically, given the previous time series {X_s, A_s : s ≤ t}, we would like to know how the future time series {X_s : s > t} would progress under each possible intervention sequence {A_s : s > t}.

If we could freely experiment by repeatedly taking interventions and recording the effects, it would be straightforward to fit a counterfactual inference model. However, conducting experiments is not possible in most real-world systems. Alternatively, we can leverage historical data, including both the time series X and the corresponding intervention sequence A. Generally speaking, we can only estimate the conditional distribution p({X_s : s > t} | {A_s : s > t}, H_t) from historical data, which is different from the target distribution p(X_s[a] : s > t | H_t). Under two mild assumptions, however, researchers have shown that this conditional distribution is equivalent to the counterfactual model p(X_s[a] : s > t | H_t) [84].

Assumption 1.1 (Consistency).
For any time series X, the realization of intervention a = {A_s : s > t} implies {X_s[a] : s > t} = {X_s : s > t}.

Assumption 1.2 (Stationarity). Each time series's potential outcomes are independent of the actual interventions assigned to other time series.

The consistency assumption indicates that the potential outcomes for any time series do not change no matter which intervention alternative is observed [13, 79]. The stationarity assumption indicates that the intervention effects do not change over time, i.e., the potential outcomes of time series are the same across all time series [80, 81].

Problem (Time Series Counterfactual Inference). Given the previous information, including previous time series observations and possibly interventions, H_t = {X_s, A_s : s ≤ t}, together with a potential intervention sequence {A_s : s > t}, infer the counterfactual distribution

p({X_s : s > t} | {A_s : s > t}, H_t)    (1.1)

To solve this problem, we develop data-driven methods leveraging historical observations, which leads to a machine learning task. Figure 1.1 shows the overall diagram of the task.
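To make the learning task concrete, the sketch below simulates histories {X_s, A_s} from a known toy mechanism and fits the one-step conditional model underlying the distribution in (1.1) by least squares. The linear dynamics, the coefficients 0.8 and 0.5, and all variable names are assumptions for illustration only, not the methods developed in this thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data-generating mechanism (an assumption for illustration):
# X_t = 0.8 * X_{t-1} + 0.5 * A_t + noise
def simulate(a_seq, x0=0.0, noise=0.1):
    xs, x = [], x0
    for a in a_seq:
        x = 0.8 * x + 0.5 * a + noise * rng.standard_normal()
        xs.append(x)
    return np.array(xs)

# "Historical data": 500 observed trajectories with logged interventions.
A = rng.integers(0, 2, size=(500, 20)).astype(float)
X = np.stack([simulate(a) for a in A])

# Fit the one-step conditional model E[X_t | X_{t-1}, A_t] by least squares:
# X[:, t] was generated from X[:, t-1] and A[:, t].
feats = np.c_[X[:, :-1].ravel(), A[:, 1:].ravel(), np.ones(500 * 19)]
target = X[:, 1:].ravel()
coef, *_ = np.linalg.lstsq(feats, target, rcond=None)
print(coef[:2])  # approximately recovers the true dynamics [0.8, 0.5]
```

Rolling the fitted conditional forward from a history under a hypothetical future intervention sequence then yields a point estimate of the counterfactual trajectory; the generative models developed later model the full distribution rather than a point estimate.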
[Figure 1.1: A diagram of the data-driven time series counterfactual inference task. A counterfactual inference model is learned from previously observed data {X_i, A_i}_{i=1}^N and is then used to infer p({X_s : s > t} | {A_s : s > t}, H_t), given the history at time t and a sequence of future interventions (continuous or discrete).]

1.3 Mathematical Notations

We write scalars in lower case, x, vectors in boldface, x, and matrices in capital boldface, X. Functions are denoted by f, g, or h, and sometimes f_θ, g_θ, and h_θ to highlight the fact that these functions are neural networks with parameter set θ. We write the density of a random variable x as either p(x) or q(x). For
distributions depending on learnable parameters θ, we add a subscript, as in p_θ(x). As a special case, we denote the empirical data distribution by p_D(x).

Chapter 2
Counterfactual Inference

2.1 Potential Outcome Framework

Counterfactual inference stems from causal inference, as shown in Figure 2.1. In economics and healthcare [34], many researchers focus on estimating the effects of some intervention or policy. In that literature, the difference between the counterfactual outcomes had an intervention been taken or not been taken is defined as the causal effect of the intervention [69]. Originating from the literature on experimental and observational studies [76, 86], Rubin's potential outcome framework [38, 82] is the most popular language for formalizing counterfactuals and obtaining causal effect estimates [79, 90].

Definition 2.1 (Potential Outcome). For a set of n individuals X_1, X_2, ..., X_n, the potential outcome of individual X_i under intervention a_i is defined as the value the outcome variable Y would take if the intervention had been set to a_i = A_i, and is denoted Y_i.

Estimating the individual intervention effect from observational data is fundamentally impossible, since we can never observe the potential outcomes Y_i under all interventions simultaneously. This is called the fundamental problem of causal inference in the literature [33, 76]. Alternatively, we can estimate the conditional intervention outcome (CIO). Given the feature X and intervention A, the CIO is defined as

Y_{CIO} = E[Y[a] | x = X, a = A]    (2.1)

To obtain consistent treatment effect estimates from observational data, two basic assumptions are essential:

Assumption 2.1 (Consistency). For any individual i, a_i = A_{t ≥ t_c} implies X_{t > t_c}[a] = X_{t > t_c}.

Assumption 2.2 (Stable Intervention Value Assumption). Each individual's potential outcomes are independent of the actual intervention assignment of other individuals.

The consistency assumption indicates that the potential outcomes for any individual do not change no matter which intervention alternative is observed [13, 74, 81, 93]. The stable intervention value assumption indicates no interference among individuals, i.e., the potential outcomes of any individual are unrelated to others' treatment assignment [80, 81].

2.2 Counterfactual Inference

In the last section, we introduced the definition of counterfactual inference. In this section, we review methods for data-driven counterfactual inference, i.e., estimating the intervention outcome from observational data.
[Figure 2.1: Position of counterfactual inference in the causal inference family. Causal inference splits into discovering causal relationships (structural causal models and causal graphs; Granger causality) and estimating causal effects (the potential outcome framework); the latter covers propensity score methods, doubly robust methods, and counterfactual inference methods.]

2.2.1 Propensity Score Methods

The propensity score is defined as the conditional probability of assignment to the intervention given the observed covariates:

propensity score(X) = p(a = A | x = X)    (2.2)

Theoretically, we can represent the difference between the intervention and control groups by directly modeling the assignment mechanism with propensity scores. Many methods have been developed for causal inference using propensity scores, including inverse probability weighting (IPW) [31, 35] and propensity score matching (PSM) [15, 62]. Usually, propensity-score-based counterfactual inference follows a two-step procedure. In the first step, the propensity scores are estimated using machine learning models. Then the counterfactual effects are estimated either by weighting the outcome with the inverse of the estimated propensity score for each individual, or by matching individuals from the control group to individuals from the treated group based on the similarity of their estimated propensity scores.
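The two-step procedure can be sketched on simulated data. Everything below is an illustrative assumption (the confounder structure, the true effect of 2.0, and the logistic propensity model); it shows plain IPW, not any specific method cited above.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Simulated observational data with one confounder x: x raises both the
# chance of treatment and the outcome, so the naive estimate is biased.
x = rng.standard_normal(n)
p_treat = 1.0 / (1.0 + np.exp(-1.5 * x))           # true propensity score
a = (rng.random(n) < p_treat).astype(float)        # intervention assignment
y = 2.0 * a + 1.0 * x + rng.standard_normal(n)     # true effect of a is 2.0

# Naive difference in means, confounded by x.
naive = y[a == 1].mean() - y[a == 0].mean()

# Step 1: estimate the propensity score with logistic regression
# (plain gradient ascent on the log-likelihood).
w, b = 0.0, 0.0
for _ in range(2000):
    e = 1.0 / (1.0 + np.exp(-(w * x + b)))
    w += 0.1 * np.mean((a - e) * x)
    b += 0.1 * np.mean(a - e)
e = 1.0 / (1.0 + np.exp(-(w * x + b)))

# Step 2: weight outcomes by the inverse of the estimated propensity.
ipw = np.mean(a * y / e) - np.mean((1 - a) * y / (1 - e))
print(f"naive: {naive:.2f}   IPW: {ipw:.2f}   (true effect: 2.00)")
```

The naive estimate lands well above 2.0 because treated individuals have larger x, while the IPW estimate approximately recovers the true effect; the sensitivity of step 2 to an incorrect propensity model in step 1 is exactly the misspecification problem discussed next.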
Despite their popularity and theoretical appeal [102], a practical problem of propensity-based methods is that the true propensity scores are intrinsically unknown and must be estimated from finite data in purely observational studies. Research indicates that misspecification of the propensity score model can result in substantial bias in causal effect estimation.

2.2.2 Counterfactual Inference Models

Instead of matching or weighting, we can estimate the counterfactual outcome by directly modeling the conditional distribution p(Y | X, A) or estimating its expectation E[Y | X, A], as long as there are no hidden confounders. In this way, we transform the problem of causal inference into a predictive learning problem: learning the unknown counterfactual outcome models from observational data. Note that this static setting essentially leads to a regression task. Researchers usually call this family of methods counterfactual inference in the literature [30, 40, 84, 103]. Counterfactual inference methods approximate the underlying intervention response functions using conditional mean functions fitted with machine learning algorithms. Among existing methods, the most widely used is the non-parametric Bayesian additive regression tree (BART) model [9], which is good at detecting interactions and discontinuities [102]. The main idea of BART is an ensemble of regression trees:

\hat{Y} = \sum_i g_i(X; T_i, \theta_i) + \epsilon    (2.3)

where T_i corresponds to an individual regression tree. The whole regression model is fitted via Markov chain Monte Carlo (MCMC) sampling. The BART model and its variants have shown competitive performance on counterfactual inference in the static setting, but suffer from low computational efficiency due to MCMC sampling during inference.
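BART itself is fitted with MCMC, but the sum-of-trees form in (2.3) can be illustrated with a simpler, non-Bayesian stand-in: gradient boosting of depth-1 regression trees (stumps). The data, the split grid, the learning rate, and the round count below are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D regression target with a discontinuity -- the kind of structure the
# text says tree ensembles detect well (toy setup, an assumption).
x = rng.uniform(-2, 2, size=1000)
y_true = np.where(x > 0.5, 2.0, 0.0) + 0.3 * x
y = y_true + 0.1 * rng.standard_normal(1000)

splits = np.quantile(x, np.linspace(0.01, 0.99, 99))  # candidate thresholds

def fit_stump(r):
    """Best depth-1 regression tree (stump) on the residuals r."""
    best_sse, best = np.inf, None
    for s in splits:
        mask = x <= s
        left, right = r[mask].mean(), r[~mask].mean()
        sse = ((r - np.where(mask, left, right)) ** 2).sum()
        if sse < best_sse:
            best_sse, best = sse, (s, left, right)
    return best

# Boosting: Y_hat = sum_i g_i(X; T_i, theta_i), each g_i a shrunken stump,
# mirroring the sum-of-trees form of eq. (2.3) without BART's MCMC fit.
lr, stumps, pred = 0.2, [], np.zeros_like(y)
for _ in range(200):
    s, left, right = fit_stump(y - pred)
    stumps.append((s, left, right))
    pred += lr * np.where(x <= s, left, right)

rmse = np.sqrt(np.mean((pred - y_true) ** 2))
print(f"{len(stumps)} stumps, RMSE vs. noiseless target: {rmse:.3f}")
```

The ensemble recovers both the jump at x = 0.5 and the mild trend, far below the ~1.0 RMSE of a constant predictor; BART replaces this greedy fit with posterior sampling over trees, which is what makes it slow at inference time.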
2.3 Counterfactual Inference for Time Series

Although previous methods for counterfactual inference mostly focused on the static setting, several papers study the time series setting. Standard methods for performing counterfactual inference in time series data come from the epidemiology literature and include the g-computation formula and inverse probability of intervention weighting for marginal structural models [20, 73, 74]. Later, [54] improved on the standard marginal structural models by incorporating recurrent neural networks (RNNs) to estimate the propensity weights and intervention responses.

More recently, [57] extended the potential outcomes framework to the continuous-time setting. Several methods have been proposed for estimating counterfactual effects based on Gaussian processes (GPs). For example, [89] focus on how a patient's measurements would react to interventions and introduce a GP together with a linear time-invariant (LTI) impulse-response model for the counterfactual inference task. This method uses the GP to capture the baseline progression of a patient's measurements and the LTI model to capture the response to impulse treatments:

x(t) = \underbrace{f(t)}_{\text{baseline progression (GP)}} + \underbrace{g(t; a)}_{\text{intervention response (LTI)}} + \epsilon    (2.4)

Later, [84] also studied counterfactual inference for treatment response, but treated the time series as a combination of a Gaussian process and a marked point process (MPP). In their method, the MPP captures when an intervention happens, another action model accounts for which intervention happens, and the GP models the outcome effects of the given intervention. Despite their effectiveness in modeling complicated temporal dynamics, Gaussian process based methods have limited scalability due to their O(n^3) computational complexity. In addition, all of the previously mentioned methods assume there are no hidden confounders, i.e.,
all variables affecting the intervention assignments and the potential outcomes are observed, which is not testable in practice and probably untrue in many situations [4].

Chapter 3
Deep Generative Models

3.1 Latent Variable Models

In the following three chapters, we lay the theoretical foundations of the proposal. We start with latent variable models, an important and widely used framework for approximating the data generating mechanism, which is the key to, and the current bottleneck of, realistic simulation. Then, we introduce the integration of neural networks and latent variable models, a.k.a. deep generative models, a recent major leap forward in the development of generative models. In the end, we focus on recent progress in deep generative models for time series simulation.

In probabilistic generative modeling, we are interested in learning a distribution p(x) that assigns high probability to realistic data and low probability to everything else. Given such a model, we can either quantitatively measure the likelihood of observed data x or generate realistic samples from p(x). Further, we would also like to reveal the underlying data generating mechanism and discover more insights about systems of interest.

A common approach is to introduce a set of latent variables z ∼ p(z), where x depends on z through the conditional distribution p(x|z). The distributions p(z) and p(x|z) are usually referred to as the prior and the likelihood, respectively. The distribution p(x) can be represented by marginalizing over the latent variable z:

p(x) = \int p(x, z) \, dz = \int p(x|z) p(z) \, dz    (3.1)

It may seem that introducing the latent variable z adds nothing but extra complexity. However, three benefits come with latent variables:

• Given that modeling p(x) directly is often intractable, latent variables enable us to model the complex marginal distribution p(x) through the more tractable conditional distribution p(x|z) and prior p(z).
• Under the same representational power of parametric distributions, latent variables increase the expressive power of the model. For example, in the case of Gaussian Mixture Models (GMMs), the distributions we can model with a mixture of Gaussian components are much more expressive than what we could model using a single component.

• Latent variables enable us to leverage prior knowledge about the underlying data generating process. The factorization p(x, z) = p(x|z)p(z) describes a process that generates the data through ancestral sampling, i.e. z is sampled from the prior p(z), followed by sampling x from the conditional distribution p(x|z). Therefore, we can incorporate assumptions about the generating process by assigning specific structures to the latent variables z. On the other hand, we can also introduce learnable structures and gain more insight into the system through the learning process.

However, there is no free lunch. Despite all the benefits mentioned above, latent variables bring extra difficulties to the learning process and require specially designed approaches. Our goal is still to fit the marginal distribution p(x) over the observable variables x to that observed in the dataset D. Suppose we have a model for the joint distribution p(x, z; θ), usually through the factorization p(x, z; θ) = p(z; θ) p(x|z; θ). The probability p(x = x̃; θ) of observing a training data point x̃ is

p(x = x̃; θ) = ∫_z p(x = x̃, z; θ) dz    (3.2)

For maximum likelihood learning given a dataset D, one needs to maximize

log Π_{x∈D} p(x; θ) = Σ_{x∈D} log p(x; θ) = Σ_{x∈D} log ∫_z p(x, z; θ) dz    (3.3)

Evaluating log ∫_z p(x, z; θ) dz or its gradients ∇_θ is usually intractable. Therefore, we need approximations. In addition, since each training data point x ∈ D requires one gradient evaluation, the approximation needs to be efficient. One approach is approximation by sampling, and the most straightforward one is naive Monte Carlo sampling.
Basically, we sample latent variables z^(1), z^(2), ..., z^(K) uniformly at random and approximate the expectation with the sample average:

p_θ(x) = Σ_{z∈Z} p_θ(x, z) = |Z| E_{z∼Uniform(Z)}[p_θ(x, z)]    (3.4)

where

E_{z∼Uniform(Z)}[p_θ(x, z)] ≈ (1/K) Σ_{i=1}^K p_θ(x, z^(i))    (3.5)

This works in theory but not in practice. For most z, p_θ(x, z) is very low, while the z with large p_θ(x, z) may never be hit by uniform random sampling. Therefore, we need a clever way to select z to reduce the variance of the estimator. Suppose we sample latent variables z^(1), z^(2), ..., z^(K) from a distribution q(z) and approximate the expectation with the sample average:

p_θ(x) = Σ_{z∈Z} p_θ(x, z) = Σ_{z∈Z} q(z) (p_θ(x, z) / q(z)) = E_{z∼q(z)}[p_θ(x, z) / q(z)]    (3.6)

where

E_{z∼q(z)}[p_θ(x, z) / q(z)] ≈ (1/K) Σ_{i=1}^K p_θ(x, z^(i)) / q(z^(i))    (3.7)

What is a good choice for q(z)? There are several well-developed sampling-based inference algorithms, most of which are instances of Markov chain Monte Carlo (MCMC), including two popular methods: Gibbs sampling and Metropolis-Hastings [18]. Unfortunately, these sampling-based methods have several important shortcomings. First, although sampling-based methods are asymptotically exact, i.e. guaranteed to find a globally optimal solution given enough samples, it is hard to tell how close they are to a good solution within the finite amount of time available in practice. Second, it can be an art in itself to choose an appropriate sampling technique that reaches a good solution quickly, e.g. a good proposal in Metropolis-Hastings. Third, sampling-based methods are hard to parallelize and thus limit the efficient use of the parallel computational hardware widely used for neural network training. In this proposal, we are going to use an alternative approach to learn latent variable models, namely variational inference.
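The variance reduction from a well-chosen proposal q(z) can be illustrated numerically. The sketch below uses a toy two-component mixture (the specific numbers are assumptions for illustration, not a model from this proposal) and compares the naive estimator 3.4-3.5 against the importance-sampling estimator 3.6-3.7:

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mean, std=1.0):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2.0 * np.pi))

# Toy model: discrete latent z in {0, 1} with prior p(z=1) = 0.01,
# likelihood p(x|z) = N(x; mu_z, 1).  The observation is explained
# almost entirely by the rare component z = 1.
prior = np.array([0.99, 0.01])
mu = np.array([0.0, 10.0])
x = 10.0
exact = np.sum(prior * normal_pdf(x, mu))   # exact marginal p(x)

K = 1000
# Naive Monte Carlo (3.4-3.5): z uniform, estimator |Z| * average of p(x, z)
z_u = rng.integers(0, 2, size=K)
naive_terms = 2 * prior[z_u] * normal_pdf(x, mu[z_u])
naive = naive_terms.mean()

# Importance sampling (3.6-3.7): z from a proposal q(z) close to the posterior
q = np.array([0.01, 0.99])
z_q = rng.choice(2, size=K, p=q)
weights = prior[z_q] * normal_pdf(x, mu[z_q]) / q[z_q]
importance = weights.mean()
```

Both estimators are unbiased, but the per-sample importance weights have a far smaller spread than the naive terms, so with the same budget K the importance-sampling estimate of p(x) is much more accurate.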
Let us take a closer look at the likelihood function of the observed data:

log Σ_{z∈Z} p_θ(x, z) = log Σ_{z∈Z} q(z) (p_θ(x, z) / q(z)) = log E_{z∼q(z)}[p_θ(x, z) / q(z)]    (3.8)

Since log(·) is a concave function, by Jensen's inequality we have

log(E_{z∼q(z)}[f(z)]) = log Σ_z q(z) f(z) ≥ Σ_z q(z) log f(z)    (3.9)

Choosing f(z) = p_θ(x, z) / q(z), we reach a lower bound on the likelihood function, a.k.a. the Evidence Lower Bound (ELBO):

log E_{z∼q(z)}[p_θ(x, z) / q(z)] ≥ E_{z∼q(z)}[log (p_θ(x, z) / q(z))]  (the ELBO)   (3.10)

Jensen's Inequality. If λ_1, ..., λ_n are positive numbers with Σ_{i=1}^n λ_i = 1, then for a real continuous function f: if f is convex, f(Σ_{i=1}^n λ_i x_i) ≤ Σ_{i=1}^n λ_i f(x_i); if f is concave, f(Σ_{i=1}^n λ_i x_i) ≥ Σ_{i=1}^n λ_i f(x_i).

We have shown that this lower bound holds for any choice of q(z). In this proposal, q(z) is usually parametrized by φ, i.e. q(z; φ) or q_φ(z). Next, we show how tight the bound is. It turns out that the gap between the ELBO and the marginal likelihood log p(x; θ) is the Kullback-Leibler (KL) divergence D_KL(q(z; φ) || p(z|x; θ)):

log p(x; θ) = Σ_z q(z; φ) log (p(x, z; θ) / q(z; φ)) + [ log p(x; θ) − Σ_z q(z; φ) log (p(x, z; θ) / q(z; φ)) ]    (3.11)
= Σ_z q(z; φ) log (p(x, z; θ) / q(z; φ)) + D_KL(q(z; φ) || p(z|x; θ))    (3.12)
≥ Σ_z q(z; φ) log (p(x, z; θ) / q(z; φ))  (the ELBO)   (3.13)

where the equality holds when

q(z; φ) = p(z|x; θ)    (3.14)

To achieve a tight bound, we need to minimize the KL divergence D_KL(q(z; φ) || p(z|x; θ)), which is equivalent to maximizing the ELBO, given that the marginal likelihood log p(x; θ) does not depend on the latent variables z:

arg min_{q(z;φ)} D_KL(q(z; φ) || p(z|x; θ)) = arg max_{q(z;φ)} Σ_z q(z; φ) log (p(x, z; θ) / q(z; φ))    (3.15)

Divergence between Probability Densities. Given two probability densities q(z) and p(z) in P, a divergence is a function D[·||·] : P × P → R such that for all (q(z), p(z)) ∈ P, D[q(z)||p(z)] ≥ 0, and D[q(z)||p(z)] = 0 iff q(z) = p(z).
Note that this definition neither needs to satisfy the triangle inequality nor requires symmetry, which makes it weaker than the definition of a distance function.

Kullback-Leibler Divergence. For continuous densities, the Kullback-Leibler divergence from q(z) to p(z) is defined as

D_KL[q(z) || p(z)] = E_{q(z)}[log (q(z) / p(z))] = ∫_z q(z) log (q(z) / p(z)) dz

The KL divergence does not satisfy the triangle inequality and is not symmetric. One interpretation from information theory is that D_KL[q(z) || p(z)] quantifies how many extra bits a source must transmit to a receiver to communicate samples from the density q, assuming p is known at both ends.

Recall the two major functions of latent variable based probabilistic generative models: (1) generating realistic samples from p(x; θ); (2) inferring the latent variables z given observations x. The first requires learning θ based on the marginal likelihood p(x; θ) over the data D, while the second requires learning the posterior distribution p(z|x). Both are intractable inference problems. However, we can achieve both by maximizing the ELBO w.r.t. the parameter set (θ, φ), a much simpler optimization problem. Here we rearrange the decomposition above and formally define the ELBO as F(θ, φ):

F(θ, φ) = log p(x; θ) − D_KL(q(z; φ) || p(z|x; θ))    (3.16)

which highlights that maximizing the ELBO pushes the marginal likelihood log p(x; θ) up while decreasing the KL divergence between the approximate posterior q(z; φ) and the true posterior p(z|x; θ).

In practice, we work with data samples from the empirical distribution p_D(x). Notice that maximizing the empirical likelihood w.r.t. θ is equivalent to minimizing the KL divergence between the empirical distribution p_D(x) and p_θ(x) w.r.t. θ:
lim_{N→∞} arg max_θ (1/N) Σ_{i=1}^N log p_θ(x_i) = arg max_θ ∫_x p_D(x) log p_θ(x) dx    (3.17)
= arg max_θ [ ∫_x p_D(x) log p_θ(x) dx − ∫_x p_D(x) log p_D(x) dx ]    (3.18)
(the second term does not depend on θ)    (3.19)
= arg min_θ D_KL(p_D(x) || p_θ(x))    (3.20)

Therefore, maximizing the ELBO w.r.t. (θ, φ) is equivalent to minimizing two KL divergences:

arg min_{θ,φ} D_KL(p_D(x) || p_θ(x)) + E_{p_D(x)}[ D_KL(q_φ(z|x) || p_θ(z|x)) ]    (3.21)

The Expectation-Maximization (EM) algorithm is an important and widely used algorithm for learning latent variable models p(x, z; θ) with parameters θ and latent variables z. EM follows an iterative two-step strategy:

• E-step: given the current parameter estimate θ_t, compute the posterior p(z|x; θ_t) and form the expected log-likelihood E_{z∼p(z|x;θ_t)}[log p(x, z; θ)].

• M-step: based on the E-step, find a new θ_{t+1} by maximizing this expected log-likelihood w.r.t. θ.

It turns out that the EM algorithm shares the same principle as variational inference, i.e. maximizing the evidence lower bound (ELBO). The EM algorithm can be seen as iteratively optimizing the ELBO over the approximate distribution q (at the E-step) and over θ (at the M-step). Starting at a given θ_t, at the E-step we compute the posterior p(z|x; θ_t) and set q = p(z|x; θ_t), which makes the ELBO tight:

log p(x; θ_t) = E_{p(z|x;θ_t)}[log p(x, z; θ_t)] − E_{p(z|x;θ_t)}[log p(z|x; θ_t)]

Next, we optimize the ELBO over θ with q fixed, which is exactly the optimization problem solved at the M-step of EM:

θ_{t+1} = arg max_θ { E_{p(z|x;θ_t)}[log p(x, z; θ)] − E_{p(z|x;θ_t)}[log p(z|x; θ_t)] }

Updating θ by solving this problem increases the ELBO. However, the ELBO evaluated at the new θ is no longer tight with q fixed to p(z|x; θ_t). Therefore, we repeat the above procedure to keep increasing log p(x; θ_t).

Figure 3.1: Data generating process via neural networks. (Picture credit: OpenAI Inc.)

Generative models are one of the most promising approaches to analyzing and understanding the underlying generating mechanism behind data.
In real-world applications, data distributions p(x) are usually too complicated to be represented explicitly and analytically. On the other hand, deep neural networks have shown great success in representing complicated mappings. It is therefore natural to bring the expressive power of neural networks into latent variable models, leading to deep generative models.

Mathematically, we think of a 2-dimensional dataset of examples x_1, ..., x_n as samples from the true data distribution p_D(x). As shown in Figure 3.1, the blue region represents the part of the space whose probability exceeds a certain threshold under the true data distribution, and the black dots indicate observed data points. To recover the true data distribution, we define a generated distribution p_θ(x) implicitly by taking samples from a unit Gaussian distribution and mapping them through a deterministic neural network, i.e. the generation network. This network is a function with parameters θ; by tweaking θ, we can manipulate the generated distribution. Our goal is then to find parameters θ that produce a distribution p_θ(x) that closely matches the true data distribution p_D(x), e.g. by minimizing the KL divergence as mentioned in the last section. Imagine that, starting from random, the generated distribution evolves towards the true data distribution during the training process. Most deep generative models follow this basic setup but differ in the details. Here are two popular families of deep generative models:

• Generative Adversarial Networks (GANs) [24] pose the training process as a game between two separate networks: a generator network and a discriminator network that tries to classify samples as either coming from the true data distribution p_D(x) or the generated distribution p_θ(x). Every time the discriminator notices a difference between the two distributions, the generator network adjusts its parameters to eliminate such differences.
Training stops when the generator network exactly reproduces the true data distribution and the discriminator can no longer tell the difference, i.e. it is reduced to random guessing.

• Variational Autoencoders (VAEs) [46, 71] formalize the problem in the framework of probabilistic graphical models and maximize a lower bound on the log-likelihood of the data, i.e. the evidence lower bound (ELBO). A key innovation in VAEs is to amortize posterior inference over data points using an approximate posterior distribution q_φ(z|x), also parameterized using deep neural networks.

Both approaches have their pros and cons. Compared with Variational Autoencoders, GANs tend to generate more realistic data in practice, i.e. sharper images, but are more difficult to optimize due to unstable training dynamics. On the other hand, Variational Autoencoders provide both a generation network and an inference network, which allows us to perform both learning and efficient Bayesian inference in sophisticated probabilistic graphical models with latent variables, i.e. to infer latent variables from observations. Since our research focus is to investigate the underlying data generating mechanism of structured dynamic systems, we adopt Variational Autoencoders as the base setup in this proposal. In the remainder of this section, we introduce the details of VAEs, including the generative model, the inference model, and parameter estimation.

3.1.1 Generative Models

The generative model in a vanilla VAE is similar to the joint probability introduced in Section 3.1, p_θ(x, z) = p_θ(x|z) p_θ(z), as illustrated in Figure ??(a). Here z is a latent variable with d dimensions, typically with a fully factorized isotropic Gaussian prior, i.e. p_θ(z) = N(0, I). The likelihood p_θ(x|z) can be any parametric distribution whose parameters are functions of z computed via deep neural networks. The choice of the likelihood distribution depends on the data type in specific applications, e.g.
a Gaussian likelihood for uni-modal continuous observations, a mixture-of-Gaussians likelihood for multi-modal continuous observations, and a multinomial distribution for discrete observations. Here we use a Gaussian likelihood as an example:

p_θ(x|z) = N(μ, σ²I)    (3.22)
μ = f^μ_θ(z)    (3.23)
log σ = f^σ_θ(z)    (3.24)

where x has dimension d and both μ and σ are R^d outputs of the neural network mappings f^μ_θ and f^σ_θ. In this case, the model parameters θ are the parameters of the two neural networks.

3.1.2 Inference Models

Variational Autoencoders follow the same principle as variational inference, i.e. construct an approximate posterior distribution q_φ(z) and maximize the ELBO F(θ, φ) = log p(x; θ) − D_KL(q(z; φ) || p(z|x; θ)). In classical mean-field variational inference [43], the variational approximation is a fully factorized parametric distribution fitted separately for each data point, i.e. individual variational parameters φ = {φ_i}_{i=1}^N are fitted for the data points x = {x_i}_{i=1}^N. However, this approach does not scale well to large-scale datasets when there is no analytic solution for each individual optimization problem. Instead, VAEs parameterize the variational approximations as a function of the data x via neural networks shared across all data samples. For example, in the case of Gaussian latent variables, a common parameterization is

q_φ(z) = N(μ, σ²I)    (3.25)
μ = g^μ_φ(x)    (3.26)
log σ = g^σ_φ(x)    (3.27)

where both μ and σ have the same dimension as the latent variable z. Similar to the generative model, the variational parameters φ are the parameters of two neural networks g^μ_φ and g^σ_φ shared across all data points. This approach is known as amortized variational inference, since we optimize a number of shared variational parameters [46]. Essentially, such an approach turns the individual optimization tasks into predictions, i.e. it predicts the variational parameters from x_i via the neural network mappings g^μ_φ and g^σ_φ.
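To make the amortization concrete, the sketch below implements the parameterization 3.25-3.27 with hypothetical one-hidden-layer networks (an assumption for illustration, not the architecture used in this proposal): a single set of weights φ maps any data point x to the mean and log standard deviation of q_φ(z|x), from which a posterior sample is drawn as z = μ + σ · ε:

```python
import numpy as np

rng = np.random.default_rng(0)
x_dim, h_dim, z_dim = 4, 16, 2

# Hypothetical one-hidden-layer inference networks g^mu_phi and g^sigma_phi.
# The shared variational parameters phi are just the weights below.
phi = {
    "W1": rng.normal(0.0, 0.1, (h_dim, x_dim)), "b1": np.zeros(h_dim),
    "Wm": rng.normal(0.0, 0.1, (z_dim, h_dim)), "bm": np.zeros(z_dim),
    "Ws": rng.normal(0.0, 0.1, (z_dim, h_dim)), "bs": np.zeros(z_dim),
}

def encode(x, phi):
    """Amortized inference (3.25-3.27): predict mu and log sigma of q(z|x)."""
    h = np.tanh(phi["W1"] @ x + phi["b1"])
    return phi["Wm"] @ h + phi["bm"], phi["Ws"] @ h + phi["bs"]

def sample_posterior(x, phi, rng):
    """Draw z ~ N(mu, sigma^2 I) as z = mu + sigma * eps, eps ~ N(0, I)."""
    mu, log_sigma = encode(x, phi)
    return mu + np.exp(log_sigma) * rng.standard_normal(z_dim)

x = rng.standard_normal(x_dim)
z = sample_posterior(x, phi, rng)  # one posterior sample for this data point
```

Instead of solving an optimization problem per data point, inference for a new x is a single deterministic forward pass through `encode`; this is exactly what "turning optimization into prediction" means here.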
Despite the benefits mentioned above, amortized variational inference suffers from an extra amortization gap, i.e. the difference between the best possible variational approximation and the actual amortized variational approximation. If we define the optimal approximate posterior within a variational family Q that maximizes the ELBO as q*(z) = arg max_{q∈Q} F(q), then the total inference gap can be divided into an approximation gap and an amortization gap:

log p(x) − F(q) = [ log p(x) − F(q*) ] + [ F(q*) − F(q) ]    (3.28)

where the first bracket is the approximation gap and the second is the amortization gap, i.e.

log p(x) − F(q) = D_KL[ q*(z|x) || p(z|x) ]    (3.29)
+ D_KL[ q_φ(z|x) || p(z|x) ] − D_KL[ q*(z|x) || p(z|x) ]    (3.30)

Many efforts have been made to reduce the amortization gap by introducing more expressive posterior distributions, including auxiliary variables, hierarchical structures, and flow-based methods.

3.1.3 Parameter Estimation

Since VAEs parameterize both the generative model p_θ(x|z) and the variational approximation q_φ(z|x) with neural networks, we end up with the following optimization problem:

F(θ, φ) = E_{q_φ}[log p_θ(x|z)] − D_KL(q_φ(z|x) || p_θ(z))    (3.31)

where the first term is the reconstruction term and the second is the regularization term. The reconstruction term encourages the likelihood p_θ(x|z) to reconstruct the observed data, and thus captures the data generating process from the latent variables z to the observations x. The regularization term encourages the variational posterior q_φ(z|x) to approach the prior p_θ(z), and thus prevents the latent variables from directly mimicking the data distribution p_D(x). Based on recent developments in optimizing neural network based models, we assume the parameters are updated via gradient-based algorithms. The key challenge is to obtain low-variance gradient estimators w.r.t. both θ and φ. For the generation network, we can easily compute gradients w.r.t. θ using standard back-propagation algorithms:
∇_θ F(θ, φ) = ∇_θ ∫_z q_φ(z|x) log (p_θ(x, z) / q_φ(z|x)) dz    (3.32)
= ∫_z q_φ(z|x) ∇_θ log p_θ(x, z) dz    (3.33)
= E_{q_φ(z|x)}[ ∇_θ log p_θ(x, z) ]    (3.34)

However, estimating the gradients w.r.t. φ is more complicated due to the dependency on the generative model through z:

∇_φ F(θ, φ) = ∇_φ ∫_z q_φ(z|x) log p_θ(x, z) dz − ∇_φ ∫_z q_φ(z|x) log q_φ(z|x) dz    (3.35)
= ∫_z ∇_φ q_φ(z|x) [ log p_θ(x, z) − log q_φ(z|x) ] dz    (3.36)
= E_{q_φ(z|x)}[ log (p_θ(x, z) / q_φ(z|x)) ∇_φ log q_φ(z|x) ]    (3.37)

Unfortunately, this estimator has high variance due to the scaling term inside the expectation. Instead, we leverage the reparameterization trick [71] to obtain an unbiased low-variance estimator of the gradients w.r.t. φ and back-propagate gradients through the latent random variable z. Basically, we reparameterize q_φ(z|x) using a deterministic differentiable transformation g such that z = g(ψ, ε) with ε ∼ p(ε), where ψ denotes the distribution parameters produced by the inference network. For example, we set z = g(ψ, ε) = μ + σ ⊙ ε when q_φ(z|x) follows a Gaussian distribution. The low-variance estimator for both θ and φ is

∇_{θ,φ} F(θ, φ) = ∇_{θ,φ} E_{q_φ(z|x)}[ log (p_θ(x, z) / q_φ(z|x)) ]    (3.38)
= ∇_{θ,φ} E_{p(ε)}[ log (p_θ(x, g(ψ, ε)) / q_φ(g(ψ, ε)|x)) ]    (3.39)
= E_{p(ε)}[ ∇_{θ,φ} log (p_θ(x, g(ψ, ε)) / q_φ(g(ψ, ε)|x)) ]    (3.40)

Compared with the previous estimator, the derivative can be pushed inside the expectation thanks to the reparameterization. The gradients no longer depend on the stochastic sampling, and we can directly back-propagate through the generative and inference models to obtain the derivatives w.r.t. θ and φ. The expectation can then be approximated via Monte Carlo sampling. In addition, for certain distributions, e.g. the Gaussian distribution, the KL divergence term even has an analytic solution.

3.2 State Space Models

State-space models (SSMs) provide a general and flexible methodology for sequential data modeling. They were first introduced in the 1960s, with the seminal work of [44], and were soon used in the Apollo Project to estimate the trajectory of the spaceships that were bringing men to the moon.
Since then, they have become a standard tool for time series analysis in many areas well beyond aerospace engineering. In the machine learning community in particular, they are used as generative models for sequential data, predictive modeling, state inference, and representation learning. An excellent treatment of state-space modeling for time series analysis can be found in the book by [18].

[Figure: the state space assumption relating latent states to observations (time series); left panel, discrete-time model with initial state z_0, transition function z_{t+1} = f(z_t), and emission function x_t = g(z_t); right panel, continuous-time model with initial state z_0, ODE dz/dt = f(z), and emission function x_t = g(z_t).]

Figure 3.2: A graphical representation of a state-space model.

3.2.1 Definitions

We are given a sequence of T observations x_{1:T} = [x_1, ..., x_T] that possibly depend on some inputs u_{1:T} = [u_1, ..., u_T], and we are interested in modeling the distribution p_θ(x_{1:T}|u_{1:T}). This is a very general formulation that can be applied in a wide variety of applications. We may want to model, for example, how the movements of the steering wheel and the brake/throttle pedals (the inputs/controls of the model) change the position of a car (the observations/outputs). Using u_t = x_{t−1}, it is also possible to define autoregressive models, as typically used in the deep learning community when building generative models for text, videos, or speech. In an SSM we introduce at each time step a state variable z_t that summarizes all the information coming from the past and determines the present and future evolution of the system.
SSMs can then be seen as a temporal extension of the latent variable models introduced in Section 3.1, in which the prior over the latent variable z_t at each time step varies over time, as it depends on the previous state z_{t−1} and possibly some inputs u_t to the model. We assume that the joint distribution of observations and states given the inputs factorizes as

p_θ(x_{1:T}, z_{1:T}|u_{1:T}) = p_θ(x_{1:T}|z_{1:T}) p_θ(z_{1:T}|u_{1:T})    (3.41)
= Π_{t=1}^T p_θ(x_t|z_t) · p_θ(z_1) Π_{t=2}^T p_θ(z_t|z_{t−1}, u_t)

A graphical representation of the distribution in 3.41 can be found in Figure 3.2. The emission distribution p_θ(x_t|z_t) specifies how the observation x_t depends on the latent state z_t, and can therefore be seen as the likelihood in a latent variable model. p_θ(z_t|z_{t−1}, u_t) is called the transition distribution, and represents the prior distribution of the state at each time step given the previous state and the current input to the model. This distribution fully determines the temporal evolution of the system. The states of the SSM form a Markov chain that captures the temporal correlations and long-term dependencies between observations at different time steps. Using the d-separation properties of the graphical model in Figure 3.2, we can see that this Markovian structure leads to some interesting conditional independence properties that are implicitly assumed in an SSM:

• p_θ(x_t|z_{1:t}, x_{1:t−1}, u_{1:t}) = p_θ(x_t|z_t). This property implies that, given the present state z_t, the observation at time t does not depend on the past states, inputs, and outputs of the model. As in a latent variable model, the observation x_t is then fully determined by the latent state z_t.

• p_θ(z_t|z_{1:t−1}, x_{1:t−1}, u_{1:t}) = p_θ(z_t|z_{t−1}, u_t). Conditioned on z_{t−1}, the current state z_t does not depend on the previous states z_{1:t−2}, nor on the past inputs or outputs. z_{t−1} then captures all the relevant information about the past.
• p_θ(z_t|z_{t+1:T}, x_{t+1:T}, u_{t+1:T}) = p_θ(z_t|z_{t+1}). Given the next state z_{t+1}, z_t does not depend on the future states, inputs, and outputs, i.e. z_{t+1} captures all the relevant information about the future.

As we will see in the following, these conditional independence relationships are responsible for many of the nice properties of SSMs. Similarly to the LVM in Section 3.1, the marginal distribution over the observations can be obtained by integrating out the states in 3.41, i.e.

p_θ(x_{1:T}|u_{1:T}) = ∫ p_θ(x_{1:T}, z_{1:T}|u_{1:T}) dz_{1:T}    (3.42)

Here we have assumed that z_{1:T} are continuous variables, but the same ideas apply to the discrete case by replacing the integral with a summation. Using Bayes' rule, we obtain the posterior distribution of the states given the data:

p_θ(z_{1:T}|x_{1:T}, u_{1:T}) = p_θ(x_{1:T}|z_{1:T}) p_θ(z_{1:T}|u_{1:T}) / p_θ(x_{1:T}|u_{1:T})    (3.43)

In some cases, we know the exact form of the emission and transition distributions, and we are only interested in inferring the latent states for a given sequence, e.g. in 3D object tracking. SSMs can also be used as black-box methods for sequential data modeling, in which case the emission and transition distributions have a flexible structure that can be learned from the data. In other cases, we can use prior information about the task at hand to define a specific parametric form for the emission and transition distributions that helps the model learn meaningful and interpretable latent representations.

3.2.2 Posterior Inference in Sequential Modeling

In 3.43 we have expressed the posterior distribution of the latent states given the whole sequence. However, if we take into account the temporal structure of the data, there are also other types of inference we may be interested in. To illustrate this, consider the simplified speech recognition example of trying to understand what a friend is saying in a very noisy bar.
The observation x_t represents the noisy speech waveform at each time step, while z_t is the discrete variable that represents the corresponding word pronounced by the friend. In this example there are no inputs u_t to the model, and we therefore remove them from the equations. At any point in time, we of course want to infer the word that the friend is saying, i.e. compute the posterior distribution p_θ(z_t|x_t). As the bar is noisy, however, we may not be sure which word was pronounced. At time t we also know what the friend said in x_{1:t−1}, and we can therefore condition on the past observations as well, i.e. compute p_θ(z_t|x_{1:t}) instead. The knowledge of what the friend was talking about at previous time steps provides some context and helps us better infer z_t. We call this task filtering, as it reduces the noise compared to only using the present observation x_t during inference. Despite this, due to the noise in the bar, we may still be unsure about the inferred word. In this case, we can hope that, as we keep listening to the friend, some clue will clarify the inferred state z_t; we are then also using knowledge of the future during inference, i.e. we are considering the smoothed posterior p_θ(z_t|x_{1:T}). Finally, we may also want to predict what the friend will say in the future given what was said until now, i.e. compute p_θ(z_{t+k}|x_{1:t}). We now provide a more general description of these inference tasks:

3.2.2.1 Filtering

We want to compute the filtered posterior distribution of the state z_t given present and past input and output information, i.e. p_θ(z_t|x_{1:t}, u_{1:t}). This task is particularly interesting in an online setting, as it allows us to compute the state estimate as the data comes in.

3.2.2.2 Smoothing

When doing smoothing, we compute the posterior p_θ(z_t|x_{1:T}, u_{1:T}), conditioned not only on past and present information but also on future information.
Since the smoothed posterior requires knowledge of the whole sequence, it can only be computed offline. A trade-off between filtering and smoothing is fixed-lag smoothing, where we compute the smoothed posterior conditioning only on data up to k time steps in the future (and not on the whole sequence), i.e. we compute p_θ(z_t|x_{1:t+k}, u_{1:t+k}). Fixed-lag smoothing can be used to further improve state estimation in an online setting whenever a delay of k time steps is admissible.

3.2.2.3 Prediction

We may also be interested in predicting the state of the system k steps into the future given only past information, i.e. computing p_θ(z_{t+k}|x_{1:t}, u_{1:t+k}) (notice that if the inputs u_t are present, they need to be known up to time t + k).

3.2.3 Types of State-space Models

Depending on the way of representing temporal dynamics, we can categorize state-space models into two classes: discrete-time models and continuous-time models.

3.2.3.1 Discrete-time Models

Discrete-time state-space models represent temporal dynamics with equally spaced discrete time steps. Given that data is observed at a fixed frequency in many real-world applications, discrete-time models provide a good trade-off between representational power and model complexity. We briefly present several classic discrete-time state-space models in the following.

Linear Gaussian state-space models. The simplest class of SSMs is that of linear Gaussian state-space models (LGSSMs), first introduced in [44]. As suggested by the name, both the transition and emission distributions are Gaussian, and all relationships between states and observations are linear. This makes posterior inference in this model analytically tractable. Despite its simplicity, the LGSSM can be seen as a generalization of many classical models used in time series analysis. As shown in [18], the widely used autoregressive integrated moving average (ARIMA) model can be expressed in state-space form.
The LGSSM can also model trends, seasonal components, explanatory variables, and interventions in a unified framework.

An LGSSM is typically written in terms of two equations that specify the relationship between the latent states at consecutive time steps and the observations:

$z_t = A_t z_{t-1} + B_t u_t + \epsilon_t$  (3.44)
$x_t = C_t z_t + \delta_t$  (3.45)

The transition model (3.44) describes how to compute the state $z_t$ at each time step given the previous state $z_{t-1}$ and the current input $u_t$. $A_t$ and $B_t$ are the transition and control matrices respectively, and define a linear relationship between the variables. The transitions are perturbed by a Gaussian process/transition noise $\epsilon_t$. In an LGSSM we do not observe the state, but only a linearly transformed version of it with additive Gaussian noise, as specified by the emission model (3.45). The emission model can be seen as a linear regression model with time-varying inputs $z_t$. In (3.45), $C_t$ is called the emission matrix, and the measurement/observation noise $\delta_t$ is a Gaussian random variable.

Hidden Markov models. A hidden Markov model (HMM) is an SSM with discrete latent states [70]. As for the LGSSM, for an HMM we can perform exact posterior inference. Notice that HMMs were developed in parallel to LGSSMs, therefore some authors prefer to reserve the term "state-space models" for models with continuous states, and use the term "hidden Markov models" when dealing with discrete states. Here, however, we prefer to consider HMMs as SSMs, as they satisfy all the assumptions we made in Section 3.2.1.

Non-linear non-Gaussian state-space models. The linear-Gaussian assumptions of an LGSSM are often too restrictive for many applications. If we relax them, however, we introduce an additional challenge, since inference becomes intractable and we need to resort to approximate methods. As we will see in Section 3.3, a flexible class of non-linear state-space models is given by deep state-space models (DSSM).
In a DSSM, the transition and emission distributions are parameterized with deep neural networks, and efficient training can be achieved with amortized variational inference, computing the required gradients with the back-propagation algorithm. For small data sets, it is also common to model the transitions and emissions with Gaussian processes.

3.2.3.2 Continuous-time Models

SSMs can also be used to model continuous-time systems, in which the state is used to represent the dynamics of higher-order linear systems as a first-order differential equation, including ordinary differential equations (ODEs), partial differential equations (PDEs), or other types of equations (e.g., integro-differential or delay equations). Compared with discrete-time models with a fixed time step, continuous-time models try to represent the dynamical system in a more principled way. Typical questions include: What are the equilibrium or time-periodic solutions? Are these solutions stable? What is the long-time asymptotic behavior of general solutions? Do solutions behave chaotically? What kinds of statistical regularities do solutions possess? How do the qualitative dynamics change as any parameters on which the system depends vary? Do equilibria lose stability, do time-periodic solutions appear, or does chaotic behavior arise?

Similar to the discrete-time case, continuous-time models also come in linear and non-linear versions. The most general continuous-time linear dynamical system has the form:

$\dot{z}(t) = A(t)z(t) + B(t)u(t) + \epsilon(t)$  (3.46)
$x(t) = C(t)z(t) + \delta(t)$  (3.47)

where the temporal dynamics of the system are represented by the differential equation (3.46). In most real-world applications we will typically deal with the time-invariant case, known as linear time-invariant (LTI) state dynamics, which assumes that $A$, $B$, and $C$ are constant and do not depend on time $t$.

3.2.4 Summary and Discussion

In this chapter, we have introduced state-space models as an extension of LVMs that are suitable for sequential data.
The Markovian structure of SSMs introduces conditional independence properties between the latent variables that can be exploited during inference (filtering, smoothing, and prediction). Non-linear and non-Gaussian SSMs can be used to model more complex sequences, but require approximate inference procedures such as the extended Kalman filter (EKF), the unscented Kalman filter (UKF), or particle filters.

The main focus of this proposal is building non-linear models that can learn the complex temporal dynamics of high-dimensional and structured dynamic systems from large unlabelled datasets. Traditional state-space models, however, are not powerful and/or scalable enough for such applications. In the next chapter, we will therefore introduce a general class of models that use SSMs with highly non-linear transition functions / differential equations and emission functions parameterized by deep neural networks. These models are broadly applicable and can be trained efficiently with the amortized variational inference ideas presented for VAEs in Section 3.1.2.

3.3 Deep State Space Models

3.3.1 Motivations

The main focus of this proposal is unsupervised learning of generative models for structured dynamic systems. We may be interested, for example, in learning generative models for human driving behavior from dash camera videos, or in using the data stored in electronic health records (EHRs) to learn a patient representation given the information collected during many different visits. These applications are characterized by:

• Complex and high-dimensional temporal distributions. We need to model the high-dimensional observations at each time step; capture long-term temporal dependencies in the data and memorize relevant information; and model the uncertainty and variability in the data and properly propagate them over time.

• Large-scale datasets.
To learn such complex distributions, which may depend on hundreds of thousands of parameters, we will use very large datasets. We therefore need scalable models and training procedures.

We will solve these tasks by combining ideas from three closely related classes of models. First, as we have seen in Chapter 3.1, we can use VAEs to model complex high-dimensional observations by introducing latent variables and using neural networks to parameterize flexible conditional distributions. VAEs allow training with stochastic back-propagation and inference networks in a very scalable way. Secondly, we can use recurrent neural networks (RNNs) to model long-term dependencies in the data through their parametric memory cells. Finally, the same ideas that led to developing VAEs as deep LVMs can be applied to construct deep SSMs: flexible and scalable models for temporal data that offer a principled way to model uncertainty in the latent representation. All these models use neural networks as their main building block, and we will therefore be able to define very expressive and flexible architectures that can be trained in a similar way using stochastic back-propagation and are simple to implement using existing deep learning libraries. Due to their expressiveness, it is not always easy to fully exploit the modeling power of these architectures.

3.3.2 Deep State-Space Models

In this section we introduce a broad class of non-linear SSMs with Gaussian transitions that, similarly to VAEs, use deep neural networks to define flexible transition and emission distributions [49]. For simplicity of exposition, we will refer to them as deep state-space models (DSSM). In a DSSM the transition distribution is a Gaussian, i.e.
$p_\theta(z_t \mid z_{t-1}, u_t) = \mathcal{N}(z_t; \mu_t, \Sigma_t)$, whose mean and diagonal covariance matrix are a function of the previous latent state $z_{t-1}$ and the current input $u_t$ through two deep neural networks:

$\mu_t = \mathrm{NN}_1(z_{t-1}, u_t), \qquad \log \Sigma_t = \mathrm{NN}_2(z_{t-1}, u_t)$  (3.48)

If scalability is not an issue, to make the model more general it is also possible to use a Gaussian with a full covariance matrix; see [71] for a discussion of possible Gaussian covariance parameterizations. As for VAEs, depending on the type of observations, the emission distribution $p_\theta(x_t \mid z_t)$ is typically chosen to be either a Gaussian distribution (real-valued data) or a Bernoulli distribution (binary data). The parameters of both distributions are computed with deep neural networks with input $z_t$.

The exact parametrization of the transition and emission probabilities is problem-dependent. For the transitions, the simplest parameterization concatenates $z_{t-1}$ and $u_t$ and passes this vector through a neural network that returns the mean and the diagonal covariance of the prior over $z_t$. However, if for example we are doing video modeling and $u_t = x_{t-1}$ is an image, it is typically convenient to first pass $u_t$ through a (convolutional) neural network that does feature extraction, and then concatenate the resulting vector with $z_{t-1}$. [49] use gated transition functions that allow the model to learn to use linear transitions for some latent dimensions and non-linear ones for others. The parameterization of the emission distribution highly depends on the type of observations. Standard deep neural networks are a good default choice, but when dealing with images it is often better to use convolutional architectures. For notational simplicity we assume that the initial state $z_0$ is a fixed and known vector (we could otherwise learn it).
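As a concrete, minimal sketch of this parameterization, the following numpy code implements the Gaussian transition of eq. (3.48), with one-hidden-layer tanh MLPs standing in for $\mathrm{NN}_1$ and $\mathrm{NN}_2$. All sizes and the random weights are illustrative, not the parameterization used later in this thesis:

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_params(d_in, d_hid, d_out):
    # Random weights for a one-hidden-layer tanh MLP (sizes are illustrative).
    return (rng.normal(0, 0.1, (d_in, d_hid)), np.zeros(d_hid),
            rng.normal(0, 0.1, (d_hid, d_out)), np.zeros(d_out))

def mlp(params, v):
    W1, b1, W2, b2 = params
    return np.tanh(v @ W1 + b1) @ W2 + b2

d_z, d_u = 4, 2
nn_mu = mlp_params(d_z + d_u, 16, d_z)      # NN_1 in eq. (3.48)
nn_logvar = mlp_params(d_z + d_u, 16, d_z)  # NN_2 in eq. (3.48)

def transition_sample(z_prev, u_t):
    # Sample z_t ~ N(mu_t, diag(Sigma_t)) given z_{t-1} and the input u_t.
    h = np.concatenate([z_prev, u_t])
    mu, log_var = mlp(nn_mu, h), mlp(nn_logvar, h)
    return mu + np.exp(0.5 * log_var) * rng.normal(size=d_z)

# Unroll the prior p_theta(z_{1:T} | u_{1:T}, z_0) from a fixed z_0.
z = np.zeros(d_z)
for t in range(10):
    z = transition_sample(z, rng.normal(size=d_u))
print(z.shape)
```

In a real model the weights would of course be learned; the point here is only the structure of the reparameterized Gaussian transition.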
The joint distribution is then given by

$p_\theta(x_{1:T}, z_{1:T} \mid u_{1:T}, z_0) = p_\theta(x_{1:T} \mid z_{1:T})\, p_\theta(z_{1:T} \mid u_{1:T}, z_0)$  (3.49)
$= \prod_{t=1}^{T} p_\theta(x_t \mid z_t)\, p_\theta(z_t \mid z_{t-1}, u_t)$  (3.50)

This distribution specifies the generative process of the data; therefore, as in VAEs, $p_\theta(x_{1:T}, z_{1:T} \mid u_{1:T}, z_0)$ is often referred to as the generative model.

3.3.3 Connection to System Identification

Deep state-space models are closely related to the field of nonlinear system identification. Without the latent state assumption, pioneering work in system identification directly models an observed time series $\{x_t\}_{t \ge 0}$ with

$x_t = f_\theta(x_{t-p:t-1}) + \epsilon_t$  (3.51)

System identification refers to the problem of accurately estimating $\hat{f}$ or, equivalently, $\hat{\theta}$, such that the model describes the observed time series well enough to understand the underlying dynamics [88]. For example, the Box-Jenkins modeling approach [5] assumes an autoregressive integrated moving average (ARIMA) model for a discrete-time scalar time series $\{x_t\}_{t \ge 0}$, and approximates $f_\theta$ with polynomial functions. Moreover, black-box nonlinear models such as neural networks have been successfully used for nonlinear system identification [87]. In addition, Bayesian nonparametric modeling of the noise process has been successfully attempted before in the context of dynamical reconstruction from observed time series [28, 65], which drops the assumption of a known functional form of the deterministic part of the system by leveraging Bayesian neural networks. Instead, these approaches parametrize the data generating mechanism with a feed-forward neural network whose weights and biases are assigned a prior distribution.
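A minimal illustration of the identification problem in eq. (3.51): for a linear $f_\theta$, here an AR(2) model with arbitrarily chosen coefficients, $\theta$ can be recovered from the observed series by least squares. A neural network would replace the linear map for nonlinear systems:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a stable scalar AR(2) process, a linear special case of eq. (3.51):
# x_t = 0.5 x_{t-1} - 0.3 x_{t-2} + eps_t   (coefficients are illustrative)
true_theta = np.array([0.5, -0.3])
T, p = 2000, 2
x = np.zeros(T)
for t in range(p, T):
    x[t] = true_theta[0] * x[t - 1] + true_theta[1] * x[t - 2] + 0.1 * rng.normal()

# Identify theta by least squares on the lagged regressors [x_{t-1}, x_{t-2}].
X = np.column_stack([x[p - 1:T - 1], x[p - 2:T - 2]])
y = x[p:]
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(theta_hat, 2))
```

With enough data the estimate converges to the true coefficients, which is exactly the sense in which the fitted model "describes the observed time series".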
Compared with neural-network-based system identification, deep state-space models capture the time series dynamics in the latent space, which has three advantages: (1) they leverage internal properties of the system of interest by adding structure to the latent space; (2) they unify heterogeneous time series; (3) they leverage state-of-the-art representation learning techniques. We will show how these work in later chapters.

3.3.4 Summary and Discussion

In this chapter, we have introduced a family of sequential deep latent variable models for unsupervised learning of complex data distributions from large unlabelled datasets. We start by defining the joint distribution of the model, in which we encode all the modeling assumptions that are suitable for the particular application at hand. To learn the parameters of the model using maximum likelihood, we need to compute the data log-likelihood by marginalizing the joint distribution over the latent variables. Since this integral is often intractable, we define a variational approximation over the latent variables of the model conditioned on the inputs and outputs, and use Jensen's inequality to derive the ELBO, the objective function used during training. We can design a variational distribution that better approximates the true posterior distribution by making use of the independence properties among the variables of the model. The scalability of the models is ensured by using inference networks to define scalable and flexible variational approximations parameterized by deep neural networks. As both the generative model and the variational approximation are defined using neural network architectures, we can train their parameters jointly using stochastic gradient ascent, computing the gradients efficiently on GPUs. With deep state-space models as the theoretical framework, we will present several works on modeling structured dynamic systems in the following chapters.
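To close the chapter with a concrete instance of the inference tasks of Section 3.2.2: in the linear-Gaussian case of eqs. (3.44)-(3.45), the filtering distribution $p_\theta(z_t \mid x_{1:t}, u_{1:t})$ has a closed-form recursion, the Kalman filter. A minimal time-invariant sketch (all matrices chosen purely for illustration):

```python
import numpy as np

# Time-invariant LGSSM: z_t = A z_{t-1} + B u_t + eps_t, x_t = C z_t + delta_t,
# with eps_t ~ N(0, Q) and delta_t ~ N(0, R). Constant-velocity toy dynamics.
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.zeros((2, 1))
C = np.array([[1.0, 0.0]])           # observe position only
Q = 0.01 * np.eye(2)
R = np.array([[0.5]])

def kalman_step(m, P, u, x):
    # One predict/update step: returns mean and covariance of p(z_t | x_{1:t}).
    m_pred = A @ m + B @ u               # predict
    P_pred = A @ P @ A.T + Q
    S = C @ P_pred @ C.T + R             # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)  # Kalman gain
    m_new = m_pred + K @ (x - C @ m_pred)
    P_new = (np.eye(2) - K @ C) @ P_pred
    return m_new, P_new

rng = np.random.default_rng(2)
m, P = np.zeros(2), np.eye(2)
z = np.array([0.0, 1.0])                 # true state: position 0, velocity 1
for t in range(50):
    z = A @ z + rng.multivariate_normal(np.zeros(2), Q)
    x = C @ z + rng.normal(0.0, np.sqrt(0.5), 1)
    m, P = kalman_step(m, P, np.zeros(1), x)
print(np.round(m, 1))                    # filtered estimate of (position, velocity)
```

The non-linear models of Section 3.3 replace this closed-form recursion with amortized variational inference, but the filtering problem being solved is the same.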
Part II
Research Work

Chapter 4
Research Statement

In this chapter, we discuss the limitations of current methods developed for time series counterfactual inference and present the proposed research statement tackling these limitations. Let us recap the problem we are interested in: given the previous information, including previous time series observations and possibly interventions, $\mathcal{H}_t = \{X_s, A_s : s \le t\}$, together with a potential intervention sequence $\{A_s : s > t\}$, infer the counterfactual distribution

$p(\{X_s : s > t\} \mid \{A_s : s > t\}, \mathcal{H}_t)$  (4.1)

For most real-world scenarios, we do not fully understand the underlying mechanism of the system, but only have historical observations. Therefore, we would like to transfer the problem into a machine learning task: learn a time series counterfactual inference model from previously observed data $\{X_i, A_i\}_{i=1}^N$ that, given the history at time $t$ and a sequence of future interventions (continuous or discrete), infers the counterfactual distribution (4.1).
Figure 4.1: A diagram of the data-driven time series counterfactual inference task.

As we discussed in Chapter 2, most existing work on counterfactual inference focuses on the static setting, while only a few studies address the time series setting. Compared with the static setting, time series counterfactual inference is more challenging, as it requires modeling the underlying temporal dynamics behind the time series observations.
Besides the intrinsic challenge of modeling temporal dynamics, the complexity of real-world temporal systems brings extra challenges:

• Many real-world temporal systems involve strong interaction among multiple agents, e.g. traders, dealers, and market makers in financial markets, or the ten players in a basketball game. How to handle agents and their interactions when doing counterfactual inference has not been deeply explored yet. [101]

• In real-world scenarios, time series can be irregularly spaced or come with multiple temporal resolutions. For example, a patient's electronic health record (EHR) time series usually contains medical records with different frequencies, ranging from per minute to per month. How to build counterfactual inference for irregularly-spaced and multi-resolution time series remains an active research area in the time series field. [7]

• There might be hidden confounders for time series interventions, i.e. variables affecting the treatment assignments and the potential outcomes that are not observed. Existing work usually assumes all confounders are observed, which is not testable in practice and possibly untrue in many scenarios. [4]

Table 4.1: Features supported by existing methods for counterfactual inference

                                  Balancing CR  BART  GP + MPP  GP + LTI  RNN
                                  [40]          [9]   [84]      [89]      [25]
  Temporal dynamics               ×             ×     ✓         ✓         ✓
  Nonlinear intervention effects  ✓             ✓     ✓         ×         ✓
  Multi-agent interactions        ×             ×     ×         ×         ×
  Irregularly-spaced data         ×             ×     ✓         ✓         ×
  Hidden confounders              ×             ×     ×         ×         ×
  Scalable w.r.t. sample size     ✓             ×     ×         ×         ✓

In addition to tackling the above-mentioned obstacles, time series counterfactual inference methods, just like other data-driven methods, should be efficient for large-scale applications. However, existing methods, which are mostly based on Gaussian processes, do not scale well in terms of sample size.
Table 4.1 summarizes the limitations of existing methods for time series counterfactual inference.

To achieve good counterfactual inference, the key is to learn the underlying temporal dynamics of the time series. Deep generative models, i.e. neural-network-parametrized generative models such as variational recurrent neural networks [12], deep Markov models [48], and neural ordinary differential equations [8], have shown promising performance in capturing complicated temporal dynamics, as we discussed in previous chapters. Therefore, we would like to explore the potential of deep generative models for time series counterfactual inference under the three challenging cases mentioned above.

Research Objective: Develop novel deep generative models for time series and show their effectiveness on time series counterfactual inference under challenging real-world situations, including counterfactual inference for time series with (1) multi-agent interactions; (2) mixed sampling rates; (3) hidden confounders.

In the following three chapters, we introduce the proposed methods for the above situations. The proposed methods for the first two situations, i.e. time series with multi-agent interactions and time series with mixed sampling rates, have been fully developed and published in [53] and [7] respectively.

Chapter 5
Counterfactual Inference for Time Series with Multi-agent Interactions

In this chapter, we are interested in counterfactual inference for multi-agent time series, i.e. time series generated by multiple agents with strong interactions. We propose a deep generative model which captures the data generating process of multi-agent systems and supports not only counterfactual inference but also the identification of agent groups and interaction types.
Built upon advances in deep generative models and a novel attention mechanism, our model can learn interactions in highly heterogeneous systems with linear complexity in the number of agents. We apply this model to three different types of multi-agent time series: spring-ball trajectories, basketball player behavior, and traffic trajectories on campus. Experimental results demonstrate its ability to model multi-agent time series, yielding improved performance over competitive baselines in counterfactual inference. We also show that the model can successfully identify agent groups and interaction types in these systems.

5.1 A Motivating Example

Imagine that in a basketball game between teams A and B, the coach of team A would like to know: if his/her players moved in a certain way, how would the team B players move? This is a time series counterfactual inference problem, in which the counterfactual intervention is team A's player movement while the potential outcome is team B's reaction. Usually, a good coach can give a reasonable guess based on his/her experience. In this proposal, we answer this kind of question with data-driven methods, i.e. we infer team B's reaction with a deep generative model trained on team B's behavior in previous games.

5.2 Problem Formulation

Consider a multi-agent time series with $K$ agents, $X = \{x^k\}_{k=1}^K$. For any set of agents $\xi$, given all the historical information $X_{\le t}$ and the potential future behavior of that set of agents $\{x^i_{>t}\}_{i \in \xi}$, predict the potential future behavior of the other agents $\{x^j_{>t}\}_{j \notin \xi}$. In other words, our objective is to predict

$p(\{x^j_{>t}\}_{j \notin \xi} \mid \{x^i_{>t}\}_{i \in \xi}, X_{\le t})$  (5.1)

Given that the key to time series counterfactual inference is to capture the underlying data generating mechanism of the time series, we further transfer the task into developing a generative model for multi-agent time series which supports counterfactual inference. Note that we use the terms time series and behavior interchangeably in this chapter.
We also use $a$ for the attention vector rather than the intervention vector, since the intervention in the multi-agent time series setting is part of the observations $X$.

Figure 5.1: Illustration of three multi-agent systems, including agent groups (light/medium/heavy balls; offense/defense players and the ball; pedestrians/bicycles/cars) and interaction types (hard/soft/no spring; collaboration/hostility/player-ball; pedestrian-bicycle/bicycle-bicycle/pedestrian-car).

5.3 Background and Related Work

Modeling the underlying data generating processes of multi-agent time series is challenging due to the following inherent properties:

Interaction complexity. Modeling interactions explicitly incurs a complexity of $O(N^d)$, where $N$ is the number of agents and $d$ is the typical interaction clique size, which makes it computationally unmanageable to model a large number of agents or high-order interactions.

Interaction structure. Although the interaction structure can be pre-defined in some simple systems, e.g., spring systems in physics, most real-world multi-agent systems come with an unknown and evolving interaction structure. Imagine an open area on a university campus: the number of agents changes frequently, and the interaction between two pedestrians may disappear quickly after they pass by each other. Such unknown and evolving properties require a model that handles interactions adaptively.
Heterogeneity. In a multi-agent system, agents usually come from groups with different behavior styles, and interactions among them may also have inherently different types. Consider a basketball game: players switch between offense and defense frequently, and interactions between players come in two types, namely collaboration and hostility. To truly approximate the behavior generating process, one should be able to identify such agent groups and interaction types from system observations.

Multi-agent behaviors have drawn significant research interest over the past decades. Pioneering work by Helbing and Molnar [29] introduced a social force model to capture pedestrian motion patterns. This work was later extended in [63] to learn the interaction forces in human crowds, relying on hand-crafted features and relative distances. With the emergence of neural networks, models based on recurrent neural networks (RNNs) or long short-term memory networks (LSTMs) have been introduced to model multi-agent behavior [1, 27, 52, 60, 94, 98]. In this line of research, a specially designed pooling mechanism is the key to capturing interactions. Alahi et al. propose Social LSTM [1], which predicts pedestrian trajectories using RNNs with a novel social pooling layer over the hidden states of nearby pedestrians. Later, Gupta et al. propose Social GAN [27], showing that a multi-layer perceptron (MLP) followed by max-pooling works better than the social pooling method on a similar task and enjoys better computational efficiency. Lee et al. [52] target traffic behavior prediction at intersections and design DESIRE, an RNN encoder-decoder framework with a feature pooling layer. Although pooling-layer methods show competitive performance in multi-agent behavior prediction, their lack of interpretability limits our understanding of how agents interact with each other.
Besides the pooling mechanism, graph neural networks (GNNs) have also been introduced to handle multi-agent systems. For example, Battaglia et al. propose Interaction Networks (IN) to learn physical simulations of objects with binary relations [3], and Sukhbaatar et al. propose Communication Networks (CommNets) [92] for learning optimal communication between agents. Later, Hoshen introduces the attention mechanism into GNNs, develops VAIN [36], and applies it to soccer data. Although VAIN scales efficiently with the number of agents, it only predicts from a single frame and does not model past trajectories.

Another line of research models multi-agent systems in the imitation learning framework. The key idea is to learn policy representations from agent actions and system observations [26, 51, 56]. Within this framework, the models require an explicitly defined action set, and target continuous control and robotic applications.

Deep generative models have been proposed to reveal the underlying data generating process of sequential data [7, 12, 49, 71], to name a few. The deep Markov model (DMM), a nonlinear state-space model, is proposed in [49] by marrying the ideas of deep neural networks and Kalman filters. In [12], the authors introduce the variational recurrent neural network (VRNN), which glues an RNN and a VAE together to form a stochastic, sequential neural generative model. Recently, there have been several explorations of deep generative models in multi-agent systems. Ivanovic et al. [39] combine the conditional variational autoencoder (CVAE) [46, 71] and LSTMs to generate the behavior of basketball players. In another relevant work [100], Zhan et al. also use VRNN-based generative models [12] and introduce a macro-goal mechanism (i.e., basketball players' intentions), breaking multi-agent behavior generation down into an evolving process with multiple pre-defined consequences. More recently, Yeh et al.
integrate VRNN and GNN into GraphVRNN [99] to model trajectories in sports. Most existing works focus on agent behavior forecasting and provide limited information regarding interaction types or agent groups.

5.4 Generative Attentional Multi-Agent Network

We consider a multi-agent system of $K$ agents over a time horizon $T$, which consists of the behavior of every agent over $T$ time steps. We establish notation as follows:

• Let $\mathcal{D}$ denote all behavior for a multi-agent system.

• Let $x_{\le T} = \{x_t\}_{1 \le t \le T}$ denote a system's behaviors over the time horizon $T$, where $x_t = \{x^k_t\}$ and $x^k_t$ is the behavior of agent $k$ at time step $t$.

• Let $z_{\le T} = \{z_t\}_{1 \le t \le T}$ denote a system's latent states over the time horizon $T$, where $z_t = (e_t, s_t, a_t) = (\{e^k_t\}_k, \{s^k_t\}_k, \{a^k_t\}_k)$ and $e^k_t$, $s^k_t$, $a^k_t$ are the agent vector, interaction vector, and attention vector of agent $k$ at time step $t$, respectively.

• Let $\{\theta_x, \theta_z, \theta_s, \theta_a\}$ denote the parameter set for the generation network $\pi_\theta$ and $\{\phi_x, \phi_z, \phi_s, \phi_a\}$ the parameter set for the inference network $\pi_\phi$.
Given that modeling the underlying data generating mechanism is the key to counterfactual inference, the goal of GAMAN is to approximate the underlying behavior generating process and recover agents' interactions for multi-agent time series.
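GAMAN's attention mechanism is developed in the remainder of this section; as a generic illustration of the idea, the following sketch uses standard dot-product attention (not necessarily GAMAN's exact formulation) to aggregate, for each agent, the other agents' interaction vectors $s^k_t$ with weights computed from the attention vectors $a^k_t$. Each agent is encoded only once, so network evaluations grow linearly in the number of agents, in the spirit of VAIN [36]:

```python
import numpy as np

rng = np.random.default_rng(3)
K, d_s, d_a = 6, 8, 4                  # agents, interaction dim, attention dim

s = rng.normal(size=(K, d_s))          # interaction vectors s_t^k (one time step)
a = rng.normal(size=(K, d_a))          # attention vectors a_t^k

def aggregate(s, a):
    # For each agent k, a weighted sum of the other agents' interaction
    # vectors, with weights softmax_j( a_k . a_j ) over j != k.
    logits = a @ a.T                   # (K, K) pairwise attention scores
    np.fill_diagonal(logits, -np.inf)  # exclude self-interaction
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)  # normalize over the other agents
    return w @ s                       # (K, d_s) aggregated messages

msg = aggregate(s, a)
print(msg.shape)  # (6, 8)
```

The pairwise weights are cheap dot products; the expensive per-agent encoders that produce $s^k_t$ and $a^k_t$ run only $K$ times, which is what avoids the $O(N^d)$ cost of explicit interaction modeling discussed in Section 5.3.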
sha1_base64="rcVP2e8BmPnW3Rph4b3Kl4wCxI4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZ3rRjJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0g4U9pxvqzSwuLS8kp5tbK2vrG5ZW/vtFScSgpNGvNYdgKigDMBTc00h04igUQBh3Ywuiz89j1IxWJxo8cJeBEZCBYySrSRfHuvFxE9DMIM8tvsLvczfezmvl11as4E+C9xZ6SKZmj49mevH9M0AqEpJ0p1XSfRXkakZpRDXumlChJCR2QAXUMFiUB52SR9jg+N0sdhLM0TGk/UnxsZiZQaR4GZLLKqea8Q//O6qQ7PvYyJJNUg6PRQmHKsY1xUgftMAtV8bAihkpmsmA6JJFSbwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG8YaVhQ==</latexit> <latexit sha1_base64="rcVP2e8BmPnW3Rph4b3Kl4wCxI4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZ3rRjJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0g4U9pxvqzSwuLS8kp5tbK2vrG5ZW/vtFScSgpNGvNYdgKigDMBTc00h04igUQBh3Ywuiz89j1IxWJxo8cJeBEZCBYySrSRfHuvFxE9DMIM8tvsLvczfezmvl11as4E+C9xZ6SKZmj49mevH9M0AqEpJ0p1XSfRXkakZpRDXumlChJCR2QAXUMFiUB52SR9jg+N0sdhLM0TGk/UnxsZiZQaR4GZLLKqea8Q//O6qQ7PvYyJJNUg6PRQmHKsY1xUgftMAtV8bAihkpmsmA6JJFSbwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG8YaVhQ==</latexit> <latexit sha1_base64="rcVP2e8BmPnW3Rph4b3Kl4wCxI4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZ3rRjJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0g4U9pxvqzSwuLS8kp5tbK2vrG5ZW/vtFScSgpNGvNYdgKigDMBTc00h04igUQBh3Ywuiz89j1IxWJxo8cJeBEZCBYySrSRfHuvFxE9DMIM8tvsLvczfezmvl11as4E+C9xZ6SKZmj49mevH9M0AqEpJ0p1XSfRXkakZpRDXumlChJCR2QAXUMFiUB52SR9jg+N0sdhLM0TGk/UnxsZiZQaR4GZLLKqea8Q//O6qQ7PvYyJJNUg6PRQmHKsY1xUgftMAtV8bAihkpmsmA6JJFSbwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG8YaVhQ==</latexit> <latexit 
sha1_base64="rcVP2e8BmPnW3Rph4b3Kl4wCxI4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZ3rRjJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0g4U9pxvqzSwuLS8kp5tbK2vrG5ZW/vtFScSgpNGvNYdgKigDMBTc00h04igUQBh3Ywuiz89j1IxWJxo8cJeBEZCBYySrSRfHuvFxE9DMIM8tvsLvczfezmvl11as4E+C9xZ6SKZmj49mevH9M0AqEpJ0p1XSfRXkakZpRDXumlChJCR2QAXUMFiUB52SR9jg+N0sdhLM0TGk/UnxsZiZQaR4GZLLKqea8Q//O6qQ7PvYyJJNUg6PRQmHKsY1xUgftMAtV8bAihkpmsmA6JJFSbwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG8YaVhQ==</latexit> e j t <latexit sha1_base64="rnfzDe9oFTpO7+gl2Rhbm0j8Zxk=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxofp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAM0ZUV</latexit> <latexit sha1_base64="rnfzDe9oFTpO7+gl2Rhbm0j8Zxk=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxofp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAM0ZUV</latexit> <latexit 
sha1_base64="rnfzDe9oFTpO7+gl2Rhbm0j8Zxk=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxofp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAM0ZUV</latexit> <latexit sha1_base64="rnfzDe9oFTpO7+gl2Rhbm0j8Zxk=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxofp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAM0ZUV</latexit> e j t 1 <latexit sha1_base64="DZUC0y/ahbTc5Bl6kJYIoj/wqoE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMb9qxk0mYmQg1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEyScKe04X1ZpYXFpeaW8Wllb39jcsrd3WipOJYUmjXksOwFRwJmApmaaQyeRQKKAQzsYXRZ++x6kYrG40eMEvIgMBAsZJdpIvr3Xi4geBmEG+W12l/uZPnZz3646NWcC/Je4M1JFMzR8+7PXj2kagdCUE6W6rpNoLyNSM8ohr/RSBQmhIzKArqGCRKC8bJI+x4dG6eMwluYJjSfqz42MREqNo8BMFlnVvFeI/3ndVIfnXsZEkmoQdHooTDnWMS6qwH0mgWo+NoRQyUxWTIdEEqpNYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD9JKVhw==</latexit> <latexit 
sha1_base64="DZUC0y/ahbTc5Bl6kJYIoj/wqoE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMb9qxk0mYmQg1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEyScKe04X1ZpYXFpeaW8Wllb39jcsrd3WipOJYUmjXksOwFRwJmApmaaQyeRQKKAQzsYXRZ++x6kYrG40eMEvIgMBAsZJdpIvr3Xi4geBmEG+W12l/uZPnZz3646NWcC/Je4M1JFMzR8+7PXj2kagdCUE6W6rpNoLyNSM8ohr/RSBQmhIzKArqGCRKC8bJI+x4dG6eMwluYJjSfqz42MREqNo8BMFlnVvFeI/3ndVIfnXsZEkmoQdHooTDnWMS6qwH0mgWo+NoRQyUxWTIdEEqpNYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD9JKVhw==</latexit> <latexit sha1_base64="DZUC0y/ahbTc5Bl6kJYIoj/wqoE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMb9qxk0mYmQg1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEyScKe04X1ZpYXFpeaW8Wllb39jcsrd3WipOJYUmjXksOwFRwJmApmaaQyeRQKKAQzsYXRZ++x6kYrG40eMEvIgMBAsZJdpIvr3Xi4geBmEG+W12l/uZPnZz3646NWcC/Je4M1JFMzR8+7PXj2kagdCUE6W6rpNoLyNSM8ohr/RSBQmhIzKArqGCRKC8bJI+x4dG6eMwluYJjSfqz42MREqNo8BMFlnVvFeI/3ndVIfnXsZEkmoQdHooTDnWMS6qwH0mgWo+NoRQyUxWTIdEEqpNYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD9JKVhw==</latexit> <latexit sha1_base64="DZUC0y/ahbTc5Bl6kJYIoj/wqoE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMb9qxk0mYmQg1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEyScKe04X1ZpYXFpeaW8Wllb39jcsrd3WipOJYUmjXksOwFRwJmApmaaQyeRQKKAQzsYXRZ++x6kYrG40eMEvIgMBAsZJdpIvr3Xi4geBmEG+W12l/uZPnZz3646NWcC/Je4M1JFMzR8+7PXj2kagdCUE6W6rpNoLyNSM8ohr/RSBQmhIzKArqGCRKC8bJI+x4dG6eMwluYJjSfqz42MREqNo8BMFlnVvFeI/3ndVIfnXsZEkmoQdHooTDnWMS6qwH0mgWo+NoRQyUxWTIdEEqpNYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD9JKVhw==</latexit> x j t 1 <latexit 
sha1_base64="6/oH6LCZ5kzuKwsJVlK8FTa8Z1Q=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YyYOZG7GG4K+4caGIW//DnX/jpM1CWw8MHM65l3vmeLHgCizr2ygtLC4tr5RXK2vrG5tb5vZOS0WJpKxJIxHJjkcUEzxkTeAgWCeWjASeYG1vdJn77XsmFY/CGxjHzAnIIOQ+pwS05Jp7vYDA0PPTh+w2vcvcFI7tzDWrVs2aAM8TuyBVVKDhml+9fkSTgIVABVGqa1sxOCmRwKlgWaWXKBYTOiID1tU0JAFTTjpJn+FDrfSxH0n9QsAT9fdGSgKlxoGnJ/OsatbLxf+8bgL+uZPyME6AhXR6yE8EhgjnVeA+l4yCGGtCqOQ6K6ZDIgkFXVhFl2DPfnmetE5qtlWzr0+r9YuijjLaRwfoCNnoDNXRFWqgJqLoET2jV/RmPBkvxrvxMR0tGcXOLvoD4/MHEj6Vmg==</latexit> <latexit sha1_base64="6/oH6LCZ5kzuKwsJVlK8FTa8Z1Q=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YyYOZG7GG4K+4caGIW//DnX/jpM1CWw8MHM65l3vmeLHgCizr2ygtLC4tr5RXK2vrG5tb5vZOS0WJpKxJIxHJjkcUEzxkTeAgWCeWjASeYG1vdJn77XsmFY/CGxjHzAnIIOQ+pwS05Jp7vYDA0PPTh+w2vcvcFI7tzDWrVs2aAM8TuyBVVKDhml+9fkSTgIVABVGqa1sxOCmRwKlgWaWXKBYTOiID1tU0JAFTTjpJn+FDrfSxH0n9QsAT9fdGSgKlxoGnJ/OsatbLxf+8bgL+uZPyME6AhXR6yE8EhgjnVeA+l4yCGGtCqOQ6K6ZDIgkFXVhFl2DPfnmetE5qtlWzr0+r9YuijjLaRwfoCNnoDNXRFWqgJqLoET2jV/RmPBkvxrvxMR0tGcXOLvoD4/MHEj6Vmg==</latexit> <latexit sha1_base64="6/oH6LCZ5kzuKwsJVlK8FTa8Z1Q=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YyYOZG7GG4K+4caGIW//DnX/jpM1CWw8MHM65l3vmeLHgCizr2ygtLC4tr5RXK2vrG5tb5vZOS0WJpKxJIxHJjkcUEzxkTeAgWCeWjASeYG1vdJn77XsmFY/CGxjHzAnIIOQ+pwS05Jp7vYDA0PPTh+w2vcvcFI7tzDWrVs2aAM8TuyBVVKDhml+9fkSTgIVABVGqa1sxOCmRwKlgWaWXKBYTOiID1tU0JAFTTjpJn+FDrfSxH0n9QsAT9fdGSgKlxoGnJ/OsatbLxf+8bgL+uZPyME6AhXR6yE8EhgjnVeA+l4yCGGtCqOQ6K6ZDIgkFXVhFl2DPfnmetE5qtlWzr0+r9YuijjLaRwfoCNnoDNXRFWqgJqLoET2jV/RmPBkvxrvxMR0tGcXOLvoD4/MHEj6Vmg==</latexit> <latexit 
sha1_base64="6/oH6LCZ5kzuKwsJVlK8FTa8Z1Q=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YyYOZG7GG4K+4caGIW//DnX/jpM1CWw8MHM65l3vmeLHgCizr2ygtLC4tr5RXK2vrG5tb5vZOS0WJpKxJIxHJjkcUEzxkTeAgWCeWjASeYG1vdJn77XsmFY/CGxjHzAnIIOQ+pwS05Jp7vYDA0PPTh+w2vcvcFI7tzDWrVs2aAM8TuyBVVKDhml+9fkSTgIVABVGqa1sxOCmRwKlgWaWXKBYTOiID1tU0JAFTTjpJn+FDrfSxH0n9QsAT9fdGSgKlxoGnJ/OsatbLxf+8bgL+uZPyME6AhXR6yE8EhgjnVeA+l4yCGGtCqOQ6K6ZDIgkFXVhFl2DPfnmetE5qtlWzr0+r9YuijjLaRwfoCNnoDNXRFWqgJqLoET2jV/RmPBkvxrvxMR0tGcXOLvoD4/MHEj6Vmg==</latexit> x j t <latexit sha1_base64="o6tp3AfE+zAx9Ra04Y9cTKIFUDQ=">AAAB+3icbVDLSsNAFJ3UV62vWJduBovgqiQi6LLoxmUF+4C2hsl00o6dTMLMjbSE/IobF4q49Ufc+TdO2iy09cDA4Zx7uWeOHwuuwXG+rdLa+sbmVnm7srO7t39gH1bbOkoUZS0aiUh1faKZ4JK1gINg3VgxEvqCdfzJTe53npjSPJL3MIvZICQjyQNOCRjJs6v9kMDYD9Jp9pA+Zl4KmWfXnLozB14lbkFqqEDTs7/6w4gmIZNABdG65zoxDFKigFPBsko/0SwmdEJGrGeoJCHTg3SePcOnRhniIFLmScBz9fdGSkKtZ6FvJvOketnLxf+8XgLB1SDlMk6ASbo4FCQCQ4TzIvCQK0ZBzAwhVHGTFdMxUYSCqatiSnCXv7xK2ud116m7dxe1xnVRRxkdoxN0hlx0iRroFjVRC1E0Rc/oFb1ZmfVivVsfi9GSVewcoT+wPn8AKkiVKA==</latexit> <latexit sha1_base64="o6tp3AfE+zAx9Ra04Y9cTKIFUDQ=">AAAB+3icbVDLSsNAFJ3UV62vWJduBovgqiQi6LLoxmUF+4C2hsl00o6dTMLMjbSE/IobF4q49Ufc+TdO2iy09cDA4Zx7uWeOHwuuwXG+rdLa+sbmVnm7srO7t39gH1bbOkoUZS0aiUh1faKZ4JK1gINg3VgxEvqCdfzJTe53npjSPJL3MIvZICQjyQNOCRjJs6v9kMDYD9Jp9pA+Zl4KmWfXnLozB14lbkFqqEDTs7/6w4gmIZNABdG65zoxDFKigFPBsko/0SwmdEJGrGeoJCHTg3SePcOnRhniIFLmScBz9fdGSkKtZ6FvJvOketnLxf+8XgLB1SDlMk6ASbo4FCQCQ4TzIvCQK0ZBzAwhVHGTFdMxUYSCqatiSnCXv7xK2ud116m7dxe1xnVRRxkdoxN0hlx0iRroFjVRC1E0Rc/oFb1ZmfVivVsfi9GSVewcoT+wPn8AKkiVKA==</latexit> <latexit 
sha1_base64="o6tp3AfE+zAx9Ra04Y9cTKIFUDQ=">AAAB+3icbVDLSsNAFJ3UV62vWJduBovgqiQi6LLoxmUF+4C2hsl00o6dTMLMjbSE/IobF4q49Ufc+TdO2iy09cDA4Zx7uWeOHwuuwXG+rdLa+sbmVnm7srO7t39gH1bbOkoUZS0aiUh1faKZ4JK1gINg3VgxEvqCdfzJTe53npjSPJL3MIvZICQjyQNOCRjJs6v9kMDYD9Jp9pA+Zl4KmWfXnLozB14lbkFqqEDTs7/6w4gmIZNABdG65zoxDFKigFPBsko/0SwmdEJGrGeoJCHTg3SePcOnRhniIFLmScBz9fdGSkKtZ6FvJvOketnLxf+8XgLB1SDlMk6ASbo4FCQCQ4TzIvCQK0ZBzAwhVHGTFdMxUYSCqatiSnCXv7xK2ud116m7dxe1xnVRRxkdoxN0hlx0iRroFjVRC1E0Rc/oFb1ZmfVivVsfi9GSVewcoT+wPn8AKkiVKA==</latexit> <latexit sha1_base64="o6tp3AfE+zAx9Ra04Y9cTKIFUDQ=">AAAB+3icbVDLSsNAFJ3UV62vWJduBovgqiQi6LLoxmUF+4C2hsl00o6dTMLMjbSE/IobF4q49Ufc+TdO2iy09cDA4Zx7uWeOHwuuwXG+rdLa+sbmVnm7srO7t39gH1bbOkoUZS0aiUh1faKZ4JK1gINg3VgxEvqCdfzJTe53npjSPJL3MIvZICQjyQNOCRjJs6v9kMDYD9Jp9pA+Zl4KmWfXnLozB14lbkFqqEDTs7/6w4gmIZNABdG65zoxDFKigFPBsko/0SwmdEJGrGeoJCHTg3SePcOnRhniIFLmScBz9fdGSkKtZ6FvJvOketnLxf+8XgLB1SDlMk6ASbo4FCQCQ4TzIvCQK0ZBzAwhVHGTFdMxUYSCqatiSnCXv7xK2ud116m7dxe1xnVRRxkdoxN0hlx0iRroFjVRC1E0Rc/oFb1ZmfVivVsfi9GSVewcoT+wPn8AKkiVKA==</latexit> x j t+1 <latexit sha1_base64="PO+OmGFqU05kcAVK/spWGbKwYyw=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxkwczN2INwV9x40IRt/6HO//GSZuFth4YOJxzL/fM8WLBFVjWt1FaWFxaXimvVtbWNza3zO2dlooSSVmTRiKSHY8oJnjImsBBsE4sGQk8wdre6DL32/dMKh6FNzCOmROQQch9TgloyTX3egGBoeenD9ltepe5KRzbmWtWrZo1AZ4ndkGqqEDDNb96/YgmAQuBCqJU17ZicFIigVPBskovUSwmdEQGrKtpSAKmnHSSPsOHWuljP5L6hYAn6u+NlARKjQNPT+ZZ1ayXi/953QT8cyflYZwAC+n0kJ8IDBHOq8B9LhkFMdaEUMl1VkyHRBIKurCKLsGe/fI8aZ3UbKtmX59W6xdFHWW0jw7QEbLRGaqjK9RATUTRI3pGr+jNeDJejHfjYzpaMoqdXfQHxucPDzKVmA==</latexit> <latexit 
sha1_base64="PO+OmGFqU05kcAVK/spWGbKwYyw=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxkwczN2INwV9x40IRt/6HO//GSZuFth4YOJxzL/fM8WLBFVjWt1FaWFxaXimvVtbWNza3zO2dlooSSVmTRiKSHY8oJnjImsBBsE4sGQk8wdre6DL32/dMKh6FNzCOmROQQch9TgloyTX3egGBoeenD9ltepe5KRzbmWtWrZo1AZ4ndkGqqEDDNb96/YgmAQuBCqJU17ZicFIigVPBskovUSwmdEQGrKtpSAKmnHSSPsOHWuljP5L6hYAn6u+NlARKjQNPT+ZZ1ayXi/953QT8cyflYZwAC+n0kJ8IDBHOq8B9LhkFMdaEUMl1VkyHRBIKurCKLsGe/fI8aZ3UbKtmX59W6xdFHWW0jw7QEbLRGaqjK9RATUTRI3pGr+jNeDJejHfjYzpaMoqdXfQHxucPDzKVmA==</latexit> <latexit sha1_base64="PO+OmGFqU05kcAVK/spWGbKwYyw=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxkwczN2INwV9x40IRt/6HO//GSZuFth4YOJxzL/fM8WLBFVjWt1FaWFxaXimvVtbWNza3zO2dlooSSVmTRiKSHY8oJnjImsBBsE4sGQk8wdre6DL32/dMKh6FNzCOmROQQch9TgloyTX3egGBoeenD9ltepe5KRzbmWtWrZo1AZ4ndkGqqEDDNb96/YgmAQuBCqJU17ZicFIigVPBskovUSwmdEQGrKtpSAKmnHSSPsOHWuljP5L6hYAn6u+NlARKjQNPT+ZZ1ayXi/953QT8cyflYZwAC+n0kJ8IDBHOq8B9LhkFMdaEUMl1VkyHRBIKurCKLsGe/fI8aZ3UbKtmX59W6xdFHWW0jw7QEbLRGaqjK9RATUTRI3pGr+jNeDJejHfjYzpaMoqdXfQHxucPDzKVmA==</latexit> <latexit sha1_base64="PO+OmGFqU05kcAVK/spWGbKwYyw=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxkwczN2INwV9x40IRt/6HO//GSZuFth4YOJxzL/fM8WLBFVjWt1FaWFxaXimvVtbWNza3zO2dlooSSVmTRiKSHY8oJnjImsBBsE4sGQk8wdre6DL32/dMKh6FNzCOmROQQch9TgloyTX3egGBoeenD9ltepe5KRzbmWtWrZo1AZ4ndkGqqEDDNb96/YgmAQuBCqJU17ZicFIigVPBskovUSwmdEQGrKtpSAKmnHSSPsOHWuljP5L6hYAn6u+NlARKjQNPT+ZZ1ayXi/953QT8cyflYZwAC+n0kJ8IDBHOq8B9LhkFMdaEUMl1VkyHRBIKurCKLsGe/fI8aZ3UbKtmX59W6xdFHWW0jw7QEbLRGaqjK9RATUTRI3pGr+jNeDJejHfjYzpaMoqdXfQHxucPDzKVmA==</latexit> a j t+1 <latexit 
sha1_base64="ztrq22acIr2WyC6zYBbz9i5ucZ4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxk0mYuRFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0gE1+A4X1ZpYXFpeaW8Wllb39jcsrd3WjpOFWVNGotYdQKimeCSNYGDYJ1EMRIFgrWD0WXht++Z0jyWNzBOmBeRgeQhpwSM5Nt7vYjAMAgzkt9md7mfwbGb+3bVqTkT4L/EnZEqmqHh25+9fkzTiEmggmjddZ0EvIwo4FSwvNJLNUsIHZEB6xoqScS0l03S5/jQKH0cxso8CXii/tzISKT1OArMZJFVz3uF+J/XTSE89zIukxSYpNNDYSowxLioAve5YhTE2BBCFTdZMR0SRSiYwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG60qVgQ==</latexit> <latexit sha1_base64="ztrq22acIr2WyC6zYBbz9i5ucZ4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxk0mYuRFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0gE1+A4X1ZpYXFpeaW8Wllb39jcsrd3WjpOFWVNGotYdQKimeCSNYGDYJ1EMRIFgrWD0WXht++Z0jyWNzBOmBeRgeQhpwSM5Nt7vYjAMAgzkt9md7mfwbGb+3bVqTkT4L/EnZEqmqHh25+9fkzTiEmggmjddZ0EvIwo4FSwvNJLNUsIHZEB6xoqScS0l03S5/jQKH0cxso8CXii/tzISKT1OArMZJFVz3uF+J/XTSE89zIukxSYpNNDYSowxLioAve5YhTE2BBCFTdZMR0SRSiYwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG60qVgQ==</latexit> <latexit sha1_base64="ztrq22acIr2WyC6zYBbz9i5ucZ4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxk0mYuRFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0gE1+A4X1ZpYXFpeaW8Wllb39jcsrd3WjpOFWVNGotYdQKimeCSNYGDYJ1EMRIFgrWD0WXht++Z0jyWNzBOmBeRgeQhpwSM5Nt7vYjAMAgzkt9md7mfwbGb+3bVqTkT4L/EnZEqmqHh25+9fkzTiEmggmjddZ0EvIwo4FSwvNJLNUsIHZEB6xoqScS0l03S5/jQKH0cxso8CXii/tzISKT1OArMZJFVz3uF+J/XTSE89zIukxSYpNNDYSowxLioAve5YhTE2BBCFTdZMR0SRSiYwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG60qVgQ==</latexit> <latexit 
sha1_base64="ztrq22acIr2WyC6zYBbz9i5ucZ4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxk0mYuRFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0gE1+A4X1ZpYXFpeaW8Wllb39jcsrd3WjpOFWVNGotYdQKimeCSNYGDYJ1EMRIFgrWD0WXht++Z0jyWNzBOmBeRgeQhpwSM5Nt7vYjAMAgzkt9md7mfwbGb+3bVqTkT4L/EnZEqmqHh25+9fkzTiEmggmjddZ0EvIwo4FSwvNJLNUsIHZEB6xoqScS0l03S5/jQKH0cxso8CXii/tzISKT1OArMZJFVz3uF+J/XTSE89zIukxSYpNNDYSowxLioAve5YhTE2BBCFTdZMR0SRSiYwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG60qVgQ==</latexit> a j t <latexit sha1_base64="jO5t1psVuQIqeeiKoYncOy2FoWI=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxwfp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAGnZUR</latexit> <latexit sha1_base64="jO5t1psVuQIqeeiKoYncOy2FoWI=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxwfp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAGnZUR</latexit> <latexit 
sha1_base64="jO5t1psVuQIqeeiKoYncOy2FoWI=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxwfp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAGnZUR</latexit> <latexit sha1_base64="jO5t1psVuQIqeeiKoYncOy2FoWI=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxwfp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAGnZUR</latexit> a j t 1 <latexit sha1_base64="L8FkkKuMWUnlFYgWnfOJ1kG3xcE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YySTM3Ag1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEySCa3CcL6u0sLi0vFJeraytb2xu2ds7LR2nirImjUWsOgHRTHDJmsBBsE6iGIkCwdrB6LLw2/dMaR7LGxgnzIvIQPKQUwJG8u29XkRgGIQZyW+zu9zP4NjNfbvq1JwJ8F/izkgVzdDw7c9eP6ZpxCRQQbTuuk4CXkYUcCpYXumlmiWEjsiAdQ2VJGLayybpc3xolD4OY2WeBDxRf25kJNJ6HAVmssiq571C/M/rphCeexmXSQpM0umhMBUYYlxUgftcMQpibAihipusmA6JIhRMYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD7laVgw==</latexit> <latexit 
sha1_base64="L8FkkKuMWUnlFYgWnfOJ1kG3xcE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YySTM3Ag1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEySCa3CcL6u0sLi0vFJeraytb2xu2ds7LR2nirImjUWsOgHRTHDJmsBBsE6iGIkCwdrB6LLw2/dMaR7LGxgnzIvIQPKQUwJG8u29XkRgGIQZyW+zu9zP4NjNfbvq1JwJ8F/izkgVzdDw7c9eP6ZpxCRQQbTuuk4CXkYUcCpYXumlmiWEjsiAdQ2VJGLayybpc3xolD4OY2WeBDxRf25kJNJ6HAVmssiq571C/M/rphCeexmXSQpM0umhMBUYYlxUgftcMQpibAihipusmA6JIhRMYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD7laVgw==</latexit> <latexit sha1_base64="L8FkkKuMWUnlFYgWnfOJ1kG3xcE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YySTM3Ag1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEySCa3CcL6u0sLi0vFJeraytb2xu2ds7LR2nirImjUWsOgHRTHDJmsBBsE6iGIkCwdrB6LLw2/dMaR7LGxgnzIvIQPKQUwJG8u29XkRgGIQZyW+zu9zP4NjNfbvq1JwJ8F/izkgVzdDw7c9eP6ZpxCRQQbTuuk4CXkYUcCpYXumlmiWEjsiAdQ2VJGLayybpc3xolD4OY2WeBDxRf25kJNJ6HAVmssiq571C/M/rphCeexmXSQpM0umhMBUYYlxUgftcMQpibAihipusmA6JIhRMYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD7laVgw==</latexit> <latexit sha1_base64="L8FkkKuMWUnlFYgWnfOJ1kG3xcE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YySTM3Ag1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEySCa3CcL6u0sLi0vFJeraytb2xu2ds7LR2nirImjUWsOgHRTHDJmsBBsE6iGIkCwdrB6LLw2/dMaR7LGxgnzIvIQPKQUwJG8u29XkRgGIQZyW+zu9zP4NjNfbvq1JwJ8F/izkgVzdDw7c9eP6ZpxCRQQbTuuk4CXkYUcCpYXumlmiWEjsiAdQ2VJGLayybpc3xolD4OY2WeBDxRf25kJNJ6HAVmssiq571C/M/rphCeexmXSQpM0umhMBUYYlxUgftcMQpibAihipusmA6JIhRMYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD7laVgw==</latexit> s k t 1 <latexit 
sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit 
sha1_base64="hP+6LrUf2d3tZaldqaQQvEKMXyw=">AAAB2XicbZDNSgMxFIXv1L86Vq1rN8EiuCozbnQpuHFZwbZCO5RM5k4bmskMyR2hDH0BF25EfC93vo3pz0JbDwQ+zknIvSculLQUBN9ebWd3b/+gfugfNfzjk9Nmo2fz0gjsilzl5jnmFpXU2CVJCp8LgzyLFfbj6f0i77+gsTLXTzQrMMr4WMtUCk7O6oyaraAdLMW2IVxDC9YaNb+GSS7KDDUJxa0dhEFBUcUNSaFw7g9LiwUXUz7GgUPNM7RRtRxzzi6dk7A0N+5oYkv394uKZ9bOstjdzDhN7Ga2MP/LBiWlt1EldVESarH6KC0Vo5wtdmaJNChIzRxwYaSblYkJN1yQa8Z3HYSbG29D77odBu3wMYA6nMMFXEEIN3AHD9CBLghI4BXevYn35n2suqp569LO4I+8zx84xIo4</latexit> <latexit sha1_base64="GSu6MIeKtS4peM5nlBORgHTfWJc=">AAAB8nicbVBNS8NAFHypX7VWjeLNS7AIXiyJFz0KXjxWsB/QxrDZbtqlm03YfRFqCP4VLx4U8cd489+4aXvQ1oGFYeY93uyEqeAaXffbqqytb2xuVbdrO/XdvX37oN7RSaYoa9NEJKoXEs0El6yNHAXrpYqROBSsG05uSr/7yJTmibzHacr8mIwkjzglaKTAPhrEBMdhlOviIZ8UQY7nXhHYDbfpzuCsEm9BGrBAK7C/BsOEZjGTSAXRuu+5Kfo5UcipYEVtkGmWEjohI9Y3VJKYaT+fpS+cU6MMnShR5kl0ZurvjZzEWk/j0EyWWfWyV4r/ef0Moys/5zLNkEk6PxRlwsHEKatwhlwximJqCKGKm6wOHRNFKJrCaqYEb/nLq6Rz0fTcpnfnQhWO4QTOwINLuIZbaEEbKDzBC7zBu/VsvVof87oq1qK3Q/gD6/MHnHGULg==</latexit> <latexit sha1_base64="GSu6MIeKtS4peM5nlBORgHTfWJc=">AAAB8nicbVBNS8NAFHypX7VWjeLNS7AIXiyJFz0KXjxWsB/QxrDZbtqlm03YfRFqCP4VLx4U8cd489+4aXvQ1oGFYeY93uyEqeAaXffbqqytb2xuVbdrO/XdvX37oN7RSaYoa9NEJKoXEs0El6yNHAXrpYqROBSsG05uSr/7yJTmibzHacr8mIwkjzglaKTAPhrEBMdhlOviIZ8UQY7nXhHYDbfpzuCsEm9BGrBAK7C/BsOEZjGTSAXRuu+5Kfo5UcipYEVtkGmWEjohI9Y3VJKYaT+fpS+cU6MMnShR5kl0ZurvjZzEWk/j0EyWWfWyV4r/ef0Moys/5zLNkEk6PxRlwsHEKatwhlwximJqCKGKm6wOHRNFKJrCaqYEb/nLq6Rz0fTcpnfnQhWO4QTOwINLuIZbaEEbKDzBC7zBu/VsvVof87oq1qK3Q/gD6/MHnHGULg==</latexit> <latexit 
sha1_base64="ZyA+R5SRheiMjO69BCZBasMT3xY=">AAAB/XicbVC7TsMwFL3hWcorPDYWiwqJhSphgbGChbFI9CG1IXJcp7XqOJHtIJUo4ldYGECIlf9g429w2gzQciRLR+fcq3t8goQzpR3n21paXlldW69sVDe3tnd27b39topTSWiLxDyW3QArypmgLc00p91EUhwFnHaC8XXhdx6oVCwWd3qSUC/CQ8FCRrA2km8f9iOsR0GYqfw+G+d+ps/c3LdrTt2ZAi0StyQ1KNH07a/+ICZpRIUmHCvVc51EexmWmhFO82o/VTTBZIyHtGeowBFVXjZNn6MTowxQGEvzhEZT9fdGhiOlJlFgJousat4rxP+8XqrDSy9jIkk1FWR2KEw50jEqqkADJinRfGIIJpKZrIiMsMREm8KqpgR3/suLpH1ed526e+vUGldlHRU4gmM4BRcuoAE30IQWEHiEZ3iFN+vJerHerY/Z6JJV7hzAH1ifPwq+lZI=</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit 
sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit 
sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> s k t <latexit sha1_base64="EhNENI53Pils2FtK/CVG9kOmUzg=">AAAB+3icbVDLSsNAFJ3UV62vWJduBovgqiQi6LLoxmUF+4A2hsl00g6dTMLMjVhCfsWNC0Xc+iPu/BsnbRbaemDgcM693DMnSATX4DjfVmVtfWNzq7pd29nd2z+wD+tdHaeKsg6NRaz6AdFMcMk6wEGwfqIYiQLBesH0pvB7j0xpHst7mCXMi8hY8pBTAkby7fowIjAJwkznD9k09zPIfbvhNJ058CpxS9JAJdq+/TUcxTSNmAQqiNYD10nAy4gCTgXLa8NUs4TQKRmzgaGSREx72Tx7jk+NMsJhrMyTgOfq742MRFrPosBMFkn1sleI/3mDFMIrL+MySYFJujgUpgJDjIsi8IgrRkHMDCFUcZMV0wlRhIKpq2ZKcJe/vEq6503Xabp3F43WdVlHFR2jE3SGXHSJWugWtVEHUfSEntErerNy68V6tz4WoxWr3DlCf2B9/gAkEJUk</latexit> <latexit sha1_base64="EhNENI53Pils2FtK/CVG9kOmUzg=">AAAB+3icbVDLSsNAFJ3UV62vWJduBovgqiQi6LLoxmUF+4A2hsl00g6dTMLMjVhCfsWNC0Xc+iPu/BsnbRbaemDgcM693DMnSATX4DjfVmVtfWNzq7pd29nd2z+wD+tdHaeKsg6NRaz6AdFMcMk6wEGwfqIYiQLBesH0pvB7j0xpHst7mCXMi8hY8pBTAkby7fowIjAJwkznD9k09zPIfbvhNJ058CpxS9JAJdq+/TUcxTSNmAQqiNYD10nAy4gCTgXLa8NUs4TQKRmzgaGSREx72Tx7jk+NMsJhrMyTgOfq742MRFrPosBMFkn1sleI/3mDFMIrL+MySYFJujgUpgJDjIsi8IgrRkHMDCFUcZMV0wlRhIKpq2ZKcJe/vEq6503Xabp3F43WdVlHFR2jE3SGXHSJWugWtVEHUfSEntErerNy68V6tz4WoxWr3DlCf2B9/gAkEJUk</latexit> <latexit 
[Figure: graphical model diagram showing, for agents j and k at time steps t−1, t, and t+1, the latent states s, embeddings e, observations x, and actions a (rendered-equation data not recoverable from the extraction).]
sha1_base64="PO+OmGFqU05kcAVK/spWGbKwYyw=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxkwczN2INwV9x40IRt/6HO//GSZuFth4YOJxzL/fM8WLBFVjWt1FaWFxaXimvVtbWNza3zO2dlooSSVmTRiKSHY8oJnjImsBBsE4sGQk8wdre6DL32/dMKh6FNzCOmROQQch9TgloyTX3egGBoeenD9ltepe5KRzbmWtWrZo1AZ4ndkGqqEDDNb96/YgmAQuBCqJU17ZicFIigVPBskovUSwmdEQGrKtpSAKmnHSSPsOHWuljP5L6hYAn6u+NlARKjQNPT+ZZ1ayXi/953QT8cyflYZwAC+n0kJ8IDBHOq8B9LhkFMdaEUMl1VkyHRBIKurCKLsGe/fI8aZ3UbKtmX59W6xdFHWW0jw7QEbLRGaqjK9RATUTRI3pGr+jNeDJejHfjYzpaMoqdXfQHxucPDzKVmA==</latexit> <latexit sha1_base64="PO+OmGFqU05kcAVK/spWGbKwYyw=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxkwczN2INwV9x40IRt/6HO//GSZuFth4YOJxzL/fM8WLBFVjWt1FaWFxaXimvVtbWNza3zO2dlooSSVmTRiKSHY8oJnjImsBBsE4sGQk8wdre6DL32/dMKh6FNzCOmROQQch9TgloyTX3egGBoeenD9ltepe5KRzbmWtWrZo1AZ4ndkGqqEDDNb96/YgmAQuBCqJU17ZicFIigVPBskovUSwmdEQGrKtpSAKmnHSSPsOHWuljP5L6hYAn6u+NlARKjQNPT+ZZ1ayXi/953QT8cyflYZwAC+n0kJ8IDBHOq8B9LhkFMdaEUMl1VkyHRBIKurCKLsGe/fI8aZ3UbKtmX59W6xdFHWW0jw7QEbLRGaqjK9RATUTRI3pGr+jNeDJejHfjYzpaMoqdXfQHxucPDzKVmA==</latexit> <latexit sha1_base64="PO+OmGFqU05kcAVK/spWGbKwYyw=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxkwczN2INwV9x40IRt/6HO//GSZuFth4YOJxzL/fM8WLBFVjWt1FaWFxaXimvVtbWNza3zO2dlooSSVmTRiKSHY8oJnjImsBBsE4sGQk8wdre6DL32/dMKh6FNzCOmROQQch9TgloyTX3egGBoeenD9ltepe5KRzbmWtWrZo1AZ4ndkGqqEDDNb96/YgmAQuBCqJU17ZicFIigVPBskovUSwmdEQGrKtpSAKmnHSSPsOHWuljP5L6hYAn6u+NlARKjQNPT+ZZ1ayXi/953QT8cyflYZwAC+n0kJ8IDBHOq8B9LhkFMdaEUMl1VkyHRBIKurCKLsGe/fI8aZ3UbKtmX59W6xdFHWW0jw7QEbLRGaqjK9RATUTRI3pGr+jNeDJejHfjYzpaMoqdXfQHxucPDzKVmA==</latexit> a j t+1 <latexit 
sha1_base64="ztrq22acIr2WyC6zYBbz9i5ucZ4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxk0mYuRFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0gE1+A4X1ZpYXFpeaW8Wllb39jcsrd3WjpOFWVNGotYdQKimeCSNYGDYJ1EMRIFgrWD0WXht++Z0jyWNzBOmBeRgeQhpwSM5Nt7vYjAMAgzkt9md7mfwbGb+3bVqTkT4L/EnZEqmqHh25+9fkzTiEmggmjddZ0EvIwo4FSwvNJLNUsIHZEB6xoqScS0l03S5/jQKH0cxso8CXii/tzISKT1OArMZJFVz3uF+J/XTSE89zIukxSYpNNDYSowxLioAve5YhTE2BBCFTdZMR0SRSiYwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG60qVgQ==</latexit> <latexit sha1_base64="ztrq22acIr2WyC6zYBbz9i5ucZ4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxk0mYuRFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0gE1+A4X1ZpYXFpeaW8Wllb39jcsrd3WjpOFWVNGotYdQKimeCSNYGDYJ1EMRIFgrWD0WXht++Z0jyWNzBOmBeRgeQhpwSM5Nt7vYjAMAgzkt9md7mfwbGb+3bVqTkT4L/EnZEqmqHh25+9fkzTiEmggmjddZ0EvIwo4FSwvNJLNUsIHZEB6xoqScS0l03S5/jQKH0cxso8CXii/tzISKT1OArMZJFVz3uF+J/XTSE89zIukxSYpNNDYSowxLioAve5YhTE2BBCFTdZMR0SRSiYwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG60qVgQ==</latexit> <latexit sha1_base64="ztrq22acIr2WyC6zYBbz9i5ucZ4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxk0mYuRFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0gE1+A4X1ZpYXFpeaW8Wllb39jcsrd3WjpOFWVNGotYdQKimeCSNYGDYJ1EMRIFgrWD0WXht++Z0jyWNzBOmBeRgeQhpwSM5Nt7vYjAMAgzkt9md7mfwbGb+3bVqTkT4L/EnZEqmqHh25+9fkzTiEmggmjddZ0EvIwo4FSwvNJLNUsIHZEB6xoqScS0l03S5/jQKH0cxso8CXii/tzISKT1OArMZJFVz3uF+J/XTSE89zIukxSYpNNDYSowxLioAve5YhTE2BBCFTdZMR0SRSiYwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG60qVgQ==</latexit> <latexit 
sha1_base64="ztrq22acIr2WyC6zYBbz9i5ucZ4=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCIJREBF0W3bisYB/QxjCZTtqxk0mYuRFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0gE1+A4X1ZpYXFpeaW8Wllb39jcsrd3WjpOFWVNGotYdQKimeCSNYGDYJ1EMRIFgrWD0WXht++Z0jyWNzBOmBeRgeQhpwSM5Nt7vYjAMAgzkt9md7mfwbGb+3bVqTkT4L/EnZEqmqHh25+9fkzTiEmggmjddZ0EvIwo4FSwvNJLNUsIHZEB6xoqScS0l03S5/jQKH0cxso8CXii/tzISKT1OArMZJFVz3uF+J/XTSE89zIukxSYpNNDYSowxLioAve5YhTE2BBCFTdZMR0SRSiYwiqmBHf+y39J66TmOjX3+rRav5jVUUb76AAdIRedoTq6Qg3URBQ9oCf0gl6tR+vZerPep6Mla7azi37B+vgG60qVgQ==</latexit> a j t <latexit sha1_base64="jO5t1psVuQIqeeiKoYncOy2FoWI=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxwfp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAGnZUR</latexit> <latexit sha1_base64="jO5t1psVuQIqeeiKoYncOy2FoWI=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxwfp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAGnZUR</latexit> <latexit 
sha1_base64="jO5t1psVuQIqeeiKoYncOy2FoWI=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxwfp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAGnZUR</latexit> <latexit sha1_base64="jO5t1psVuQIqeeiKoYncOy2FoWI=">AAAB+3icbVDLSsNAFL3xWesr1qWbwSK4KokIuiy6cVnBPqCNYTKdtGMnkzAzEUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YECWdKO863tbK6tr6xWdmqbu/s7u3bB7WOilNJaJvEPJa9ACvKmaBtzTSnvURSHAWcdoPJdeF3H6lULBZ3eppQL8IjwUJGsDaSb9cGEdbjIMxwfp895H6mc9+uOw1nBrRM3JLUoUTLt78Gw5ikERWacKxU33US7WVYakY4zauDVNEEkwke0b6hAkdUedkse45OjDJEYSzNExrN1N8bGY6UmkaBmSySqkWvEP/z+qkOL72MiSTVVJD5oTDlSMeoKAINmaRE86khmEhmsiIyxhITbeqqmhLcxS8vk85Zw3Ua7u15vXlV1lGBIziGU3DhAppwAy1oA4EneIZXeLNy68V6tz7moytWuXMIf2B9/gAGnZUR</latexit> a j t 1 <latexit sha1_base64="L8FkkKuMWUnlFYgWnfOJ1kG3xcE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YySTM3Ag1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEySCa3CcL6u0sLi0vFJeraytb2xu2ds7LR2nirImjUWsOgHRTHDJmsBBsE6iGIkCwdrB6LLw2/dMaR7LGxgnzIvIQPKQUwJG8u29XkRgGIQZyW+zu9zP4NjNfbvq1JwJ8F/izkgVzdDw7c9eP6ZpxCRQQbTuuk4CXkYUcCpYXumlmiWEjsiAdQ2VJGLayybpc3xolD4OY2WeBDxRf25kJNJ6HAVmssiq571C/M/rphCeexmXSQpM0umhMBUYYlxUgftcMQpibAihipusmA6JIhRMYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD7laVgw==</latexit> <latexit 
sha1_base64="L8FkkKuMWUnlFYgWnfOJ1kG3xcE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YySTM3Ag1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEySCa3CcL6u0sLi0vFJeraytb2xu2ds7LR2nirImjUWsOgHRTHDJmsBBsE6iGIkCwdrB6LLw2/dMaR7LGxgnzIvIQPKQUwJG8u29XkRgGIQZyW+zu9zP4NjNfbvq1JwJ8F/izkgVzdDw7c9eP6ZpxCRQQbTuuk4CXkYUcCpYXumlmiWEjsiAdQ2VJGLayybpc3xolD4OY2WeBDxRf25kJNJ6HAVmssiq571C/M/rphCeexmXSQpM0umhMBUYYlxUgftcMQpibAihipusmA6JIhRMYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD7laVgw==</latexit> <latexit sha1_base64="L8FkkKuMWUnlFYgWnfOJ1kG3xcE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YySTM3Ag1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEySCa3CcL6u0sLi0vFJeraytb2xu2ds7LR2nirImjUWsOgHRTHDJmsBBsE6iGIkCwdrB6LLw2/dMaR7LGxgnzIvIQPKQUwJG8u29XkRgGIQZyW+zu9zP4NjNfbvq1JwJ8F/izkgVzdDw7c9eP6ZpxCRQQbTuuk4CXkYUcCpYXumlmiWEjsiAdQ2VJGLayybpc3xolD4OY2WeBDxRf25kJNJ6HAVmssiq571C/M/rphCeexmXSQpM0umhMBUYYlxUgftcMQpibAihipusmA6JIhRMYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD7laVgw==</latexit> <latexit sha1_base64="L8FkkKuMWUnlFYgWnfOJ1kG3xcE=">AAAB/XicbVDLSsNAFJ3UV62v+Ni5GSyCG0sigi6LblxWsA9oY5hMJ+3YySTM3Ag1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEySCa3CcL6u0sLi0vFJeraytb2xu2ds7LR2nirImjUWsOgHRTHDJmsBBsE6iGIkCwdrB6LLw2/dMaR7LGxgnzIvIQPKQUwJG8u29XkRgGIQZyW+zu9zP4NjNfbvq1JwJ8F/izkgVzdDw7c9eP6ZpxCRQQbTuuk4CXkYUcCpYXumlmiWEjsiAdQ2VJGLayybpc3xolD4OY2WeBDxRf25kJNJ6HAVmssiq571C/M/rphCeexmXSQpM0umhMBUYYlxUgftcMQpibAihipusmA6JIhRMYRVTgjv/5b+kdVJznZp7fVqtX8zqKKN9dICOkIvOUB1doQZqIooe0BN6Qa/Wo/VsvVnv09GSNdvZRb9gfXwD7laVgw==</latexit> h j t 1 <latexit 
sha1_base64="WlY6J171G1vwVWfN/ObEfKYA7SE=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqxk0mYmQg1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEyScKe04X1ZpYXFpeaW8Wllb39jcsrd3WipOJaFNEvNYdgKsKGeCNjXTnHYSSXEUcNoORpeF376nUrFY3OhxQr0IDwQLGcHaSL6914uwHgZhNsxvs7vcz/Sxm/t21ak5E6C/xJ2RKszQ8O3PXj8maUSFJhwr1XWdRHsZlpoRTvNKL1U0wWSEB7RrqMARVV42SZ+jQ6P0URhL84RGE/XnRoYjpcZRYCaLrGreK8T/vG6qw3MvYyJJNRVkeihMOdIxKqpAfSYp0XxsCCaSmayIDLHERJvCKqYEd/7Lf0nrpOY6Nff6tFq/mNVRhn04gCNw4QzqcAUNaAKBB3iCF3i1Hq1n6816n46WrNnOLvyC9fEN+T+Vig==</latexit> <latexit sha1_base64="WlY6J171G1vwVWfN/ObEfKYA7SE=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqxk0mYmQg1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEyScKe04X1ZpYXFpeaW8Wllb39jcsrd3WipOJaFNEvNYdgKsKGeCNjXTnHYSSXEUcNoORpeF376nUrFY3OhxQr0IDwQLGcHaSL6914uwHgZhNsxvs7vcz/Sxm/t21ak5E6C/xJ2RKszQ8O3PXj8maUSFJhwr1XWdRHsZlpoRTvNKL1U0wWSEB7RrqMARVV42SZ+jQ6P0URhL84RGE/XnRoYjpcZRYCaLrGreK8T/vG6qw3MvYyJJNRVkeihMOdIxKqpAfSYp0XxsCCaSmayIDLHERJvCKqYEd/7Lf0nrpOY6Nff6tFq/mNVRhn04gCNw4QzqcAUNaAKBB3iCF3i1Hq1n6816n46WrNnOLvyC9fEN+T+Vig==</latexit> <latexit sha1_base64="WlY6J171G1vwVWfN/ObEfKYA7SE=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqxk0mYmQg1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEyScKe04X1ZpYXFpeaW8Wllb39jcsrd3WipOJaFNEvNYdgKsKGeCNjXTnHYSSXEUcNoORpeF376nUrFY3OhxQr0IDwQLGcHaSL6914uwHgZhNsxvs7vcz/Sxm/t21ak5E6C/xJ2RKszQ8O3PXj8maUSFJhwr1XWdRHsZlpoRTvNKL1U0wWSEB7RrqMARVV42SZ+jQ6P0URhL84RGE/XnRoYjpcZRYCaLrGreK8T/vG6qw3MvYyJJNRVkeihMOdIxKqpAfSYp0XxsCCaSmayIDLHERJvCKqYEd/7Lf0nrpOY6Nff6tFq/mNVRhn04gCNw4QzqcAUNaAKBB3iCF3i1Hq1n6816n46WrNnOLvyC9fEN+T+Vig==</latexit> <latexit 
sha1_base64="WlY6J171G1vwVWfN/ObEfKYA7SE=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqxk0mYmQg1BH/FjQtF3Pof7vwbJ20XWj0wcDjnXu6ZEyScKe04X1ZpYXFpeaW8Wllb39jcsrd3WipOJaFNEvNYdgKsKGeCNjXTnHYSSXEUcNoORpeF376nUrFY3OhxQr0IDwQLGcHaSL6914uwHgZhNsxvs7vcz/Sxm/t21ak5E6C/xJ2RKszQ8O3PXj8maUSFJhwr1XWdRHsZlpoRTvNKL1U0wWSEB7RrqMARVV42SZ+jQ6P0URhL84RGE/XnRoYjpcZRYCaLrGreK8T/vG6qw3MvYyJJNRVkeihMOdIxKqpAfSYp0XxsCCaSmayIDLHERJvCKqYEd/7Lf0nrpOY6Nff6tFq/mNVRhn04gCNw4QzqcAUNaAKBB3iCF3i1Hq1n6816n46WrNnOLvyC9fEN+T+Vig==</latexit> h j t <latexit sha1_base64="L4J8oYl1c8YCDYclszXVBErPNbk=">AAAB+3icbVDLSsNAFJ34rPUV69LNYBFclUQEXRbduKxgH9DGMJlO2rGTSZi5EUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YEieAaHOfbWlldW9/YrGxVt3d29/btg1pHx6mirE1jEateQDQTXLI2cBCslyhGokCwbjC5LvzuI1Oax/IOpgnzIjKSPOSUgJF8uzaICIyDMBvn99lD7meQ+3bdaTgz4GXilqSOSrR8+2swjGkaMQlUEK37rpOAlxEFnAqWVwepZgmhEzJifUMliZj2sln2HJ8YZYjDWJknAc/U3xsZibSeRoGZLJLqRa8Q//P6KYSXXsZlkgKTdH4oTAWGGBdF4CFXjIKYGkKo4iYrpmOiCAVTV9WU4C5+eZl0zhqu03Bvz+vNq7KOCjpCx+gUuegCNdENaqE2ougJPaNX9Gbl1ov1bn3MR1escucQ/YH1+QMReJUY</latexit> <latexit sha1_base64="L4J8oYl1c8YCDYclszXVBErPNbk=">AAAB+3icbVDLSsNAFJ34rPUV69LNYBFclUQEXRbduKxgH9DGMJlO2rGTSZi5EUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YEieAaHOfbWlldW9/YrGxVt3d29/btg1pHx6mirE1jEateQDQTXLI2cBCslyhGokCwbjC5LvzuI1Oax/IOpgnzIjKSPOSUgJF8uzaICIyDMBvn99lD7meQ+3bdaTgz4GXilqSOSrR8+2swjGkaMQlUEK37rpOAlxEFnAqWVwepZgmhEzJifUMliZj2sln2HJ8YZYjDWJknAc/U3xsZibSeRoGZLJLqRa8Q//P6KYSXXsZlkgKTdH4oTAWGGBdF4CFXjIKYGkKo4iYrpmOiCAVTV9WU4C5+eZl0zhqu03Bvz+vNq7KOCjpCx+gUuegCNdENaqE2ougJPaNX9Gbl1ov1bn3MR1escucQ/YH1+QMReJUY</latexit> <latexit 
sha1_base64="L4J8oYl1c8YCDYclszXVBErPNbk=">AAAB+3icbVDLSsNAFJ34rPUV69LNYBFclUQEXRbduKxgH9DGMJlO2rGTSZi5EUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YEieAaHOfbWlldW9/YrGxVt3d29/btg1pHx6mirE1jEateQDQTXLI2cBCslyhGokCwbjC5LvzuI1Oax/IOpgnzIjKSPOSUgJF8uzaICIyDMBvn99lD7meQ+3bdaTgz4GXilqSOSrR8+2swjGkaMQlUEK37rpOAlxEFnAqWVwepZgmhEzJifUMliZj2sln2HJ8YZYjDWJknAc/U3xsZibSeRoGZLJLqRa8Q//P6KYSXXsZlkgKTdH4oTAWGGBdF4CFXjIKYGkKo4iYrpmOiCAVTV9WU4C5+eZl0zhqu03Bvz+vNq7KOCjpCx+gUuegCNdENaqE2ougJPaNX9Gbl1ov1bn3MR1escucQ/YH1+QMReJUY</latexit> <latexit sha1_base64="L4J8oYl1c8YCDYclszXVBErPNbk=">AAAB+3icbVDLSsNAFJ34rPUV69LNYBFclUQEXRbduKxgH9DGMJlO2rGTSZi5EUvIr7hxoYhbf8Sdf+OkzUJbDwwczrmXe+YEieAaHOfbWlldW9/YrGxVt3d29/btg1pHx6mirE1jEateQDQTXLI2cBCslyhGokCwbjC5LvzuI1Oax/IOpgnzIjKSPOSUgJF8uzaICIyDMBvn99lD7meQ+3bdaTgz4GXilqSOSrR8+2swjGkaMQlUEK37rpOAlxEFnAqWVwepZgmhEzJifUMliZj2sln2HJ8YZYjDWJknAc/U3xsZibSeRoGZLJLqRa8Q//P6KYSXXsZlkgKTdH4oTAWGGBdF4CFXjIKYGkKo4iYrpmOiCAVTV9WU4C5+eZl0zhqu03Bvz+vNq7KOCjpCx+gUuegCNdENaqE2ougJPaNX9Gbl1ov1bn3MR1escucQ/YH1+QMReJUY</latexit> h j t+1 <latexit sha1_base64="vxdOasm7Xp8wZE2FxhaLxbau64Y=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEQSiJCLosunFZwT6gjWEynbRjJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0g4U9pxvqzSwuLS8kp5tbK2vrG5ZW/vtFScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjC4Lv31PpWKxuNHjhHoRHggWMoK1kXx7rxdhPQzCbJjfZne5n+ljN/ftqlNzJkB/iTsjVZih4dufvX5M0ogKTThWqus6ifYyLDUjnOaVXqpogskID2jXUIEjqrxskj5Hh0bpozCW5gmNJurPjQxHSo2jwEwWWdW8V4j/ed1Uh+dexkSSairI9FCYcqRjVFSB+kxSovnYEEwkM1kRGWKJiTaFVUwJ7vyX/5LWSc11au71abV+MaujDPtwAEfgwhnU4Qoa0AQCD/AEL/BqPVrP1pv1Ph0tWbOdXfgF6+Mb9jOViA==</latexit> <latexit 
sha1_base64="vxdOasm7Xp8wZE2FxhaLxbau64Y=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEQSiJCLosunFZwT6gjWEynbRjJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0g4U9pxvqzSwuLS8kp5tbK2vrG5ZW/vtFScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjC4Lv31PpWKxuNHjhHoRHggWMoK1kXx7rxdhPQzCbJjfZne5n+ljN/ftqlNzJkB/iTsjVZih4dufvX5M0ogKTThWqus6ifYyLDUjnOaVXqpogskID2jXUIEjqrxskj5Hh0bpozCW5gmNJurPjQxHSo2jwEwWWdW8V4j/ed1Uh+dexkSSairI9FCYcqRjVFSB+kxSovnYEEwkM1kRGWKJiTaFVUwJ7vyX/5LWSc11au71abV+MaujDPtwAEfgwhnU4Qoa0AQCD/AEL/BqPVrP1pv1Ph0tWbOdXfgF6+Mb9jOViA==</latexit> <latexit sha1_base64="vxdOasm7Xp8wZE2FxhaLxbau64Y=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEQSiJCLosunFZwT6gjWEynbRjJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0g4U9pxvqzSwuLS8kp5tbK2vrG5ZW/vtFScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjC4Lv31PpWKxuNHjhHoRHggWMoK1kXx7rxdhPQzCbJjfZne5n+ljN/ftqlNzJkB/iTsjVZih4dufvX5M0ogKTThWqus6ifYyLDUjnOaVXqpogskID2jXUIEjqrxskj5Hh0bpozCW5gmNJurPjQxHSo2jwEwWWdW8V4j/ed1Uh+dexkSSairI9FCYcqRjVFSB+kxSovnYEEwkM1kRGWKJiTaFVUwJ7vyX/5LWSc11au71abV+MaujDPtwAEfgwhnU4Qoa0AQCD/AEL/BqPVrP1pv1Ph0tWbOdXfgF6+Mb9jOViA==</latexit> <latexit sha1_base64="vxdOasm7Xp8wZE2FxhaLxbau64Y=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEQSiJCLosunFZwT6gjWEynbRjJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutHpg4HDOvdwzJ0g4U9pxvqzSwuLS8kp5tbK2vrG5ZW/vtFScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjC4Lv31PpWKxuNHjhHoRHggWMoK1kXx7rxdhPQzCbJjfZne5n+ljN/ftqlNzJkB/iTsjVZih4dufvX5M0ogKTThWqus6ifYyLDUjnOaVXqpogskID2jXUIEjqrxskj5Hh0bpozCW5gmNJurPjQxHSo2jwEwWWdW8V4j/ed1Uh+dexkSSairI9FCYcqRjVFSB+kxSovnYEEwkM1kRGWKJiTaFVUwJ7vyX/5LWSc11au71abV+MaujDPtwAEfgwhnU4Qoa0AQCD/AEL/BqPVrP1pv1Ph0tWbOdXfgF6+Mb9jOViA==</latexit> s k t 1 <latexit 
sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit 
sha1_base64="hP+6LrUf2d3tZaldqaQQvEKMXyw=">AAAB2XicbZDNSgMxFIXv1L86Vq1rN8EiuCozbnQpuHFZwbZCO5RM5k4bmskMyR2hDH0BF25EfC93vo3pz0JbDwQ+zknIvSculLQUBN9ebWd3b/+gfugfNfzjk9Nmo2fz0gjsilzl5jnmFpXU2CVJCp8LgzyLFfbj6f0i77+gsTLXTzQrMMr4WMtUCk7O6oyaraAdLMW2IVxDC9YaNb+GSS7KDDUJxa0dhEFBUcUNSaFw7g9LiwUXUz7GgUPNM7RRtRxzzi6dk7A0N+5oYkv394uKZ9bOstjdzDhN7Ga2MP/LBiWlt1EldVESarH6KC0Vo5wtdmaJNChIzRxwYaSblYkJN1yQa8Z3HYSbG29D77odBu3wMYA6nMMFXEEIN3AHD9CBLghI4BXevYn35n2suqp569LO4I+8zx84xIo4</latexit> <latexit sha1_base64="GSu6MIeKtS4peM5nlBORgHTfWJc=">AAAB8nicbVBNS8NAFHypX7VWjeLNS7AIXiyJFz0KXjxWsB/QxrDZbtqlm03YfRFqCP4VLx4U8cd489+4aXvQ1oGFYeY93uyEqeAaXffbqqytb2xuVbdrO/XdvX37oN7RSaYoa9NEJKoXEs0El6yNHAXrpYqROBSsG05uSr/7yJTmibzHacr8mIwkjzglaKTAPhrEBMdhlOviIZ8UQY7nXhHYDbfpzuCsEm9BGrBAK7C/BsOEZjGTSAXRuu+5Kfo5UcipYEVtkGmWEjohI9Y3VJKYaT+fpS+cU6MMnShR5kl0ZurvjZzEWk/j0EyWWfWyV4r/ef0Moys/5zLNkEk6PxRlwsHEKatwhlwximJqCKGKm6wOHRNFKJrCaqYEb/nLq6Rz0fTcpnfnQhWO4QTOwINLuIZbaEEbKDzBC7zBu/VsvVof87oq1qK3Q/gD6/MHnHGULg==</latexit> <latexit sha1_base64="GSu6MIeKtS4peM5nlBORgHTfWJc=">AAAB8nicbVBNS8NAFHypX7VWjeLNS7AIXiyJFz0KXjxWsB/QxrDZbtqlm03YfRFqCP4VLx4U8cd489+4aXvQ1oGFYeY93uyEqeAaXffbqqytb2xuVbdrO/XdvX37oN7RSaYoa9NEJKoXEs0El6yNHAXrpYqROBSsG05uSr/7yJTmibzHacr8mIwkjzglaKTAPhrEBMdhlOviIZ8UQY7nXhHYDbfpzuCsEm9BGrBAK7C/BsOEZjGTSAXRuu+5Kfo5UcipYEVtkGmWEjohI9Y3VJKYaT+fpS+cU6MMnShR5kl0ZurvjZzEWk/j0EyWWfWyV4r/ef0Moys/5zLNkEk6PxRlwsHEKatwhlwximJqCKGKm6wOHRNFKJrCaqYEb/nLq6Rz0fTcpnfnQhWO4QTOwINLuIZbaEEbKDzBC7zBu/VsvVof87oq1qK3Q/gD6/MHnHGULg==</latexit> <latexit 
sha1_base64="ZyA+R5SRheiMjO69BCZBasMT3xY=">AAAB/XicbVC7TsMwFL3hWcorPDYWiwqJhSphgbGChbFI9CG1IXJcp7XqOJHtIJUo4ldYGECIlf9g429w2gzQciRLR+fcq3t8goQzpR3n21paXlldW69sVDe3tnd27b39topTSWiLxDyW3QArypmgLc00p91EUhwFnHaC8XXhdx6oVCwWd3qSUC/CQ8FCRrA2km8f9iOsR0GYqfw+G+d+ps/c3LdrTt2ZAi0StyQ1KNH07a/+ICZpRIUmHCvVc51EexmWmhFO82o/VTTBZIyHtGeowBFVXjZNn6MTowxQGEvzhEZT9fdGhiOlJlFgJousat4rxP+8XqrDSy9jIkk1FWR2KEw50jEqqkADJinRfGIIJpKZrIiMsMREm8KqpgR3/suLpH1ed526e+vUGldlHRU4gmM4BRcuoAE30IQWEHiEZ3iFN+vJerHerY/Z6JJV7hzAH1ifPwq+lZI=</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit 
sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> <latexit 
sha1_base64="NpoAauitXXUYMAXS7gRlqbYIAEc=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCTOX32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz8L/pWW</latexit> s k t <latexit sha1_base64="EhNENI53Pils2FtK/CVG9kOmUzg=">AAAB+3icbVDLSsNAFJ3UV62vWJduBovgqiQi6LLoxmUF+4A2hsl00g6dTMLMjVhCfsWNC0Xc+iPu/BsnbRbaemDgcM693DMnSATX4DjfVmVtfWNzq7pd29nd2z+wD+tdHaeKsg6NRaz6AdFMcMk6wEGwfqIYiQLBesH0pvB7j0xpHst7mCXMi8hY8pBTAkby7fowIjAJwkznD9k09zPIfbvhNJ058CpxS9JAJdq+/TUcxTSNmAQqiNYD10nAy4gCTgXLa8NUs4TQKRmzgaGSREx72Tx7jk+NMsJhrMyTgOfq742MRFrPosBMFkn1sleI/3mDFMIrL+MySYFJujgUpgJDjIsi8IgrRkHMDCFUcZMV0wlRhIKpq2ZKcJe/vEq6503Xabp3F43WdVlHFR2jE3SGXHSJWugWtVEHUfSEntErerNy68V6tz4WoxWr3DlCf2B9/gAkEJUk</latexit> <latexit sha1_base64="EhNENI53Pils2FtK/CVG9kOmUzg=">AAAB+3icbVDLSsNAFJ3UV62vWJduBovgqiQi6LLoxmUF+4A2hsl00g6dTMLMjVhCfsWNC0Xc+iPu/BsnbRbaemDgcM693DMnSATX4DjfVmVtfWNzq7pd29nd2z+wD+tdHaeKsg6NRaz6AdFMcMk6wEGwfqIYiQLBesH0pvB7j0xpHst7mCXMi8hY8pBTAkby7fowIjAJwkznD9k09zPIfbvhNJ058CpxS9JAJdq+/TUcxTSNmAQqiNYD10nAy4gCTgXLa8NUs4TQKRmzgaGSREx72Tx7jk+NMsJhrMyTgOfq742MRFrPosBMFkn1sleI/3mDFMIrL+MySYFJujgUpgJDjIsi8IgrRkHMDCFUcZMV0wlRhIKpq2ZKcJe/vEq6503Xabp3F43WdVlHFR2jE3SGXHSJWugWtVEHUfSEntErerNy68V6tz4WoxWr3DlCf2B9/gAkEJUk</latexit> <latexit 
sha1_base64="EhNENI53Pils2FtK/CVG9kOmUzg=">AAAB+3icbVDLSsNAFJ3UV62vWJduBovgqiQi6LLoxmUF+4A2hsl00g6dTMLMjVhCfsWNC0Xc+iPu/BsnbRbaemDgcM693DMnSATX4DjfVmVtfWNzq7pd29nd2z+wD+tdHaeKsg6NRaz6AdFMcMk6wEGwfqIYiQLBesH0pvB7j0xpHst7mCXMi8hY8pBTAkby7fowIjAJwkznD9k09zPIfbvhNJ058CpxS9JAJdq+/TUcxTSNmAQqiNYD10nAy4gCTgXLa8NUs4TQKRmzgaGSREx72Tx7jk+NMsJhrMyTgOfq742MRFrPosBMFkn1sleI/3mDFMIrL+MySYFJujgUpgJDjIsi8IgrRkHMDCFUcZMV0wlRhIKpq2ZKcJe/vEq6503Xabp3F43WdVlHFR2jE3SGXHSJWugWtVEHUfSEntErerNy68V6tz4WoxWr3DlCf2B9/gAkEJUk</latexit> <latexit sha1_base64="EhNENI53Pils2FtK/CVG9kOmUzg=">AAAB+3icbVDLSsNAFJ3UV62vWJduBovgqiQi6LLoxmUF+4A2hsl00g6dTMLMjVhCfsWNC0Xc+iPu/BsnbRbaemDgcM693DMnSATX4DjfVmVtfWNzq7pd29nd2z+wD+tdHaeKsg6NRaz6AdFMcMk6wEGwfqIYiQLBesH0pvB7j0xpHst7mCXMi8hY8pBTAkby7fowIjAJwkznD9k09zPIfbvhNJ058CpxS9JAJdq+/TUcxTSNmAQqiNYD10nAy4gCTgXLa8NUs4TQKRmzgaGSREx72Tx7jk+NMsJhrMyTgOfq742MRFrPosBMFkn1sleI/3mDFMIrL+MySYFJujgUpgJDjIsi8IgrRkHMDCFUcZMV0wlRhIKpq2ZKcJe/vEq6503Xabp3F43WdVlHFR2jE3SGXHSJWugWtVEHUfSEntErerNy68V6tz4WoxWr3DlCf2B9/gAkEJUk</latexit> s k t+1 <latexit sha1_base64="SWVuw5T8S0obp5CpAxqT+DZ6Yv0=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEQSiJCLosunFZwT6gjWEynbRDJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutPXAwOGce7lnTpBwprTjfFulpeWV1bXyemVjc2t7x97da6k4lYQ2Scxj2QmwopwJ2tRMc9pJJMVRwGk7GF0XfvuBSsVicafHCfUiPBAsZARrI/n2QS/CehiEmcrvs1HuZ/rUzX276tScCdAicWekCjM0fPur149JGlGhCcdKdV0n0V6GpWaE07zSSxVNMBnhAe0aKnBElZdN0ufo2Ch9FMbSPKHRRP29keFIqXEUmMkiq5r3CvE/r5vq8NLLmEhSTQWZHgpTjnSMiipQn0lKNB8bgolkJisiQywx0aawiinBnf/yImmd1Vyn5t6eV+tXszrKcAhHcAIuXEAdbqABTSDwCM/wCm/Wk/VivVsf09GSNdvZhz+wPn8ACPKVlA==</latexit> <latexit 
sha1_base64="SWVuw5T8S0obp5CpAxqT+DZ6Yv0=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEQSiJCLosunFZwT6gjWEynbRDJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutPXAwOGce7lnTpBwprTjfFulpeWV1bXyemVjc2t7x97da6k4lYQ2Scxj2QmwopwJ2tRMc9pJJMVRwGk7GF0XfvuBSsVicafHCfUiPBAsZARrI/n2QS/CehiEmcrvs1HuZ/rUzX276tScCdAicWekCjM0fPur149JGlGhCcdKdV0n0V6GpWaE07zSSxVNMBnhAe0aKnBElZdN0ufo2Ch9FMbSPKHRRP29keFIqXEUmMkiq5r3CvE/r5vq8NLLmEhSTQWZHgpTjnSMiipQn0lKNB8bgolkJisiQywx0aawiinBnf/yImmd1Vyn5t6eV+tXszrKcAhHcAIuXEAdbqABTSDwCM/wCm/Wk/VivVsf09GSNdvZhz+wPn8ACPKVlA==</latexit> <latexit sha1_base64="SWVuw5T8S0obp5CpAxqT+DZ6Yv0=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEQSiJCLosunFZwT6gjWEynbRDJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutPXAwOGce7lnTpBwprTjfFulpeWV1bXyemVjc2t7x97da6k4lYQ2Scxj2QmwopwJ2tRMc9pJJMVRwGk7GF0XfvuBSsVicafHCfUiPBAsZARrI/n2QS/CehiEmcrvs1HuZ/rUzX276tScCdAicWekCjM0fPur149JGlGhCcdKdV0n0V6GpWaE07zSSxVNMBnhAe0aKnBElZdN0ufo2Ch9FMbSPKHRRP29keFIqXEUmMkiq5r3CvE/r5vq8NLLmEhSTQWZHgpTjnSMiipQn0lKNB8bgolkJisiQywx0aawiinBnf/yImmd1Vyn5t6eV+tXszrKcAhHcAIuXEAdbqABTSDwCM/wCm/Wk/VivVsf09GSNdvZhz+wPn8ACPKVlA==</latexit> <latexit sha1_base64="SWVuw5T8S0obp5CpAxqT+DZ6Yv0=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEQSiJCLosunFZwT6gjWEynbRDJ5MwMxFqCP6KGxeKuPU/3Pk3TtoutPXAwOGce7lnTpBwprTjfFulpeWV1bXyemVjc2t7x97da6k4lYQ2Scxj2QmwopwJ2tRMc9pJJMVRwGk7GF0XfvuBSsVicafHCfUiPBAsZARrI/n2QS/CehiEmcrvs1HuZ/rUzX276tScCdAicWekCjM0fPur149JGlGhCcdKdV0n0V6GpWaE07zSSxVNMBnhAe0aKnBElZdN0ufo2Ch9FMbSPKHRRP29keFIqXEUmMkiq5r3CvE/r5vq8NLLmEhSTQWZHgpTjnSMiipQn0lKNB8bgolkJisiQywx0aawiinBnf/yImmd1Vyn5t6eV+tXszrKcAhHcAIuXEAdbqABTSDwCM/wCm/Wk/VivVsf09GSNdvZhz+wPn8ACPKVlA==</latexit> e k t 1 <latexit 
sha1_base64="wseL5rvQ7Ou5MqrMBqbvCzjAmvw=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCjOb32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz/2HZWI</latexit> <latexit sha1_base64="wseL5rvQ7Ou5MqrMBqbvCzjAmvw=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCjOb32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz/2HZWI</latexit> <latexit sha1_base64="wseL5rvQ7Ou5MqrMBqbvCzjAmvw=">AAAB/XicbVDLSsNAFL2pr1pf8bFzM1gEN5ZEBF0W3bisYB/QxjCZTtqhk0mYmQg1BH/FjQtF3Pof7vwbJ20X2npg4HDOvdwzJ0g4U9pxvq3S0vLK6lp5vbKxubW9Y+/utVScSkKbJOax7ARYUc4EbWqmOe0kkuIo4LQdjK4Lv/1ApWKxuNPjhHoRHggWMoK1kXz7oBdhPQzCjOb32Sj3M33q5r5ddWrOBGiRuDNShRkavv3V68ckjajQhGOluq6TaC/DUjPCaV7ppYommIzwgHYNFTiiyssm6XN0bJQ+CmNpntBoov7eyHCk1DgKzGSRVc17hfif1011eOllTCSppoJMD4UpRzpGRRWozyQlmo8NwUQykxWRIZaYaFNYxZTgzn95kbTOaq5Tc2/Pq/WrWR1lOIQjOAEXLqAON9CAJhB4hGd4hTfryXqx3q2P6WjJmu3swx9Ynz/2HZWI</latexit> <latexit 
[Figure 5.2 here: time slices (t−1, t, t+1) of the k-th agent's observations x_t^k, agent vectors e_t^k, attention vectors a_t^k, and hidden states h_t^k, connected by self-mapping and interaction-mapping arrows to the K−1 agents with j ≠ k; panel labels: Agent Vectors, Interaction Vectors, Observations, Attention Vectors, Hidden States.]

Figure 5.2: Generation and inference network of GAMAN for the k-th agent in a K-agent system. Note that we only consider the forward setting for the sake of clarity.

…series. To achieve this goal, GAMAN adopts a latent-state assumption for the system: at every time step t, there is a latent state z_t representing the full information of the system, including each agent's state and the agents' interactions. As the system evolves, the new latent state z_{t+1} is generated from the current latent state z_t, and the behavior observation x_{t+1} is generated from the new latent state z_{t+1}. GAMAN represents this process by a generation network with an attention mechanism parametrized by θ, from which we can sample the system's behavior observations x_{≤T} conditioned on its latent states z_{≤T}:

    x_{≤T} ∼ p_θ(x_{≤T} | z_{≤T})    (5.2)

In order to learn the network, we maximize the marginal log-likelihood of all behavior observations in the dataset D, which is the sum of the marginal log-likelihoods of the individual observations:

    θ* = arg max_θ Σ_D L(θ) = arg max_θ Σ_D Σ_{t=1}^{T} log p_θ(x_t)    (5.3)

Given the generation network, the marginal log-likelihood of one behavior observation p_θ(x) could be calculated by integrating out all possible latent states z. However, the stochastic latent states z cannot be integrated out analytically.
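As a concrete illustration of this latent-state generative process (alternating z_t → z_{t+1} transitions with z_t → x_t emissions), the sketch below rolls out a toy system in which both maps are linear-Gaussian. The linear maps, dimensions, and noise scale are illustrative assumptions, not the networks used by GAMAN.

```python
import numpy as np

rng = np.random.default_rng(0)
DZ, DX, T = 4, 2, 10  # latent dim, observation dim, horizon (illustrative)

# Toy linear stand-ins for the learned transition and emission networks.
A = rng.normal(scale=0.3, size=(DZ, DZ))  # transition map z_t -> z_{t+1}
C = rng.normal(scale=0.3, size=(DX, DZ))  # emission map z_t -> x_t

def rollout(z0, noise=0.1):
    """Sample x_{<=T} by alternating transition and emission steps."""
    z, xs = z0, []
    for _ in range(T):
        z = A @ z + noise * rng.normal(size=DZ)         # z_{t+1} ~ p(z_{t+1} | z_t)
        xs.append(C @ z + noise * rng.normal(size=DX))  # x_{t+1} ~ p(x_{t+1} | z_{t+1})
    return np.stack(xs)

x_seq = rollout(np.zeros(DZ))
print(x_seq.shape)  # (10, 2): one observation per time step
```

Computing log p_θ(x_seq) under this model would require integrating over every latent trajectory z_{≤T}, which is exactly the intractable marginalization that motivates the variational inference network below.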
Following the same trick used in VAE [46], VRNN [12] and DMM [49], we resort to the well- known variational principle and design a inference network parameterized by φ , and maximize the variational evidence lower bound (ELBO)F(θ,φ)≤L(θ) with respect to both θ and φ. The generation network and inference network are illustrated in Figure 5.2. In this section, we present the design of the generation and inference network in detail. Note that though GAMAN supports heterogeneous systems with multiple agent types, we only consider a single agent type in this section for the sake of clarity. 5.4.1 Generation Network The generation network of GAMAN follows the emission and transition framework, which is designed by applying deep neural networks to continuous state-space models. Transition We design transition network from latent state z t−1 ={e t−1 , s t−1 , a t−1 } to z t ={e t , s t , a t } in GAMAN to learn the temporal dependencies in multi-agent systems. We also introduce attention mechanism into the transition network to capture and reveal agents’ interactions. The transition network contains three sub- functions: agent function f θe , interaction function f θs and attention function f θa . In GAMAN, we use gated recurrent units (GRU) [11] for agent function f θe , multilayer perceptron (MLP) for interaction function f θs and attention function f θa . 56 At time stept, the agent vector of k-th agent e k t is first sampled from transition distribution π θe e k t ∼π θe e k t e k <t ,{s j t−1 } j6=k ,{a j t−1 } j6=k ;θ e (5.4) In GAMAN, we model the transition distribution as a multivariate Gaussian distribu- tionπ θe =N (μ (θ) k t , Σ (θ) k t ;θ e ). We parametrizeπ θe with the agent functionf θe , which takes previous agent vectors of the k-th agent e k <t and all other agents’ interaction vectors{s j t } j6=k weighted by the weight w k,j =Softmax k (−||a t k − a t j || 2 ). (μ (θ) k t , Σ (θ) k t ) =f θe e k <t , X j6=k w k,j s j t−1 ! 
= f_{\theta_e}\left(e_{<t}^k, \frac{\sum_{j \ne k} e^{-\|a_{t-1}^k - a_{t-1}^j\|^2}\, s_{t-1}^j}{\sum_{j \ne k} e^{-\|a_{t-1}^k - a_{t-1}^j\|^2}}\right)    (5.5)

Then the interaction vector s_t^k and attention vector a_t^k of the k-th agent are updated through the interaction function f_{θs} and the attention function f_{θa}, respectively:

s_t^k = f_{\theta_s}(e_t^k), \quad a_t^k = f_{\theta_a}(e_t^k)    (5.6)

Combining Equations 5.4, 5.5, and 5.6, GAMAN generates the latent state z_t = {e_t, s_t, a_t} from the previous one, z_{t−1} = {e_{t−1}, s_{t−1}, a_{t−1}}.

The motivation behind this attention mechanism is as follows. In a multi-agent system, an agent's temporal dynamics are affected by its own state and by its interactions with other agents. Ideally, the transition distribution should be parametrized by the combination of the agent encoding vector e_t^k and all interaction cliques (of 2nd or higher order). However, these interaction cliques are not known a priori in most scenarios and usually require O(N^d) encoding evaluations, where N is the number of agents and d is the typical interaction clique size. To address these limitations, we design the attention mechanism to approximate other agents' interactions by a weighted average of each agent's interaction encoding vector, {s_{t−1}^j}_{j≠k}, as in Equation 5.5. With such a linear approximation, GAMAN can incorporate interactions in a multi-agent system efficiently, and it also provides a measure of the interaction between agents through the weight w_{k,j}. Although the Softmax weight operation scales quadratically with the number of agents, it costs significantly less computation than evaluating the interaction encodings s, which scale linearly in the number of agents, so the overall computation still scales linearly in the number of agents. Furthermore, the Softmax operation usually yields sparse results, so the interaction can be further approximated by the K nearest neighbors measured in attention, boosting the computational efficiency.
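The Softmax-weighted aggregation of Equation 5.5 is straightforward to implement. Below is an illustrative pure-Python sketch (the toy vectors and helper names are ours, not from the dissertation):

```python
import math

def attention_weights(a, k):
    """Softmax over negative squared distances between agent k's
    attention vector and every other agent's (the w_{k,j} of Eq. 5.5)."""
    logits = {j: -sum((ak - aj) ** 2 for ak, aj in zip(a[k], a[j]))
              for j in range(len(a)) if j != k}
    m = max(logits.values())                       # stabilize the softmax
    exp = {j: math.exp(v - m) for j, v in logits.items()}
    z = sum(exp.values())
    return {j: e / z for j, e in exp.items()}

def aggregate_interactions(a, s, k):
    """Weighted average of the other agents' interaction vectors."""
    w = attention_weights(a, k)
    dim = len(s[0])
    return [sum(w[j] * s[j][d] for j in w) for d in range(dim)]

# Three agents: agents 0 and 1 have identical attention vectors, so
# agent 0 attends to agent 1 far more than to the distant agent 2.
a = [[0.0, 0.0], [0.0, 0.0], [5.0, 5.0]]
s = [[1.0, 0.0], [0.0, 1.0], [10.0, 10.0]]
w = attention_weights(a, 0)
agg = aggregate_interactions(a, s, 0)
```

Because the weights depend only on pairwise distances in attention space, the same code works unchanged when agents are added or removed, illustrating the robustness-to-agent-count property discussed above.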
Beyond this efficiency advantage, the proposed attention mechanism is also robust to variation in the number of agents: no matter how the number of agents varies, the Softmax-weighted average adaptively distributes the interaction within the system.

Emission The multi-agent system's behavior x is generated from the latent states z in the emission process. At time step t, the behavior of the k-th agent, x_t^k, is sampled from the emission distribution π_{θx}:

x_t^k \sim \pi_{\theta_x}(x_t^k \mid e_t^k; \theta_x)    (5.7)

The choice of emission distribution π_{θx} is flexible and depends on the application: a multinomial distribution is used for categorical data, while a Gaussian distribution is used for continuous data. For multi-agent systems, a mixture distribution is adopted, given the multi-modal nature of agents' behavior. The emission distribution π_{θx} is parametrized by a multilayer perceptron (MLP) mapping f_{θx}, which takes the agent vector of the k-th agent, e_t^k, as input. For example, for a mixture-of-Gaussians emission distribution π_{θx} (each component with mean μ_t^{(x)k}, covariance Σ_t^{(x)k}, and weight w_t^k),

\{\mu_t^{(x)k}, \Sigma_t^{(x)k}, w_t^k\} = f_{\theta_x}(e_t^k)    (5.8)

To summarize, the overall generation process is described in Algorithm 5.1. The parameter set of the generation network is θ = {θ_x, θ_e, θ_s, θ_a}. The joint probability of behavior and latent states can be factorized as in Equation 5.9:

p_\theta(x_{\le T}, z_{\le T} \mid z_0) = p_{\theta_x}(x_{\le T} \mid e_{\le T}) \cdot p_{\theta_e}(e_{\le T} \mid z_0; \theta_s, \theta_a)
= \prod_{t=1}^{T} \left[ p_{\theta_x}(x_t \mid e_t) \cdot p_{\theta_e}(e_t \mid e_{t-1}, s_{t-1}, a_{t-1}; \theta_s, \theta_a) \right]
= \prod_{t=1}^{T} \prod_{k=1}^{K} \left[ p_{\theta_x}(x_t^k \mid e_t^k) \cdot p_{\theta_e}\left(e_t^k \mid e_{t-1}^k, \{s_{t-1}^j\}_{j \ne k}, \{a_{t-1}^j\}_{j \ne k}; \theta_s, \theta_a\right) \right]    (5.9)

5.4.2 Inference Network

The goal of the inference network is to obtain an objective that can be optimized easily, making the model parameter learning amenable. Instead of directly maximizing the marginal log-likelihood L(θ) w.r.t.
θ, we build an inference network with a tractable distribution π_φ and maximize the variational evidence lower bound (ELBO) F(θ, φ) ≤ L(θ) w.r.t. both θ and φ. Here φ is the parameter set of the inference network, which will be formally defined at the end of this section. The overall ELBO is summed over each time step of the behavior demonstration:

F(\theta, \phi) = \mathbb{E}_{q_\phi(z_{\le T} \mid x_{\le T})}\left[ \sum_{t=1}^{T} \log p_\theta(x_t \mid z_{\le T}) - D_{KL}\left(q_\phi(z_{\le T} \mid x_{\le T}, z_0)\, \|\, p_\theta(z_{\le T} \mid z_0)\right) \right]    (5.10)

where the expectation is taken w.r.t. q_φ(z_{≤T} | x_{≤T}), the approximate posterior of the latent states z provided by the inference network.

To get a tight bound and an accurate estimate from GAMAN, we design the inference network as follows. First, we keep the same temporal dependency of the latent states z in the inference network, leading to the factorization:

q_\phi(z_{\le T} \mid x_{\le T}) = \prod_{t=1}^{T} q_\phi(z_t \mid z_{<t}, x_{\le T})    (5.11)

Further, the inference network inherits both the attention mechanism and the corresponding conditional independence from the generation network. Thus the right-hand side of Equation 5.11 can be further factorized as:

q_\phi(z_t \mid z_{<t}, x_{\le T}) = q_\phi(e_t, s_t, a_t \mid e_{<t}, s_{t-1}, a_{t-1}, x_{\le T})
= q_\phi(e_t \mid e_{<t}, s_{t-1}, a_{t-1}, x_{\le T})
= \prod_{k=1}^{K} q_\phi\left(e_t^k \mid e_{<t}^k, \{s_{t-1}^j\}_{j \ne k}, \{a_{t-1}^j\}_{j \ne k}, x_{\le T}^k\right)    (5.12)

Algorithm 5.1: Generation network of GAMAN in a K-agent system with a mixture-of-Gaussians emission distribution.
1: for k = 1, ..., K do
2:   Initialize e_0^k ~ N(0, I)
3:   Initialize s_0^k = f_{θs}(e_0^k), a_0^k = f_{θa}(e_0^k)
4: end for
5: for t = 1, ..., T do
6:   for k = 1, ..., K do
7:     (μ_t^{(θ)k}, Σ_t^{(θ)k}) = f_{θe}(e_{<t}^k, Σ_{j≠k} w_{k,j} s_{t−1}^j)
8:     Sample e_t^k ~ N(μ_t^{(θ)k}, Σ_t^{(θ)k}; θ_e)
9:     Compute s_t^k = f_{θs}(e_t^k), a_t^k = f_{θa}(e_t^k)
10:   end for
11:   for k = 1, ..., K do
12:     Compute {μ_t^{(x)k}, Σ_t^{(x)k}, w_t^k}_m = f_{θx}(e_t^k)
13:     Sample x_t^k ~ Σ_{i=1}^{m} {w_t^k}_i N({μ_t^{(x)k}, Σ_t^{(x)k}}_i)
14:   end for
15: end for

The non-stochastic interaction vector s_t^k and attention vector a_t^k are mapped directly from the agent vector e_t^k:

s_t^k = f_{\theta_s}(e_t^k), \quad a_t^k = f_{\theta_a}(e_t^k)    (5.13)

where the functions f_{θs} and f_{θa} are inherited from the generation network. Further, we model the right-hand side of Equation 5.12, q_φ(e_t^k | e_{<t}^k, {s_{t−1}^j}_{j≠k}, {a_{t−1}^j}_{j≠k}, x_{≤T}^k), as a Gaussian distribution π_{φe}^k = N(μ_t^{(φ)k}, Σ_t^{(φ)k}; φ_e) parametrized by gated recurrent units (GRU) g_{φe}:

(\mu_t^{(\phi)k}, \Sigma_t^{(\phi)k}) = g_{\phi_e}\left(h_t^k, e_{<t}^k, \frac{\sum_{j \ne k} e^{-\|a_{t-1}^k - a_{t-1}^j\|^2}\, s_{t-1}^j}{\sum_{j \ne k} e^{-\|a_{t-1}^k - a_{t-1}^j\|^2}}\right)    (5.14)

Algorithm 5.2: Learning GAMAN with stochastic backpropagation and the Adam optimizer [45].
Input: a set of behavior demonstrations D; initial (θ, φ)
1: while not converged do
2:   Choose a random minibatch of behaviors X ⊂ D
3:   for each sample x_{1:T} ∈ X do
4:     for k = 1, ..., K do
5:       Compute h_{1:T}^k by the function φ_h on input x_{1:T}^k
6:       Sample ê_0^k ~ N(0, I)
7:       Compute s_0^k = f_{θs}(ê_0^k), a_0^k = f_{θa}(ê_0^k)
8:     end for
9:     for t = 1, ..., T do
10:      for k = 1, ..., K do
11:        Estimate (μ_t^{(θ)k}, Σ_t^{(θ)k}) by f_{θe} (Equation 5.5), given ê_{<t}^k, {s_{t−1}^j}_{j≠k}, {a_{t−1}^j}_{j≠k}
12:        Estimate (μ_t^{(φ)k}, Σ_t^{(φ)k}) by g_{φe} (Equation 5.14), given ê_{<t}^k, {s_{t−1}^j}_{j≠k}, {a_{t−1}^j}_{j≠k}, h_t^k
13:        Compute the gradient of D_KL(q_φ(z_t^k | ·) || p_θ(z_t^k | ·)) w.r.t. (θ, φ), given μ_t^{(θ)k}, Σ_t^{(θ)k} and μ_t^{(φ)k}, Σ_t^{(φ)k}
14:        Sample ê_t^k ~ N(μ_t^{(φ)k}, Σ_t^{(φ)k}; φ_e)
15:        Compute the gradient of log p_{θx}(x_t^k | ê_{≤t}^k)
16:      end for
17:    end for
18:  end for
19:  Update (θ, φ) using all gradients with Adam
20: end while

where h_t^k is the encoding of the k-th agent's behavior x^k.
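The mixture-of-Gaussians emission step of Algorithm 5.1 (steps 12 and 13) amounts to picking a component by its weight and then drawing from that component's Gaussian. A minimal one-dimensional sketch (the weights, means, and standard deviations below are illustrative values of ours, not learned parameters):

```python
import random

def sample_mixture(weights, means, stds, rng):
    """Draw one sample from a 1-D Gaussian mixture:
    choose component i ~ Categorical(weights), then x ~ N(means[i], stds[i]^2)."""
    i = rng.choices(range(len(weights)), weights=weights)[0]
    return i, rng.gauss(means[i], stds[i])

rng = random.Random(0)
weights, means, stds = [0.8, 0.2], [0.0, 10.0], [0.1, 0.1]
draws = [sample_mixture(weights, means, stds, rng) for _ in range(2000)]
frac_first = sum(1 for i, _ in draws if i == 0) / len(draws)
```

The two well-separated components mimic the multi-modal behavior distributions (e.g., an agent turning left versus right) that motivate the mixture emission; roughly 80% of the draws come from the first component.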
Inspired by [49], we construct this behavior encoding in either a forward or a bi-directional setting. In the forward setting, we use only a forward RNN g_forward to pass the information up to time step t into the behavior encoding h_t^k, i.e., h_t^k = g_forward(x_{≤t}^k). The forward setting uses only information from the past, so it is suitable for prediction tasks at future time steps t' > T. The bi-directional setting, on the other hand, uses a bi-directional RNN to capture information from both history and future, i.e., h_t^k = [g_forward(x_{≤t}^k), g_backward(x_{t+1:T}^k)], which supports system identification and inferring unobserved parts from partially observed behaviors. To summarize, we use φ_h and φ_e to denote the parameters related to the behavior encoding h and the agent vector e, respectively, and use φ = {φ_h, φ_e} to represent the parameter set of the inference network.

5.4.3 Parameter Learning

We learn the parameters (θ, φ) of GAMAN by maximizing the ELBO F(θ, φ) in Equation 5.10 over the set of all behavior demonstrations D:

(\theta^*, \phi^*) = \arg\max_{\theta, \phi} \sum_{\mathcal{D}} F(\theta, \phi)    (5.15)

We use stochastic backpropagation [46] and ancestral sampling to estimate all gradients w.r.t. (θ, φ), and we train the networks with the Adam optimizer [45]. Algorithm 5.2 shows the overall learning procedure of GAMAN.

Note that here we assume all agents share the same behavior-generating distribution, i.e., the same set of parameters (θ, φ). No agent role or any agent-specific information is used during training, and the proposed model is trained with randomly shuffled trajectories. However, GAMAN can be easily adapted to a system with known agent types by simply assigning a different behavior-generating distribution, i.e., a different set of parameters (θ, φ), to each agent type. Agents can still interact within and across agent types through the attention mechanism introduced above. We explore the performance of GAMAN on a heterogeneous multi-agent system later in the experiment section.
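Two building blocks of Algorithm 5.2 are the closed-form KL term between the diagonal-Gaussian posterior and prior, and the reparameterized sample ê that lets the likelihood gradient flow through the sampling step. A minimal per-dimension sketch in pure Python (variable names are ours):

```python
import math
import random

def kl_diag_gauss(mu_q, var_q, mu_p, var_p):
    """D_KL( N(mu_q, diag(var_q)) || N(mu_p, diag(var_p)) ), summed over dims."""
    return sum(
        0.5 * (math.log(vp / vq) + (vq + (mq - mp) ** 2) / vp - 1.0)
        for mq, vq, mp, vp in zip(mu_q, var_q, mu_p, var_p)
    )

def reparameterize(mu, var, rng):
    """e_hat = mu + sqrt(var) * eps; the sample stays differentiable in (mu, var)."""
    return [m + math.sqrt(v) * rng.gauss(0.0, 1.0) for m, v in zip(mu, var)]

rng = random.Random(0)
# Identical posterior and prior: KL is exactly zero.
kl_same = kl_diag_gauss([0.0, 0.0], [1.0, 1.0], [0.0, 0.0], [1.0, 1.0])
# Unit-variance Gaussians whose means differ by 1 in one dim: KL = 0.5.
kl_diff = kl_diag_gauss([1.0, 0.0], [1.0, 1.0], [0.0, 0.0], [1.0, 1.0])
e_hat = reparameterize([0.0, 0.0], [1.0, 1.0], rng)
```

In an actual implementation these quantities would be computed on tensors by an autodiff framework; the sketch only fixes the arithmetic that steps 13 and 14 of Algorithm 5.2 rely on.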
5.5 Experiments

We conduct experiments on one simulated physical dataset, Spring-Ball, and two real-world large-scale multi-agent datasets, Stanford Drone [72] and NBA [55]. Through the experiments, we answer the following questions: (1) How good are the behavioral predictions of our proposed model for multi-agent systems compared to existing state-of-the-art methods? (2) Are the interactions learned by the proposed attention mechanism helpful for multi-agent behavioral modeling? (3) Is the proposed model able to identify different agent groups and interaction types? In the remainder of this section, we describe the datasets, tasks, quantitative results, and interpretations that answer these questions.

5.5.1 Datasets

Spring-Ball We design a spring-ball system following [47]. In this scenario, balls bounce inside a 2D square container of size L (= 10), governed by Hooke's law. N (= 100) balls with a mass ratio of 0.2:1:5, generated with probabilities 0.3:0.4:0.3, are randomly connected by {no, soft, hard} springs with probabilities 0.8:0.1:0.1, where the elastic coefficients of soft and hard springs have the ratio 0.5:2. The static lengths of all springs are set to 1. The balls are initialized with random positions and velocities. Balls collide with the walls according to the laws of elastic collision, as balls are treated as point masses. The trajectories are simulated at 1000 FPS, and we sample the data every 100 frames. We generated 1000 cases, each with 49 frames.

Table 5.1: Statistics of the processed Stanford Drone dataset.

Scene        # of Cases  # of Bikers  # of Pedestrians  # of Cars
gates        93          716          652               19
nexus        240         108          2182              1316
bookstore    191         733          1784              15
deathCircle  110         1517         1174              189
hyang        234         1004         2649              15
Total        868         4078         8441              1554

Stanford Drone This dataset consists of multiple aerial videos of 8 unique scenes captured around the Stanford campus, with six types of agents (e.g., biker, car) navigating the crowded spaces with dynamic interactions.
For the experiments, we choose the top 3 types of agents, Biker, Pedestrian, and Car, from the dataset, and we use all the videos except those in the scenes little, coupa, and quad, since the selected agent types are much sparser in these 3 scenes. The original frame rate of the videos is around 29.97 FPS, and we down-sampled all videos to 5 FPS. We also divide each video into distinct 12-second (i.e., 60-frame) sub-videos along the timeline and treat each sub-video as one multi-agent case. We keep only the cases with between 10 and 90 agents. Table 5.1 shows the statistics of the processed dataset.

NBA This dataset is composed of 49,628 12-second sequences of 2D overhead-view trajectories of basketball players and the ball from 200 games in the 2015/16 NBA season, captured by SportVU player-tracking technology. It tracks the positions of each player, collecting data every 40 ms. We down-sampled all data to 5 FPS. To reduce the uncertainty of the trajectories, we align the sequences so that the offense always shoots toward the same side.

5.5.2 Experimental Design

Prediction Task For a multi-agent case with T frames, given its first L frames (of all agents), one needs to predict each agent's future behavior in the last T − L frames. For our datasets, the (T, L) pairs are: Spring-Ball, (49, 35); NBA, (60, 50); Stanford Drone, (60, 50).

Evaluation Metrics Since we consider an agent's trajectory as its behavior, we evaluate behavior forecasting with four different metrics:

• L2 distance (L2): the L2 distance between predicted trajectories and the ground truth, averaged over each predicted frame for each agent.
• Maximum L2 distance (maxL2): the maximum L2 distance between the prediction and the ground truth throughout a predicted trajectory, averaged over each agent.
• Miss rate: the fraction of trajectories whose distance between the final predicted and ground-truth points exceeds a constant number x.
We choose x as 0.2 and 0.5, denoted MR0.2 and MR0.5, respectively.

Baselines We compare our GAMAN with multiple baseline models on the trajectory forecasting task. To demonstrate the advantage of the attention mechanism and the learned interactions in GAMAN, we also remove all interactions from GAMAN for an ablation comparison.

• Extrapolation and linear methods:
  – Velocity: velocity-based extrapolation.
  – KF: Kalman filter.
• Deep-neural-network-based models:
  – LSTM: Long Short-Term Memory network [32]
  – SocialGAN: Socially Acceptable Trajectories with Generative Adversarial Networks [27]
  – SocialLSTM: Social Pooling with LSTMs [1]
  – NRI: Neural Relational Inference model [47]
  – IN: Interaction Networks [3]
  – GAMAN-NI: GAMAN with No Interaction.

Table 5.2: Behavior prediction results on 3 datasets.

            |          Spring-Ball          |         Stanford Drone        |              NBA
            | L2      maxL2   MR0.2   MR0.5 | L2      maxL2   MR0.2   MR0.5 | L2      maxL2   MR0.2   MR0.5
Velocity    | 0.3786  0.9338  0.6778  0.4126| 0.0214  0.0436  0.0380  0.0049| 0.2470  0.5337  0.7745  0.3885
KF          | 1.0616  1.8522  0.9500  0.7938| 0.2482  0.4790  0.5900  0.1835| 0.6566  1.1420  0.9385  0.7303
LSTM        | 0.1839  0.4108  0.5440  0.2662| 0.0249  0.0478  0.0323  0.0041| 0.1823  0.3636  0.6439  0.2107
SocialGAN   | 0.1952  0.4549  0.6706  0.2989| 0.0231  0.0453  0.0384  0.0038| 0.2495  0.5031  0.7954  0.3873
SocialLSTM  | 0.6763  1.2372  0.9628  0.8129| 0.0702  0.1239  0.0813  0.0062| 0.4995  0.8994  0.9460  0.6998
NRI         | 0.2311  0.5351  0.6541  0.3529| 0.0291  0.0596  0.0382  0.0033| 0.1725  0.3647  0.6415  0.2222
IN          | 0.2657  0.6323  0.7324  0.4212| 0.0257  0.0606  0.0285  0.0031| 0.1855  0.4114  0.7444  0.2639
GAMAN-NI    | 0.2624  0.5968  0.5371  0.4603| 0.0264  0.0572  0.0349  0.0052| 0.2465  0.5983  0.6428  0.4371
GAMAN       | 0.1728  0.4049  0.4431  0.2885| 0.0203  0.0416  0.0335  0.0047| 0.1720  0.3794  0.6089  0.2023

Implementation details For the generation network in GAMAN, we use multivariate Gaussians with diagonal covariance for both the transition distribution π_θe and the emission distribution π_θx. f_θe is parameterized by a GRU.
f_θx is parameterized by a 3-layer MLP with ReLU [66] activations followed by a linear output layer. Both f_θs and f_θa are parameterized by 3-layer MLPs with ReLU activations. For the inference network, we also use a multivariate Gaussian distribution with diagonal covariance for π_φe, which is parameterized by a GRU; f_θs and f_θa are inherited from the generation network. All hidden layers have size 32, and the model is optimized with Adam [45]. We implement the KF model using pykalman [17]. For the other models compared in the experiments, we keep the original network structures and use a similar number of parameters for a fair comparison. We randomly split each dataset into training/validation/test sets with the ratio 7:1:2, choose the best model weights on the validation set, and report performance on the test set.

Table 5.3: Log-likelihood of generative models on 3 datasets.

                SocialLSTM   NRI     GAMAN-NI   GAMAN
Spring-Ball     -6.25        -6.40   -0.75      1.21
Stanford Drone   1.34         4.51    4.04      6.12
NBA             -9.54         0.51   -1.1       0.80

5.5.3 Quantitative Results

Prediction Table 5.2 shows the behavior forecasting results on the 3 datasets. We observe that, in most cases, our proposed GAMAN outperforms all the competitive baselines in terms of the 4 metrics and achieves the best performance. Further, there is a significant improvement from GAMAN-NI to GAMAN, which demonstrates that the proposed attention mechanism can efficiently capture the interactions in a multi-agent system and provide better behavior forecasting results.

Inference Meanwhile, we compare the log-likelihoods of the generative models in Table 5.3. Models with higher log-likelihood fit the datasets more tightly. Our proposed GAMAN surpasses all the generative models, achieving the highest log-likelihood on all 3 datasets. Combining the prediction and inference performance, GAMAN not only provides more accurate predictions but also offers better inference.
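The four evaluation metrics defined in Section 5.5.2 can be sketched directly from their definitions. An illustrative pure-Python implementation (function names and toy trajectories are ours):

```python
import math

def l2(pred, truth):
    """Average per-frame L2 distance between a predicted and a true trajectory."""
    dists = [math.dist(p, t) for p, t in zip(pred, truth)]
    return sum(dists) / len(dists)

def max_l2(pred, truth):
    """Maximum per-frame L2 distance over a predicted trajectory."""
    return max(math.dist(p, t) for p, t in zip(pred, truth))

def miss_rate(preds, truths, x):
    """Fraction of trajectories whose final-point error exceeds x."""
    misses = sum(1 for p, t in zip(preds, truths) if math.dist(p[-1], t[-1]) > x)
    return misses / len(preds)

pred = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
truth = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)]
avg = l2(pred, truth)                  # (0 + 1 + 0) / 3
mx = max_l2(pred, truth)               # 1.0
mr = miss_rate([pred], [truth], 0.2)   # final points coincide -> 0.0
```

In the experiments, these per-agent values would then be averaged over all agents and test cases.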
5.5.4 Interaction Analysis

To further understand the interpretability of GAMAN, especially the agent vectors e_t and the interaction vectors s_t, we conduct a case study on the 3 datasets. We use the weighted interaction vectors s_t^k to obtain pairwise relational interaction vectors (e.g., via pairwise distance or other metrics) and visualize them. To acquire a reasonable visualization of the learned vectors, we project them onto the 2D plane using t-SNE [58]. Figure 5.3 shows several concrete examples of the learned vectors; the same group of agents and the same type of interactions share the same colors. For Spring-Ball, we randomly choose 1000 frames from the test set and obtain the corresponding vectors; the left and right columns of Figure 5.3 show the clusters of 3 groups of agents (balls) and 3 types of interactions (springs) among the balls. For Stanford Drone, we randomly select 500 frames from the test set; the left and right columns of Figure 5.3 show the clusters of 3 groups of agents (e.g., bikers) and 3 types of interactions, respectively. Finally, we randomly choose 500 frames from the NBA test set; the left and right columns of Figure 5.3 indicate 3 groups of agents (ball, offense, and defense) and 3 types of interactions (ball-player, collaboration, and hostility). The agent groups and interaction types identified by GAMAN are reasonable and convincing, which demonstrates its capability to handle heterogeneous systems and provide interpretable representations.

Figure 5.3: Visualization of learned agent vectors e_t (left column) and interaction vectors s_t (right column) on three datasets, indicating different agent groups and interaction types.
5.6 Summary

We proposed the Generative Attentional Multi-Agent Network (GAMAN), a novel deep generative model for multi-agent systems that approximates the underlying behavior-generating process of multi-agent time series with a latent state-space model integrated with an attention mechanism. Empirically, we showed that GAMAN outperforms existing models on counterfactual inference and correctly identifies agent groups and interaction types across multi-agent time series in different domains.

Chapter 6
Counterfactual Inference for Time Series with Mixed Sampling Rates

In this chapter, we focus on time series counterfactual inference with mixed sampling rates; namely, the co-evolving time series are observed at different frequencies. We refer to time series with a high sampling rate as high-resolution time series and time series with a low sampling rate as low-resolution time series. We propose a deep generative model that captures the underlying data-generating mechanism and supports counterfactual inference for time series with mixed sampling rates. The proposed model is based on a latent hierarchical structure with a learnable switch mechanism to capture the temporal dependencies of mixed-rate time series. Experimental results on two real-world datasets demonstrate its capability on both forecasting and counterfactual inference tasks.

6.1 A Motivating Example

Consider the electronic healthcare records of a patient, which consist of observations from multiple sources characterized by various sampling rates. For example, vital signs such as heart rate are sampled every second, some lab test results are measured every few days, and other tests may be taken every few weeks. Given all the historical observations x_{≤t} and a potential future progression of the high-resolution time series x_{>t}^h, the doctor would like to know the future progression of the low-resolution time series x_{>t}^l.
In this case, the counterfactual factor is the high-resolution time series x_{>t}^h. Since both time series reflect the patient's health condition, we would like to solve this counterfactual inference problem in a first-principles way by modeling the underlying progression of the patient's health condition. In other words, we want to build a generative model that captures the underlying data-generating mechanism of mixed-rate time series and supports counterfactual inference.

6.2 Related Work

State-space models such as Kalman filters (KF) [44] and hidden Markov models (HMMs) [70] have been widely used in various time series applications such as speech recognition [70], atmospheric monitoring [37], and robotic control [67]. These approaches successfully model regularly sampled (i.e., sampled at the same frequency/rate) time series data; however, they cannot be directly used for multi-rate multivariate time series (MR-MTS), as they cannot simultaneously capture the multiple temporal dependencies present in MR-MTS. To handle MR-MTS with state-space models, researchers have extended KF models and proposed multi-rate Kalman filters (MR-KF) [2, 83]. MR-KF approaches either fuse the data with different sampling rates or fuse the estimates of KFs trained on each sampling rate. Many of these MR-KF approaches aim to improve the estimates for the highest-sampling-rate data and do not focus on capturing the multiple temporal dependencies present in MR-MTS. Moreover, the linear transition and emission functions of MR-KF models limit their usability on complex real-world data.

Recently, researchers have turned to deep learning models [10, 22, 48] to model the non-linear temporal dynamics of real-world sequential data. Discriminative models such as the hierarchical recurrent neural network [19], the hierarchical multiscale recurrent neural network (HM-RNN) [10], and phased long short-term memory (PLSTM) [68] have been proposed to capture the temporal dependencies of sequential data.
However, these discriminative models do not capture the underlying data generation process and are therefore not suited for forecasting and interpolation tasks. Deep generative models [22, 48, 71] have been developed to model the data generation process of complex time series data. Krishnan et al. [48] proposed the deep Kalman filter, a nonlinear state-space model, by marrying the ideas of deep neural networks with Kalman filters. Fraccaro et al. [21] introduced the stochastic recurrent neural network (SRNN), which glues an RNN and a state-space model together to form a stochastic, sequential neural generative model. Even though these deep generative models are state-of-the-art approaches for capturing the underlying data generation process, they are not designed to capture all the temporal dependencies of MR-MTS. None of the existing deep learning models or state-space models can be directly used for modeling MR-MTS. Thus, in this work, we develop a deep generative model that leverages the properties of the above discriminative and generative models to model the data generation process of MR-MTS while also capturing the multiple temporal dependencies using a latent hierarchical structure.

6.3 Multi-rate Hierarchical Deep Markov Model

In this section, we present our proposed Multi-Rate Hierarchical Deep Markov Model (MR-HDMM). We first clarify the notations and definitions used in this paper.
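As a concrete illustration of the mixed-rate setting (a toy construction of ours, not part of the dissertation's formalism), a high-resolution series observed every step and a low-resolution series observed every k-th step can be represented as timestamp/value pairs:

```python
# A toy MR-MTS: x_h observed every step, x_l observed every 5th step,
# mimicking e.g. per-second vital signs vs. occasional lab tests.
T, k = 20, 5
x_h = [(t, float(t) * 0.1) for t in range(T)]        # high-resolution series
x_l = [(t, float(t) * 0.1) for t in range(0, T, k)]  # low-resolution series

# The counterfactual query of Section 6.1: given all history and a
# hypothetical future x_h, infer the future low-resolution values.
history_h = [v for t, v in x_h if t < 10]
future_l = [(t, v) for t, v in x_l if t >= 10]
```

The mismatch in observation timestamps is exactly what the latent hierarchical structure with learnable switches is designed to handle.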
[Figure: Graphical structure of MR-HDMM, showing a hierarchy of latent variables z (three layers over five time steps), observations x, inference RNN states h, learnable switches s, and auxiliary connections.]
sha1_base64="Ddk1wsUUpdGdorNP0fQBk4UuLfw=">AAAB73icbVBNSwMxEJ2tX7V+VT16CRbBU9nVgnorevFYwbWVdi3ZNNuGJtklyQp12V/hxYOKV/+ON/+N6cdBWx8MPN6bYWZemHCmjet+O4Wl5ZXVteJ6aWNza3unvLt3p+NUEeqTmMeqFWJNOZPUN8xw2koUxSLktBkOr8Z+85EqzWJ5a0YJDQTuSxYxgo2V7p+6WS1/yE7zbrniVt0J0CLxZqQCMzS65a9OLyapoNIQjrVue25iggwrwwineamTappgMsR92rZUYkF1kE0OztGRVXooipUtadBE/T2RYaH1SIS2U2Az0PPeWPzPa6cmOg8yJpPUUEmmi6KUIxOj8feoxxQlho8swUQxeysiA6wwMTajkg3Bm395kfgn1Yuqe1Or1C9naRThAA7hGDw4gzpcQwN8ICDgGV7hzVHOi/PufExbC85sZh/+wPn8ATS/kDU=</latexit> <latexit sha1_base64="Ddk1wsUUpdGdorNP0fQBk4UuLfw=">AAAB73icbVBNSwMxEJ2tX7V+VT16CRbBU9nVgnorevFYwbWVdi3ZNNuGJtklyQp12V/hxYOKV/+ON/+N6cdBWx8MPN6bYWZemHCmjet+O4Wl5ZXVteJ6aWNza3unvLt3p+NUEeqTmMeqFWJNOZPUN8xw2koUxSLktBkOr8Z+85EqzWJ5a0YJDQTuSxYxgo2V7p+6WS1/yE7zbrniVt0J0CLxZqQCMzS65a9OLyapoNIQjrVue25iggwrwwineamTappgMsR92rZUYkF1kE0OztGRVXooipUtadBE/T2RYaH1SIS2U2Az0PPeWPzPa6cmOg8yJpPUUEmmi6KUIxOj8feoxxQlho8swUQxeysiA6wwMTajkg3Bm395kfgn1Yuqe1Or1C9naRThAA7hGDw4gzpcQwN8ICDgGV7hzVHOi/PufExbC85sZh/+wPn8ATS/kDU=</latexit> z 3 3 z 3 2 z 3 1 <latexit sha1_base64="7Aw8cYKSlJp93OrDHKWtDVCUlt8=">AAAB73icbVBNS8NAEJ34WetX1aOXxSJ4KokK6q3oxWMFYyttLJvtpl26uwm7G6GG/AovHlS8+ne8+W/ctjlo64OBx3szzMwLE860cd1vZ2FxaXlltbRWXt/Y3Nqu7Oze6ThVhPok5rFqhVhTziT1DTOcthJFsQg5bYbDq7HffKRKs1jemlFCA4H7kkWMYGOl+6du5uUP2UnerVTdmjsBmideQapQoNGtfHV6MUkFlYZwrHXbcxMTZFgZRjjNy51U0wSTIe7TtqUSC6qDbHJwjg6t0kNRrGxJgybq74kMC61HIrSdApuBnvXG4n9eOzXReZAxmaSGSjJdFKUcmRiNv0c9pigx fGQJJorZWxEZYIWJsRmVbQje7MvzxD+uXdTcm9Nq/bJIowT7cABH4MEZ1OEaGuADAQHP8ApvjnJenHfnY9q64BQze/AHzucPMCeQMg==</latexit> <latexit sha1_base64="7Aw8cYKSlJp93OrDHKWtDVCUlt8=">AAAB73icbVBNS8NAEJ34WetX1aOXxSJ4KokK6q3oxWMFYyttLJvtpl26uwm7G6GG/AovHlS8+ne8+W/ctjlo64OBx3szzMwLE860cd1vZ2FxaXlltbRWXt/Y3Nqu7Oze6ThVhPok5rFqhVhTziT1DTOcthJFsQg5bYbDq7HffKRKs1jemlFCA4H7kkWMYGOl+6du5uUP2UnerVTdmjsBmideQapQoNGtfHV6MUkFlYZwrHXbcxMTZFgZRjjNy51U0wSTIe7TtqUSC6qDbHJwjg6t0kNRrGxJgybq74kMC61HIrSdApuBnvXG4n9eOzXReZAxmaSGSjJdFKUcmRiNv0c9pigx 
fGQJJorZWxEZYIWJsRmVbQje7MvzxD+uXdTcm9Nq/bJIowT7cABH4MEZ1OEaGuADAQHP8ApvjnJenHfnY9q64BQze/AHzucPMCeQMg==</latexit> <latexit sha1_base64="7Aw8cYKSlJp93OrDHKWtDVCUlt8=">AAAB73icbVBNS8NAEJ34WetX1aOXxSJ4KokK6q3oxWMFYyttLJvtpl26uwm7G6GG/AovHlS8+ne8+W/ctjlo64OBx3szzMwLE860cd1vZ2FxaXlltbRWXt/Y3Nqu7Oze6ThVhPok5rFqhVhTziT1DTOcthJFsQg5bYbDq7HffKRKs1jemlFCA4H7kkWMYGOl+6du5uUP2UnerVTdmjsBmideQapQoNGtfHV6MUkFlYZwrHXbcxMTZFgZRjjNy51U0wSTIe7TtqUSC6qDbHJwjg6t0kNRrGxJgybq74kMC61HIrSdApuBnvXG4n9eOzXReZAxmaSGSjJdFKUcmRiNv0c9pigx fGQJJorZWxEZYIWJsRmVbQje7MvzxD+uXdTcm9Nq/bJIowT7cABH4MEZ1OEaGuADAQHP8ApvjnJenHfnY9q64BQze/AHzucPMCeQMg==</latexit> z 2 1 z 2 2 <latexit sha1_base64="cvGdtb9JaDbWIrCRUjrDkfBuInM=">AAAB73icbVBNTwIxEJ3FL8Qv1KOXRmLiiewSE/RG9OIRE1cwsJJu6UJD2920XRPc8Cu8eFDj1b/jzX9jgT0o+JJJXt6bycy8MOFMG9f9dgorq2vrG8XN0tb2zu5eef/gTsepItQnMY9VO8Saciap b5jhtJ0oikXIaSscXU391iNVmsXy1owTGgg8kCxiBBsr3T/1strkwVavXHGr7gxomXg5qUCOZq/81e3HJBVUGsKx1h3PTUyQYWUY4XRS6qaaJpiM8IB2LJVYUB1ks4Mn6MQqfRTFypY0aKb+nsiw0HosQtspsBnqRW8q/ud1UhOdBxmTSWqoJPNFUcqRidH0e9RnihLDx5Zgopi9FZEhVpgYm1HJhuAtvrxM/Fr1ourenFUal3kaRTiCYzgFD+rQgGtogg8EBDzDK7w5ynlx3p2PeWvByWcO4Q+czx8wK5Ay</latexit> <latexit sha1_base64="cvGdtb9JaDbWIrCRUjrDkfBuInM=">AAAB73icbVBNTwIxEJ3FL8Qv1KOXRmLiiewSE/RG9OIRE1cwsJJu6UJD2920XRPc8Cu8eFDj1b/jzX9jgT0o+JJJXt6bycy8MOFMG9f9dgorq2vrG8XN0tb2zu5eef/gTsepItQnMY9VO8Saciap b5jhtJ0oikXIaSscXU391iNVmsXy1owTGgg8kCxiBBsr3T/1strkwVavXHGr7gxomXg5qUCOZq/81e3HJBVUGsKx1h3PTUyQYWUY4XRS6qaaJpiM8IB2LJVYUB1ks4Mn6MQqfRTFypY0aKb+nsiw0HosQtspsBnqRW8q/ud1UhOdBxmTSWqoJPNFUcqRidH0e9RnihLDx5Zgopi9FZEhVpgYm1HJhuAtvrxM/Fr1ourenFUal3kaRTiCYzgFD+rQgGtogg8EBDzDK7w5ynlx3p2PeWvByWcO4Q+czx8wK5Ay</latexit> <latexit sha1_base64="cvGdtb9JaDbWIrCRUjrDkfBuInM=">AAAB73icbVBNTwIxEJ3FL8Qv1KOXRmLiiewSE/RG9OIRE1cwsJJu6UJD2920XRPc8Cu8eFDj1b/jzX9jgT0o+JJJXt6bycy8MOFMG9f9dgorq2vrG8XN0tb2zu5eef/gTsepItQnMY9VO8Saciap 
b5jhtJ0oikXIaSscXU391iNVmsXy1owTGgg8kCxiBBsr3T/1strkwVavXHGr7gxomXg5qUCOZq/81e3HJBVUGsKx1h3PTUyQYWUY4XRS6qaaJpiM8IB2LJVYUB1ks4Mn6MQqfRTFypY0aKb+nsiw0HosQtspsBnqRW8q/ud1UhOdBxmTSWqoJPNFUcqRidH0e9RnihLDx5Zgopi9FZEhVpgYm1HJhuAtvrxM/Fr1ourenFUal3kaRTiCYzgFD+rQgGtogg8EBDzDK7w5ynlx3p2PeWvByWcO4Q+czx8wK5Ay</latexit> z 2 3 z 2 4 z 2 5 z 1 5 z 1 4 z 1 3 z 1 2 z 1 1 h 1 1 h 1 2 <latexit sha1_base64="wJ8P3xPwWyO745UfpNLw8GkBqRQ=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoN6KXjxWMLbSxrLZbtqlu5uwuxFKyK/w4kHFq3/Hm//GbZuDtj4YeLw3w8y8MOFMG9f9dkorq2vrG+XNytb2zu5edf/gXsepItQnMY9VJ8Saciap b5jhtJMoikXIaTscX0/99hNVmsXyzkwSGgg8lCxiBBsrPYz6WSN/zLy8X625dXcGtEy8gtSgQKtf/eoNYpIKKg3hWOuu5yYmyLAyjHCaV3qppgkmYzykXUslFlQH2ezgHJ1YZYCiWNmSBs3U3xMZFlpPRGg7BTYjvehNxf+8bmqiiyBjMkkNlWS+KEo5MjGafo8GTFFi+MQSTBSztyIywgoTYzOq2BC8xZeXid+oX9bd27Na86pIowxHcAyn4ME5NOEGWuADAQHP8ApvjnJenHfnY95acoqZQ/gD5/MHEuGQHw==</latexit> <latexit sha1_base64="wJ8P3xPwWyO745UfpNLw8GkBqRQ=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoN6KXjxWMLbSxrLZbtqlu5uwuxFKyK/w4kHFq3/Hm//GbZuDtj4YeLw3w8y8MOFMG9f9dkorq2vrG+XNytb2zu5edf/gXsepItQnMY9VJ8Saciap b5jhtJMoikXIaTscX0/99hNVmsXyzkwSGgg8lCxiBBsrPYz6WSN/zLy8X625dXcGtEy8gtSgQKtf/eoNYpIKKg3hWOuu5yYmyLAyjHCaV3qppgkmYzykXUslFlQH2ezgHJ1YZYCiWNmSBs3U3xMZFlpPRGg7BTYjvehNxf+8bmqiiyBjMkkNlWS+KEo5MjGafo8GTFFi+MQSTBSztyIywgoTYzOq2BC8xZeXid+oX9bd27Na86pIowxHcAyn4ME5NOEGWuADAQHP8ApvjnJenHfnY95acoqZQ/gD5/MHEuGQHw==</latexit> <latexit sha1_base64="wJ8P3xPwWyO745UfpNLw8GkBqRQ=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0mKoN6KXjxWMLbSxrLZbtqlu5uwuxFKyK/w4kHFq3/Hm//GbZuDtj4YeLw3w8y8MOFMG9f9dkorq2vrG+XNytb2zu5edf/gXsepItQnMY9VJ8Saciap b5jhtJMoikXIaTscX0/99hNVmsXyzkwSGgg8lCxiBBsrPYz6WSN/zLy8X625dXcGtEy8gtSgQKtf/eoNYpIKKg3hWOuu5yYmyLAyjHCaV3qppgkmYzykXUslFlQH2ezgHJ1YZYCiWNmSBs3U3xMZFlpPRGg7BTYjvehNxf+8bmqiiyBjMkkNlWS+KEo5MjGafo8GTFFi+MQSTBSztyIywgoTYzOq2BC8xZeXid+oX9bd27Na86pIowxHcAyn4ME5NOEGWuADAQHP8ApvjnJenHfnY95acoqZQ/gD5/MHEuGQHw==</latexit> h 1 3 <latexit 
sha1_base64="IR79hVIp/X2tq9yv0nZSPPOlc1g=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0lUUG9FLx4rGFtpY9lsN+3S3U3Y3Qgl5Fd48aDi1b/jzX/jts1BWx8MPN6bYWZemHCmjet+O6Wl5ZXVtfJ6ZWNza3unurt3r+NUEeqTmMeqHWJNOZPU N8xw2k4UxSLktBWOrid+64kqzWJ5Z8YJDQQeSBYxgo2VHoa97DR/zLy8V625dXcKtEi8gtSgQLNX/er2Y5IKKg3hWOuO5yYmyLAyjHCaV7qppgkmIzygHUslFlQH2fTgHB1ZpY+iWNmSBk3V3xMZFlqPRWg7BTZDPe9NxP+8TmqiiyBjMkkNlWS2KEo5MjGafI/6TFFi+NgSTBSztyIyxAoTYzOq2BC8+ZcXiX9Sv6y7t2e1xlWRRhkO4BCOwYNzaMANNMEHAgKe4RXeHOW8OO/Ox6y15BQz+/AHzucPFGmQIA==</latexit> <latexit sha1_base64="IR79hVIp/X2tq9yv0nZSPPOlc1g=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0lUUG9FLx4rGFtpY9lsN+3S3U3Y3Qgl5Fd48aDi1b/jzX/jts1BWx8MPN6bYWZemHCmjet+O6Wl5ZXVtfJ6ZWNza3unurt3r+NUEeqTmMeqHWJNOZPU N8xw2k4UxSLktBWOrid+64kqzWJ5Z8YJDQQeSBYxgo2VHoa97DR/zLy8V625dXcKtEi8gtSgQLNX/er2Y5IKKg3hWOuO5yYmyLAyjHCaV7qppgkmIzygHUslFlQH2fTgHB1ZpY+iWNmSBk3V3xMZFlqPRWg7BTZDPe9NxP+8TmqiiyBjMkkNlWS2KEo5MjGafI/6TFFi+NgSTBSztyIyxAoTYzOq2BC8+ZcXiX9Sv6y7t2e1xlWRRhkO4BCOwYNzaMANNMEHAgKe4RXeHOW8OO/Ox6y15BQz+/AHzucPFGmQIA==</latexit> <latexit sha1_base64="IR79hVIp/X2tq9yv0nZSPPOlc1g=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0lUUG9FLx4rGFtpY9lsN+3S3U3Y3Qgl5Fd48aDi1b/jzX/jts1BWx8MPN6bYWZemHCmjet+O6Wl5ZXVtfJ6ZWNza3unurt3r+NUEeqTmMeqHWJNOZPU N8xw2k4UxSLktBWOrid+64kqzWJ5Z8YJDQQeSBYxgo2VHoa97DR/zLy8V625dXcKtEi8gtSgQLNX/er2Y5IKKg3hWOuO5yYmyLAyjHCaV7qppgkmIzygHUslFlQH2fTgHB1ZpY+iWNmSBk3V3xMZFlqPRWg7BTZDPe9NxP+8TmqiiyBjMkkNlWS2KEo5MjGafI/6TFFi+NgSTBSztyIyxAoTYzOq2BC8+ZcXiX9Sv6y7t2e1xlWRRhkO4BCOwYNzaMANNMEHAgKe4RXeHOW8OO/Ox6y15BQz+/AHzucPFGmQIA==</latexit> h 1 4 h 1 5 <latexit sha1_base64="bcn5ArE3vHbK1EJAOOANxhcLvGI=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0lEUW9FLx4rGFtpY9lsN+3S3U3Y3Qgl5Fd48aDi1b/jzX/jts1BWx8MPN6bYWZemHCmjet+O6Wl5ZXVtfJ6ZWNza3unurt3r+NUEeqTmMeqHWJNOZPU N8xw2k4UxSLktBWOrid+64kqzWJ5Z8YJDQQeSBYxgo2VHoa97Cx/zLy8V625dXcKtEi8gtSgQLNX/er2Y5IKKg3hWOuO5yYmyLAyjHCaV7qppgkmIzygHUslFlQH2fTgHB1ZpY+iWNmSBk3V3xMZFlqPRWg7BTZDPe9NxP+8TmqiiyBjMkkNlWS2KEo5MjGafI/6TFFi 
+NgSTBSztyIyxAoTYzOq2BC8+ZcXiX9Sv6y7t6e1xlWRRhkO4BCOwYNzaMANNMEHAgKe4RXeHOW8OO/Ox6y15BQz+/AHzucPF3mQIg==</latexit> <latexit sha1_base64="bcn5ArE3vHbK1EJAOOANxhcLvGI=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0lEUW9FLx4rGFtpY9lsN+3S3U3Y3Qgl5Fd48aDi1b/jzX/jts1BWx8MPN6bYWZemHCmjet+O6Wl5ZXVtfJ6ZWNza3unurt3r+NUEeqTmMeqHWJNOZPU N8xw2k4UxSLktBWOrid+64kqzWJ5Z8YJDQQeSBYxgo2VHoa97Cx/zLy8V625dXcKtEi8gtSgQLNX/er2Y5IKKg3hWOuO5yYmyLAyjHCaV7qppgkmIzygHUslFlQH2fTgHB1ZpY+iWNmSBk3V3xMZFlqPRWg7BTZDPe9NxP+8TmqiiyBjMkkNlWS2KEo5MjGafI/6TFFi +NgSTBSztyIyxAoTYzOq2BC8+ZcXiX9Sv6y7t6e1xlWRRhkO4BCOwYNzaMANNMEHAgKe4RXeHOW8OO/Ox6y15BQz+/AHzucPF3mQIg==</latexit> <latexit sha1_base64="bcn5ArE3vHbK1EJAOOANxhcLvGI=">AAAB73icbVBNS8NAEJ3Ur1q/qh69LBbBU0lEUW9FLx4rGFtpY9lsN+3S3U3Y3Qgl5Fd48aDi1b/jzX/jts1BWx8MPN6bYWZemHCmjet+O6Wl5ZXVtfJ6ZWNza3unurt3r+NUEeqTmMeqHWJNOZPU N8xw2k4UxSLktBWOrid+64kqzWJ5Z8YJDQQeSBYxgo2VHoa97Cx/zLy8V625dXcKtEi8gtSgQLNX/er2Y5IKKg3hWOuO5yYmyLAyjHCaV7qppgkmIzygHUslFlQH2fTgHB1ZpY+iWNmSBk3V3xMZFlqPRWg7BTZDPe9NxP+8TmqiiyBjMkkNlWS2KEo5MjGafI/6TFFi +NgSTBSztyIyxAoTYzOq2BC8+ZcXiX9Sv6y7t6e1xlWRRhkO4BCOwYNzaMANNMEHAgKe4RXeHOW8OO/Ox6y15BQz+/AHzucPF3mQIg==</latexit> h 2 1 h 2 3 h 2 5 h 3 1 h 3 5 x 1 1 x 1 2 <latexit sha1_base64="VrLPiSrAzYSUT45RPsMOAO+Li3w=">AAAB73icbVBNS8NAEJ34WetX1aOXxSJ4KkkR1FvRi8cKxlbaWDbbTbt0dxN2N2IJ+RVePKh49e9489+4bXPQ1gcDj/dmmJkXJpxp47rfztLyyuraemmjvLm1vbNb2du/03GqCPVJzGPVDrGmnEnq G2Y4bSeKYhFy2gpHVxO/9UiVZrG8NeOEBgIPJIsYwcZK90+9rJ4/ZF7eq1TdmjsFWiReQapQoNmrfHX7MUkFlYZwrHXHcxMTZFgZRjjNy91U0wSTER7QjqUSC6qDbHpwjo6t0kdRrGxJg6bq74kMC63HIrSdApuhnvcm4n9eJzXReZAxmaSGSjJbFKUcmRhNvkd9pigxfGwJJorZWxEZYoWJsRmVbQje/MuLxK/XLmruzWm1cVmkUYJDOIIT8OAMGnANTfCBgIBneIU3RzkvzrvzMWtdcoqZA/gD5/MHK5GQLw==</latexit> <latexit sha1_base64="VrLPiSrAzYSUT45RPsMOAO+Li3w=">AAAB73icbVBNS8NAEJ34WetX1aOXxSJ4KkkR1FvRi8cKxlbaWDbbTbt0dxN2N2IJ+RVePKh49e9489+4bXPQ1gcDj/dmmJkXJpxp47rfztLyyuraemmjvLm1vbNb2du/03GqCPVJzGPVDrGmnEnq 
G2Y4bSeKYhFy2gpHVxO/9UiVZrG8NeOEBgIPJIsYwcZK90+9rJ4/ZF7eq1TdmjsFWiReQapQoNmrfHX7MUkFlYZwrHXHcxMTZFgZRjjNy91U0wSTER7QjqUSC6qDbHpwjo6t0kdRrGxJg6bq74kMC63HIrSdApuhnvcm4n9eJzXReZAxmaSGSjJbFKUcmRhNvkd9pigxfGwJJorZWxEZYoWJsRmVbQje/MuLxK/XLmruzWm1cVmkUYJDOIIT8OAMGnANTfCBgIBneIU3RzkvzrvzMWtdcoqZA/gD5/MHK5GQLw==</latexit> <latexit sha1_base64="VrLPiSrAzYSUT45RPsMOAO+Li3w=">AAAB73icbVBNS8NAEJ34WetX1aOXxSJ4KkkR1FvRi8cKxlbaWDbbTbt0dxN2N2IJ+RVePKh49e9489+4bXPQ1gcDj/dmmJkXJpxp47rfztLyyuraemmjvLm1vbNb2du/03GqCPVJzGPVDrGmnEnq G2Y4bSeKYhFy2gpHVxO/9UiVZrG8NeOEBgIPJIsYwcZK90+9rJ4/ZF7eq1TdmjsFWiReQapQoNmrfHX7MUkFlYZwrHXHcxMTZFgZRjjNy91U0wSTER7QjqUSC6qDbHpwjo6t0kdRrGxJg6bq74kMC63HIrSdApuhnvcm4n9eJzXReZAxmaSGSjJbFKUcmRhNvkd9pigxfGwJJorZWxEZYoWJsRmVbQje/MuLxK/XLmruzWm1cVmkUYJDOIIT8OAMGnANTfCBgIBneIU3RzkvzrvzMWtdcoqZA/gD5/MHK5GQLw==</latexit> x 1 3 x 1 4 x 1 5 x 2 5 x 3 5 x 3 1 x 2 1 x 2 3 Figure 6.1: Generation model and structured inference network (with the filtering setting) of our proposed MR-HDMM for MR-MTS. The switches on incoming edges to a node (z l t ) are the same, which is shown as s l t in Figure 6.2. Notations Given a MR-MTS of L different sampling rates and length T, we use a vector x l t ∈R D l to represent the time series observations of lth rate at time t. Here l = 1,...,L, t = 1,...,T, and D l is the dimension of time series with lth rate. The L sampling rates are in descending order, i.e., l = 1 and l = L refer to the highest and lowest sampling rates. To make the notations succinct, we use xt :t 0 l :l 0 to denote all observed time series of lth to l 0 th rates and from time t to t 0 . We use θ (.) and φ (.) to denote the parameter sets for generation model p θ and inference network q φ respectively. we use L layers of RNNs in the inference network to model MR-MTS ofL different sampling rates. We useL HS , the number of hidden layers in both generation model and inference network, to control the depth of the learned hierarchical structures. 
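To make the notation concrete, an MR-MTS can be stored as a list of arrays, one per sampling rate. The sketch below uses hypothetical rates and dimensions (not the datasets used later in this chapter):

```python
import numpy as np

# Toy container for an MR-MTS with L = 3 sampling rates (all values
# synthetic; the rates and dimensions below are illustrative only).
T = 12                      # length at the highest sampling rate
rates = [1, 4, 12]          # sampling intervals, highest rate first
dims = [8, 4, 2]            # D_l: feature dimension of the l-th rate

rng = np.random.default_rng(0)
x = [rng.standard_normal((T // r, d)) for r, d in zip(rates, dims)]

# x[l - 1][t - 1] is the vector x_t^l in R^{D_l}
assert [a.shape for a in x] == [(12, 8), (3, 4), (1, 2)]
```

Because the rates descend, the first array is the densest; the lower-rate arrays have proportionally fewer rows.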
In the rest of this paper, we take L_HS = L for model simplicity, but in practice they need not be tied. The latent states or variables are denoted by z, s, and h. Their superscript and subscript respectively indicate the corresponding layer(s) and time step(s) (e.g., z_{1:T}^{1:L}, s_t^{2:L}, h_t^l).

Figure 6.1 illustrates our MR-HDMM model, which consists of the generation model and the inference network. MR-HDMM captures the underlying data generation process using variational inference methods [46, 71] and learns the latent hierarchical structure using learnable switches and auxiliary connections to adaptively encode the dependencies across the hierarchies and the timestamps. In particular, the switches use an update-and-reuse mechanism to control the updates of the latent states of a layer based on their previous states (i.e., utilizing temporal information) and the lower latent layers (i.e., utilizing the hierarchy). A switch triggers an update of the current states if it gets enough information from the lower-level states; otherwise, it reuses the previous states. Thus, the higher-level states act as summarized representations of the lower-level states, and the switches help to propagate the temporal dependencies. The auxiliary connections (dashed lines in Figure 6.1) between MR-MTS of different sampling rates and different latent layers help the model effectively capture both short-term and long-term temporal dependencies. Without the auxiliary connections, the higher-rate time series may mask the multi-scale dependencies present in the lower-rate time series while propagating dependencies through the bottom-up connections. Note that the auxiliary connections are not tied to the sampling rates of the MR-MTS, and the sampling rate of a higher-rate variable need not be a multiple of the sampling rate of a lower-rate variable. Due to the flexibility of the auxiliary connections, our MR-HDMM can also handle irregularly sampled time series or missing data.
We can a) zero out the missing data points in the inference network and remove the corresponding auxiliary connections in the generation model during training, and b) interpolate missing values by adding auxiliary connections to the well-trained model.

[Figure 6.2: The switch mechanism for updating the latent states z_t^l in MR-HDMM. Left: the switch structure; Middle: switch on (s_t^l = 1); Right: switch off (s_t^l = 0).]

6.3.1 Generation Model

Figure 6.1 shows the generation model of our MR-HDMM. The generation process of MR-HDMM follows the transition-and-emission framework, obtained by applying deep recurrent neural networks to non-linear continuous state-space models. The generation model is carefully designed to incorporate the switching mechanism and the auxiliary connections in order to capture the multiple temporal dependencies present in MR-MTS.

Transition We design the transition process of the latent state z to capture the hierarchical structure of multiple temporal dependencies with learnable binary switches s. For each non-bottom layer l > 1 and time step t ≥ 1, we use a binary switch state s_t^l to control the update of the corresponding latent state z_t^l, as shown in Figure 6.2. s_t^l is obtained from the values of the previous latent state z_{t-1}^l and the lower-layer latent state z_t^{l-1} by the deterministic mapping

s_t^l = I[g_{θ_s}(z_{t-1}^l, z_t^{l-1}) ≥ 0].

When the switch is on (i.e., the update operation, s_t^l = 1), z_t^l is updated based on z_{t-1}^l and z_t^{l-1} through a learned transition distribution. We use a multivariate Gaussian distribution N(μ_t^l, Σ_t^l | z_{t-1}^l, z_t^{l-1}; θ_z), with mean and covariance given by (μ_t^l, Σ_t^l) = g_{θ_z}(z_{t-1}^l, z_t^{l-1}), as the transition distribution. When the switch is off (i.e., the reuse operation, s_t^l = 0), z_t^l is drawn from the same distribution as its previous state z_{t-1}^l, namely N(μ_{t-1}^l, Σ_{t-1}^l).
Note that, unlike [10], we do not copy the previous state, since our latent states are stochastic. The latent states of the first layer (z_{1:T}^1) are always updated at each time step. In our model, g_{θ_s} is parameterized by a multilayer perceptron (MLP), and g_{θ_z} is parameterized by gated recurrent units (GRU) [11] to capture the temporal dependencies. With this update-or-reuse transition mechanism, higher latent layers tend to capture longer-term temporal dependencies through the bottom-up connections in the latent layers.

Emission The multi-rate multivariate observation x is generated from z in the emission process. In order to embed the multiple temporal dependencies in the generated MR-MTS, we introduce auxiliary connections (denoted by the dashed lines in Figure 6.1) from the higher latent layers to the lower-rate time series. That is, the time series of the lth rate at time t (i.e., x_t^l) is generated from all latent states up to the lth layer, z_t^{1:l}, through the emission distribution Π(x_t^l | z_t^{1:l}; θ_x). The choice of the emission distribution Π is flexible and depends on the data type: a multinomial distribution is used for categorical data, and a Gaussian distribution for continuous data. Since all the data in our tasks are continuous, we use a Gaussian distribution whose mean μ(x)_t^l and covariance Σ(x)_t^l are determined by g_{θ_x}(z_t^{1:l}), which is parameterized by an MLP. To summarize, the overall generation process is described in Algorithm 6.1. The parameter set of the generation model is θ = {θ_x, θ_z, θ_s}. Given this, the joint probability of the MR-MTS and the latent states/switches can be factorized as in Equation (6.1) below.
p_θ(x_{1:T}^{1:L}, z_{1:T}^{1:L}, s_{1:T}^{2:L} | z_0^{1:L}) = p_θ(x_{1:T}^{1:L} | z_{1:T}^{1:L}) p_θ(z_{1:T}^{1:L}, s_{1:T}^{2:L} | z_0^{1:L})
  = ∏_{t=1}^{T} p_θ(x_t^{1:L} | z_t^{1:L}) · ∏_{t=1}^{T} p_θ(z_t^{1:L}, s_t^{2:L} | z_{t-1}^{1:L})
  = ∏_{t=1}^{T} ∏_{l=1}^{L} p_{θ_x}(x_t^l | z_t^{1:l}) · ∏_{t=1}^{T} p_{θ_z}(z_t^1 | z_{t-1}^1) · ∏_{t=1}^{T} ∏_{l=2}^{L} p_{θ_s}(s_t^l | z_{t-1}^l, z_t^{l-1}) p_{θ_z}(z_t^l | z_{t-1}^l, z_t^{l-1}, s_t^l)    (6.1)

In order to obtain the parameters of MR-HDMM, we need to maximize the log marginal likelihood over all MR-MTS data points, which is the sum of the log marginal likelihoods L(θ) = log p_θ(x_{1:T}^{1:L} | z_0^{1:L}) of each MR-MTS data point x_{1:T}^{1:L}. The log marginal likelihood of one data point is obtained by integrating out all possible z and s in Equation (6.1). Since the s are deterministic binary variables, integrating them out can be done straightforwardly by taking their values in the likelihood. However, the stochastic variables z cannot be analytically integrated out. Thus, we resort to the well-known variational principle [43] and introduce our inference network below.

Algorithm 6.1: Generation model of MR-HDMM
1: Initialize z_0^{1:L} ∼ N(0, I)
2: for t = 1, ..., T do
3:   (μ_t^1, Σ_t^1) = g_{θ_z}(z_{t-1}^1)
4:   z_t^1 ∼ N(μ_t^1, Σ_t^1)   {Transition of the first layer.}
5:   for l = 2, ..., L do
6:     s_t^l = I[g_{θ_s}(z_{t-1}^l, z_t^{l-1}) ≥ 0]
7:     (μ_t^l, Σ_t^l) = g_{θ_z}(z_{t-1}^l, z_t^{l-1}) if s_t^l = 1, else (μ_{t-1}^l, Σ_{t-1}^l)
8:     z_t^l ∼ N(μ_t^l, Σ_t^l)   {Transition of the other layers.}
9:   end for
10:  for l = 1, ..., L do
11:    (μ(x)_t^l, Σ(x)_t^l) = g_{θ_x}(z_t^{1:l})
12:    x_t^l ∼ N(μ(x)_t^l, Σ(x)_t^l)   {Emission.}
13:  end for
14: end for

6.3.2 Inference Network

We design our inference network to mimic the structure of the generative model. The goal is to obtain an objective that can be optimized easily and that makes learning the model parameters amenable. Instead of directly maximizing L(θ) with respect to θ, we build an inference network with a tractable distribution q_φ and maximize the variational evidence lower bound (ELBO) F(θ, φ) ≤ L(θ) with respect to both θ and φ.
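The generation process of Algorithm 6.1 can be sketched in a few lines. Plain linear maps stand in for the learned mappings (MR-HDMM uses an MLP for g_{θ_s}, GRUs for g_{θ_z}, and an MLP for g_{θ_x}), and all dimensions are illustrative, not the dissertation's settings:

```python
import numpy as np

rng = np.random.default_rng(0)
L, T, dz = 3, 5, 4          # layers, time steps, latent size (toy values)

# Toy stand-ins for the learned mappings g_{theta_s}, g_{theta_z}, g_{theta_x}.
w_s = rng.standard_normal((L, 2 * dz))                 # switch scorer
W_z = rng.standard_normal((L, dz, 2 * dz)) * 0.1       # transition mean
w_x = [rng.standard_normal(dz * (l + 1)) for l in range(L)]  # emission mean

z = rng.standard_normal((L, dz))                       # z_0^{1:L} ~ N(0, I)
mu, sig = np.zeros((L, dz)), np.ones((L, dz))
xs = []
for t in range(T):
    # layer 1 (index 0) always updates
    mu[0] = W_z[0] @ np.concatenate([z[0], z[0]])
    z[0] = mu[0] + sig[0] * rng.standard_normal(dz)
    for l in range(1, L):
        inp = np.concatenate([z[l], z[l - 1]])         # (z_{t-1}^l, z_t^{l-1})
        if w_s[l] @ inp >= 0:                          # switch on: update
            mu[l] = W_z[l] @ inp
        # switch off: reuse previous (mu[l], sig[l]) unchanged
        z[l] = mu[l] + sig[l] * rng.standard_normal(dz)
    # emission with auxiliary connections: x_t^l depends on z_t^{1:l}
    xs.append([float(w_x[l] @ z[: l + 1].ravel()) for l in range(L)])

assert len(xs) == T and len(xs[0]) == L
```

The key design point is visible in the inner loop: a layer's distribution parameters change only when its switch fires, so higher layers evolve more slowly and summarize the layers below.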
Note that φ is the parameter set of the inference network, which is formally defined at the end of this section. The lower bound can be written as (please refer to the supplementary materials for the full derivation):

F(θ, φ) = E_{q_φ}[log p_θ(x_{1:T}^{1:L} | z_{0:T}^{1:L})] − D_KL(q_φ(z_{1:T}^{1:L}, s_{1:T}^{2:L} | x_{1:T}^{1:L}, z_0^{1:L}) ‖ p_θ(z_{1:T}^{1:L}, s_{1:T}^{2:L} | z_0^{1:L}))    (6.2)

where the expectation in the first term is under q_φ(z_{1:T}^{1:L} | x_{1:T}^{1:L}, z_0^{1:L}). To get a tight bound and an accurate estimate from our MR-HDMM, we need to properly design a new inference network, since the existing inference networks from SRNN [21] or DMM [48] are not applicable to MR-MTS. In the following, we show how we design the inference network (Figure 6.1) to obtain a well-structured approximation to the posterior. First, we maintain the Markov properties of z in the inference network, which leads to the factorization:

q_φ(z_{1:T}^{1:L}, s_{1:T}^{2:L} | x_{1:T}^{1:L}, z_0^{1:L}) = ∏_{t=1}^{T} q_φ(z_t^{1:L}, s_t^{2:L} | z_{t-1}^{1:L}, x_{1:T}^{1:L})    (6.3)

We then leverage the hierarchical structure and inherit the switches from the generation model into the inference network. That is, the same g_{θ_s} from the generation model is used in the inference network, i.e., q_φ(s_t^l | z_{t-1}^l, z_t^{l-1}, x_{1:T}^{1:L}) = q_{φ_s}(s_t^l | z_{t-1}^l, z_t^{l-1}) = p_{θ_s}(s_t^l | z_{t-1}^l, z_t^{l-1}). Then, for each term on the right-hand side of Equation (6.3) and for all t = 1, ..., T, we have:

q_φ(z_t^{1:L}, s_t^{2:L} | z_{t-1}^{1:L}, x_{1:T}^{1:L})
  = q_φ(z_t^1 | z_{t-1}^1, x_{1:T}^{1:L}) · ∏_{l=2}^{L} q_φ(s_t^l | z_{t-1}^l, z_t^{l-1}, x_{1:T}^{1:L}) q_φ(z_t^l | z_{t-1}^l, z_t^{l-1}, s_t^l, x_{1:T}^{1:L})
  = q_φ(z_t^1 | z_{t-1}^1, x_{1:T}^{1:L}) · ∏_{l=2}^{L} p_{θ_s}(s_t^l | z_{t-1}^l, z_t^{l-1}) q_φ(z_t^l | z_{t-1}^l, z_t^{l-1}, s_t^l, x_{1:T}^{1:L})    (6.4)

Thus, the inference network can be factorized by Equations (6.3) and (6.4); the generative model can likewise be factorized based on Equation (6.1).
Given these, we further factorize the ELBO in Equation (7.8) as a summation of expectations of conditional log likelihood and KL divergence terms over time steps and hierarchical layers: 81 F(θ,φ) = T X t=1 L X l=1 E Q ∗ (z 1:l t ) logp θx x l t |z 1:l t + T X t=1 E Q ∗ (z 1 t−1 ) D KL q φ z 1 t |x 1:L 1:T ,z 1 t−1 p θ z 1 t |z 1 t−1 + T X t=1 L X l=2 E Q ∗ (z 1 t−1 ,z l−1 t ) D KL q φ z l t |x 1:L 1:T ,z l t−1 ,z l−1 t p θ z 1 t |z 1 t−1 ,z l−1 t (6.5) whereQ ∗ (·) denotes the marginal distribution of (·) from q φ . The details about the factorization and the marginalized distribution are provided in the supplementary materials. Parameterization of inference network We parameterize the inference net- work and construct the variational approximation q φ z l t |z l t−1 ,z l−1 t ,s l t ,x 1:L 1:T used in Equation 6.5 by deep learning models. First, we use L RNNs to capture MR-MTS with L different sampling rates such that each rate is modeled by one RNN model separately. Second, to obtain lth latent statesz l t of the inference network at time step t, we not only use the previous latent states z l t−1 and the lower layer latent statesz l−1 t but also take the lth RNN output denoted byh l t as an input. Third, we reuse the same latent state distribution and switch mechanism from the generation model to generate z of the inference network. To be more specific, z l t is drawn from a multivariate normal distribution, where the mean and covariance are reused from those of z l t−1 if s l t = 1 and l > 1, otherwise the mean and covariance are modeled by gated recurrent units (GRU) with input h h l t ,z l t−1 ,z l−1 t i . The choice of the RNN models forh l t affects what and how the information at other time steps is considered in the approximation at time t, i.e. the form of q φ z l t |z l t−1 ,z l−1 t ,s l t ,x 1:L 1:T . 
Inspired by [49], we construct the variational approximation in three settings (filtering, smoothing, bi-direction) for the forecasting and interpolation tasks. In the filtering setting, we only consider the information up to time t (i.e., x_{1:t}^{1:L}) using forward RNNs. By doing this, we have h_t^l = h_t^{l,forward} = RNN_forward(h_{t-1}^{l,forward}, x_t^l), and thus q_φ(z_t^l | z_{t-1}^l, z_t^{l-1}, s_t^l, x_{1:T}^{1:L}) = q_φ(z_t^l | z_{t-1}^l, z_t^{l-1}, s_t^l, x_{1:t}^{1:L}). The filtering setting does not use future information, so it is suitable for forecasting at future time steps t' > T. For interpolation tasks, we can use backward RNNs to utilize the information after time t (i.e., x_{t:T}^{1:L}), with h_t^l = h_t^{l,backward} = RNN_backward(h_{t+1}^{l,backward}, x_t^l), or bi-directional RNNs to utilize information across all time steps (i.e., x_{1:T}^{1:L}) at any time t, with h_t^l = [h_t^{l,forward}, h_t^{l,backward}]. These two models lead to the smoothing and bi-direction settings, respectively. We use φ_h and φ_z to denote the parameter sets related to h and z, respectively, and φ = {φ_h, φ_z, φ_s = θ_s} to represent the parameter set of the inference network.

6.3.3 Learning the Parameters

We jointly learn the parameters (θ, φ) of the generative model p_θ and the inference network q_φ by maximizing the ELBO in Equation (6.5). The main challenge in the optimization is obtaining the gradients of all the terms under the correct expectation, i.e., E_{Q*}. We use stochastic backpropagation [46] to estimate all these gradients and train the model with stochastic gradient descent (SGD). We employ ancestral sampling to obtain the samples z: we draw all samples z sequentially from time 1 to T and from layer 1 to L. Given the samples from the previous layer l - 1 or the previous time t - 1, the new samples at time t and layer l are distributed according to the marginal distribution Q*.
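The two primitives used at every step of this procedure, the reparameterized (ancestral) draw and the closed-form KL between diagonal Gaussians, can be sketched as follows (a minimal NumPy sketch, not the dissertation's implementation):

```python
import numpy as np

def kl_diag_gauss(mu_q, var_q, mu_p, var_p):
    """Closed-form KL(N(mu_q, diag var_q) || N(mu_p, diag var_p)):
    each KL term in the ELBO of Equation (6.5) has this form, so its
    gradient can be computed analytically given the two parameter sets."""
    return 0.5 * np.sum(np.log(var_p / var_q)
                        + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

def sample_reparam(mu, var, rng):
    """Reparameterized draw z = mu + sqrt(var) * eps, so gradients can
    flow through mu and var (stochastic backpropagation)."""
    return mu + np.sqrt(var) * rng.standard_normal(mu.shape)

mu, var = np.zeros(4), np.ones(4)
assert kl_diag_gauss(mu, var, mu, var) == 0.0     # identical distributions
assert kl_diag_gauss(mu + 1.0, var, mu, var) > 0.0
```

In the learning loop, the inference network supplies (mu_q, var_q), the generation model supplies (mu_p, var_p), and the sample from q feeds the next time step and layer.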
Notice that all terms D_KL(q_φ(z_t^l | ·) ‖ p_θ(z_t^l | ·)) in Equation (6.5) are KL divergences between two multivariate Gaussian distributions, and p_{θ_x}(x_t^l | z_t^{1:l}) is also a multivariate Gaussian distribution. Thus, all the required gradients can be estimated analytically from the samples drawn in the way proposed above. Algorithm 6.2 shows the overall learning procedure.

Algorithm 6.2: Learning MR-HDMM with stochastic backpropagation and SGD
Input: X: a set of MR-MTS of L sampling rates; initial (θ, φ)
1: while not converged do
2:   Choose a random minibatch of MR-MTS X' ⊂ X
3:   for each sample x_{1:T}^{1:L} ∈ X' do
4:     Compute h_{1:T}^{1:L} by the inference network φ_h on input x_{1:T}^{1:L}
5:     Sample ẑ_0^{1:L} ∼ N(0, I)
6:     for t = 1, ..., T do
7:       Estimate μ_t^{1(φ)}, Σ_t^{1(φ)} by φ_z, and μ_t^1, Σ_t^1 by θ_z, given the samples ẑ_{t-1}^1 and h_t^1
8:       Based on μ_t^{1(φ)}, Σ_t^{1(φ)}, μ_t^1, Σ_t^1, compute the gradient of D_KL(q_φ(z_t^1 | ·) ‖ p_θ(z_t^1 | ·))
9:       Sample ẑ_t^1 ∼ N(μ_t^{1(φ)}, Σ_t^{1(φ)})
10:      for l = 2, ..., L do
11:        Compute s_t^l by θ_s from the samples ẑ_{t-1}^l and ẑ_t^{l-1}
12:        Estimate μ_t^{l(φ)}, Σ_t^{l(φ)} by φ_z, and μ_t^l, Σ_t^l by θ_z, given the samples ẑ_{t-1}^l, ẑ_t^{l-1}, s_t^l, and h_t^l
13:        Based on μ_t^{l(φ)}, Σ_t^{l(φ)}, μ_t^l, Σ_t^l, compute the gradient of D_KL(q_φ(z_t^l | ·) ‖ p_θ(z_t^l | ·))
14:        Sample ẑ_t^l ∼ N(μ_t^{l(φ)}, Σ_t^{l(φ)})
15:      end for
16:      Compute the gradient of log p_{θ_x}(x_t^l | ẑ_t^{1:l})
17:    end for
18:  end for
19:  Update (θ, φ) using all gradients
20: end while

6.4 Experiments

We conducted experiments on two real-world datasets - the MIMIC-III healthcare dataset and the USHCN climate dataset - and answer the following questions: (a) How does our proposed model perform compared to existing state-of-the-art approaches? (b) To what extent are the proposed learnable hierarchical latent structure and auxiliary connections useful for modeling the data generation process? (c) How do we interpret the hierarchy learned by the proposed model?
In the remainder of this section, we describe the datasets, methods, empirical results, and interpretations that answer the above questions.

6.4.1 Datasets and Experimental Design

MIMIC-III dataset MIMIC-III is a public de-identified dataset collected at Beth Israel Deaconess Medical Center from 2001 to 2012 [41]. It contains over 58,000 hospital admission records of 38,645 adults and 7,875 neonates. For our experiments, we chose 10,709 adult admission records and extracted 62 temporal features from the first 72 hours. These features had one of three sampling rates: 1 hour, 4 hours, or 12 hours. To fill in any missing entries in our dataset, we used forward or linear imputation, similar to Che et al. [6]. To ensure a fair comparison, we only evaluate and compare all the models on the original time series (i.e., non-imputed data). Our main tasks on the MIMIC-III dataset are forecasting on time series of all rates and interpolation of the low-rate time series values.

USHCN climate dataset The U.S. Historical Climatology Network Monthly (USHCN) dataset [64] is publicly available and consists of daily meteorological data from 54 stations in California spanning from 1887 to 2009. It has five climate variables for each station: a) daily maximum temperature, b) daily minimum temperature, c) whether it was a snowy day or not, d) total daily precipitation, and e) daily snow precipitation. We preprocessed this dataset to extract daily climate data for 100 consecutive years starting from 1909. To get multi-rate time series data, we extract 208 features and split all features into 3 groups with sampling rates of 1 day, 5 days, and 10 days, respectively. This public dataset has been carefully processed by the National Oceanic and Atmospheric Administration (NOAA) to ensure quality control, and it has no missing entries. Our tasks on this dataset are climate forecasting on all features and interpolation on the 5-day and 10-day sampled data.
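The multi-rate construction used above can be illustrated with a toy preprocessing sketch. The values are synthetic, and the feature-group sizes are hypothetical; only the shapes mirror the setup:

```python
import numpy as np

# Toy version of the USHCN-style preprocessing: split daily features
# into three groups and subsample them to 1-day / 5-day / 10-day rates.
rng = np.random.default_rng(0)
daily = rng.standard_normal((100, 12))            # 100 days, 12 features

groups = [daily[:, :6], daily[:, 6:10], daily[:, 10:]]
rates = [1, 5, 10]
mr_mts = [g[::r] for g, r in zip(groups, rates)]  # keep every r-th day

assert [m.shape for m in mr_mts] == [(100, 6), (20, 4), (10, 2)]
```

The resulting list-of-arrays is exactly the MR-MTS format consumed by the model: the densest group keeps every day, while the slower groups retain every 5th and 10th observation.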
Tasks We use the proposed MR-HDMM on two prediction tasks: multi-rate time series forecasting and low-rate time series interpolation. Since both datasets have 3 different sampling rates, we use HSR/MSR/LSR to denote the high/medium/low sampling rates, respectively.

• Forecasting: Predict the future multivariate time series based on its history. For the MIMIC-III dataset, we predict the last 24 hours of time series based on the first (previous) 48 hours of data. For the USHCN dataset, we forecast the climate for the next 30 days based on the observations of the previous year.

• Interpolation: Fill in the low-rate time series based on co-evolving higher-rate time series data. For the MIMIC-III dataset, we down-sampled 8 features from MSR to LSR and then performed an interpolation task by up-sampling these 8 features back to MSR. For the USHCN dataset, the interpolation task involved up-sampling the MSR and LSR features to HSR, i.e., up-sampling the 5-day and 10-day data to 1-day. We demonstrate in-sample interpolation (i.e., interpolation within the training dataset) and out-sample interpolation (i.e., interpolation in the testing dataset) on the MIMIC-III dataset, and in-sample interpolation on the USHCN dataset.

Baselines We compare MR-HDMM with several strong baselines on these two tasks. Additionally, to show the advantage of the learnable hierarchical latent structure and auxiliary connections, we simplify MR-HDMM into two other models for comparison: (a) Multi-Rate Deep Markov Models (MR-DMM), which removes the hierarchical structure in latent space; (b) Hierarchical Deep Markov Models (HDMM), which drops the auxiliary connections between the lower-rate time series and higher-level latent layers. MR-DMM and HDMM are discussed in the supplementary materials.
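As an illustration of the multi-rate setup, and of the linear up-sampling later used to feed single-rate baselines, here is a toy NumPy sketch with a hypothetical signal on the MIMIC-III-style 1 h / 4 h / 12 h grid:

```python
import numpy as np

# Hypothetical HSR signal: one value per hour for 72 hours.
t_hsr = np.arange(72.0)
x_hsr = np.sin(t_hsr / 6.0)

# Down-sample to MSR (every 4 h) and LSR (every 12 h) by block means.
x_msr = x_hsr.reshape(-1, 4).mean(axis=1)    # 18 values
x_lsr = x_hsr.reshape(-1, 12).mean(axis=1)   # 6 values
t_msr = t_hsr.reshape(-1, 4).mean(axis=1)
t_lsr = t_hsr.reshape(-1, 12).mean(axis=1)

# Naive baseline for single-rate models: linearly interpolate the
# low-rate series back onto the high-rate grid.
x_lsr_up = np.interp(t_hsr, t_lsr, x_lsr)
print(x_msr.shape, x_lsr.shape, x_lsr_up.shape)  # (18,) (6,) (72,)
```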
For the forecasting tasks, we compare MR-HDMM with the following baseline models:

• Single-rate: Kalman Filters (KF), Vector Auto-Regression (VAR), Long Short-Term Memory (LSTM) [32], Phased-LSTM (PLSTM) [68], Deep Markov Models (DMM) [48], and Hierarchical Multiscale Recurrent Neural Networks (HM-RNN) [10].

• Multi-rate: Multiple Kalman Filters (MKF) [16], Multi-rate Kalman Filters (MR-KF) [83], Multi-Rate Deep Markov Models (MR-DMM), and Hierarchical Deep Markov Models (HDMM).

For the interpolation task, we compare MR-HDMM with the following baseline models:

• Imputation methods: Mean imputation (Simple-Mean), Cubic Spline (CubicSpline) [14], Multiple Imputation by Chained Equations (MICE) [97], MissForest [91], SoftImpute [61].

• Deep learning models: Deep Markov Models (DMM), Multi-Rate Deep Markov Models (MR-DMM), and Hierarchical Deep Markov Models (HDMM).

6.4.2 Quantitative Results

We evaluate our MR-HDMM on the following: (a) Forecasting: we generate the next latent state using the learned transition distribution and then generate observations from these new latent states; (b) Interpolation: we use the mode of the approximated posterior in the generation model to generate the unseen data in the low-rate time series; (c) Inference: we take the multi-rate time series as input to obtain the approximate posterior of the latent states.

For the generation model in MR-HDMM, we use multivariate Gaussians with diagonal covariance for both the emission distribution and the transition distribution. We parameterized the emission mapping g_θx by a 3-layer MLP with ReLU activations, the transition mapping g_θz by a gated recurrent unit (GRU), and the mapping g_θs by a 3-layer MLP with ReLU activations on the hidden layers and linear activations on the output layer. For the inference networks, we adopt the filtering setting for forecasting and the bidirectional setting for interpolation from Table ?? with 3-layer GRUs.
To update θ_s, we replace the sign function with a sharp sigmoid function during training and use the indicator function during validation. The single-rate baseline models cannot handle multi-rate data directly, so we up-sample all the lower-rate data to the higher rate using linear interpolation. We use the stats-toolbox [85] in Python for the VAR model implementation and pykalman [17] to implement all the KF-based models. The implementation details of the KF-based methods are discussed in the supplementary materials. For the LSTM and PLSTM models, we use one layer with 100 neurons to model the time series and then apply a soft-max regressor on top of the last hidden state to do regression. To ensure a fair comparison, we use roughly the same number of parameters for all models.

For experiments on the USHCN dataset, the train/valid/test sets were split 70/10/20. For experiments on MIMIC-III, we used 5-fold cross-validation (train on 3 folds, validate on another fold, and test on the remaining fold) and report the average Mean Squared Error (MSE) of 5 runs for both the forecasting and interpolation tasks. Note that we train all the deep learning models with the Adam optimization method [45], use a validation set to find the best weights, and report the results on the held-out test set. All the input variables are normalized to have 0 mean and 1 standard deviation.

6.4.3 Quantitative Results

Figure 6.3: Interpretable latent space learned by the MR-HDMM model. (Upper) Hierarchical structure captured in the first 48 hours of an admission in the MIMIC-III dataset by the switch states of MR-HDMM. (Bottom) Hierarchical structure (red & blue blocks) captured along with the precipitation time series (green curve) in the one-year observation window in the USHCN dataset by the switch states of MR-HDMM.

Forecasting Tables 6.1 and 6.2 respectively show the forecasting results on the MIMIC-III and USHCN datasets in terms of MSE.
Our proposed MR-HDMM outperforms all the competing multi-rate latent space models by at least 5% and beats the single-rate models by at least 15% on both datasets with all features. Our model also performs the best on the single-rate HSR and MSR forecasting tasks and performs well on the LSR forecasting task on both the MIMIC-III and USHCN datasets.

Table 6.1: Forecasting results (MSE) on MIMIC-III.
           All          HSR          MSR         LSR
KF         1.91×10^18   3.34×10^18   8.38×10^9   1.22×10^6
VAR        1.233        1.735        0.779       0.802
DMM        1.530        1.875        1.064       1.070
HM-RNN     1.388        1.846        0.904       0.713
LSTM       1.512        1.876        1.006       1.036
PLSTM      1.244        1.392        1.030       1.056
MKF        2.05×10^18   3.58×10^18   3.63×10^4   9.54×10^2
MR-KF      1.691        2.289        0.944       0.860
MR-DMM     1.061        1.192        0.723       1.065
HDMM       1.047        1.168        0.702       1.076
MR-HDMM    0.996        1.148        0.678       0.911

Table 6.2: Forecasting results (MSE) on USHCN.
           All     HSR     MSR     LSR
KF         1.236   1.254   1.190   1.148
VAR        2.415   2.579   1.921   1.748
DMM        0.795   0.608   0.903   0.877
HM-RNN     0.692   0.594   1.151   0.775
LSTM       0.849   0.688   0.934   0.928
PLSTM      0.813   0.710   0.870   0.915
MKF        1.212   1.082   1.727   1.518
MR-KF      0.628   0.542   0.986   0.799
MR-DMM     0.667   0.611   0.847   0.875
HDMM       0.626   0.568   0.815   0.836
MR-HDMM    0.591   0.541   0.742   0.795

Interpolation Table 6.3 shows the interpolation results on the two datasets. Since VAR and LSTM cannot be directly used for the interpolation task, we focus on evaluating generative models and imputation methods. From Table 6.3, we observe that our proposed model outperforms the baselines and the competing multi-rate latent space models by a large margin on all the interpolation tasks on these two datasets.

Table 6.3: Interpolation results (MSE) on MIMIC-III and USHCN.
             MIMIC-III               USHCN
             In-sample   Out-sample  In-sample
Simple-Mean  3.812       3.123       0.987
CubicSpline  3.713       3.212×10^4  0.947
MICE         3.747       7.580×10^2  0.670
MissForest   3.863       3.027       0.941
SoftImpute   3.715       3.086       0.759
DMM          3.714       3.027       0.782
MR-DMM       3.710       3.021       0.696
HDMM         3.790       3.100       0.750
MR-HDMM      3.582       2.921       0.626
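The reported metric — MSE computed on variables normalized to zero mean and unit standard deviation using training-set statistics — can be sketched as follows (a minimal illustration with made-up values; the helper name is ours):

```python
import numpy as np

def normalized_mse(y_true, y_pred, mean, std):
    """MSE after scaling both series with (training-set) mean and std."""
    yt = (np.asarray(y_true) - mean) / std
    yp = (np.asarray(y_pred) - mean) / std
    return float(np.mean((yt - yp) ** 2))

# A constant error of 1.0 on a series with std sqrt(8/3) gives 3/8.
y = np.array([2.0, 4.0, 6.0])
print(normalized_mse(y, y + 1.0, mean=y.mean(), std=y.std()))  # ≈ 0.375
```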
6.4.4 Discussion

In all our experiments, MR-HDMM outperforms the other generative models by a significant margin. Considering that all the deep generative models have the same number of parameters, this improvement empirically demonstrates the effectiveness of our proposed learnable latent hierarchical structure and auxiliary connections.

In Figure 6.3, we visualize the latent hierarchical structure of MR-HDMM learned from the first 48 hours of an admission in the MIMIC-III dataset and one-year climate observations in the USHCN dataset. A colored block indicates that the latent state z_t^l is updated from z_{t-1}^l and z_t^{l-1} (update), while a white block indicates that z_t^l is generated from the same distribution as z_{t-1}^l (reuse). As expected, the higher latent layers tend to update less frequently and capture the long-term temporal dependencies. To understand the learned hierarchical structure more intuitively, we also show the precipitation time series from the USHCN dataset along with the learned switches in Figure 6.3. We observe that the higher latent layer tends to update along with the precipitation, which is reasonable since precipitation makes significant changes to the underlying weather condition captured by the higher latent layer.

6.5 Summary

We proposed the Multi-Rate Hierarchical Deep Markov Model (MR-HDMM) - a novel deep generative model for forecasting and interpolation tasks on multi-rate multivariate time series (MR-MTS) data. MR-HDMM models the data generation process by learning a latent hierarchical structure with auxiliary connections and learnable switches to capture the temporal dependencies. Experiments show that the proposed model achieves competitive results on both forecasting and interpolation tasks on both healthcare and climate datasets.
Chapter 7
Continuous-time Time Series Counterfactual Inference with Augmented Differential Equations

In this chapter, we focus on the problem of time series counterfactual inference with hidden confounders - unobserved variables that affect intervention assignment. For counterfactual inference models that assume there are no hidden confounders, the presence of hidden confounders leads to biased predictions [4]. Here we propose a model based on controlled differential equations to handle the hidden confounder problem in time series counterfactual inference. There are three main advantages of the proposed model:

• With an augmented observational space and latent state assumption, the proposed model can perform time series counterfactual inference with lower bias at the cost of higher variance.

• By parameterizing controlled differential equations with neural networks, the proposed model treats a time series as a continuous-time process and thus can naturally handle irregularly-spaced observations and missing values.

• Further, the proposed model provides a flexible framework to incorporate domain knowledge of hidden confounders, leading to more accurate and interpretable counterfactual inference.

We have evaluated the proposed method on a tumor growth simulation (time series of tumor size) under various interventions of chemotherapy and radiotherapy. The inference results show that the proposed model outperforms competitive baselines that do not consider hidden confounders. As next steps, we are going to test the proposed method in a real-world setting - the progression of the medical condition of sepsis patients under various treatment plans. Beyond the purely data-driven mode, we will further develop the proposed method to incorporate domain knowledge of hidden confounders for counterfactual inference.

7.1 A Motivating Example

To show why the presence of hidden confounders introduces bias for counterfactual inference, let us consider treatment effect estimation for cancer patients.
Usually, there are multiple prescribed treatments, including chemotherapy and radiotherapy, for each patient. The treatment plan is adjusted based on changes in tumor characteristics, drug resistance, toxicity levels of drugs, etc. [50, 95]. However, some factors, such as drug resistance, may not be recorded in the electronic health records, or may not be measured in practice. In these cases, estimating chemotherapy effects on cancer progression with data-driven methods that do not account for drug resistance will lead to biased results.

7.2 Related Work

The problem of hidden confounders was first studied in the static setting. Wang and Blei [96] developed a theory for adjusting the bias introduced by the presence of hidden confounders in observational data. They found that the dependencies among multiple causes can be used to infer latent variables that act as substitutes for the hidden confounders. In this proposal, we are interested in estimating hidden confounders in the time series setting, which is much more complicated than the static setting: not only may the hidden confounders evolve over time, but they may also be affected by previous interventions. On the other hand, most existing work on time series counterfactual inference, including the Counterfactual Gaussian Process (CGP) [84] and Recurrent Marginal Structural Networks (R-MSNs) [54], assumes there are no hidden confounders, i.e., that all variables affecting the intervention plan and the potential outcomes are observed, which is not testable in practice and not true in many cases. Recently, [4] drew on the main idea of [96] and studied the deconfounding of time series. However, their proposed method is based on recurrent neural networks, which work with discrete and regular timesteps and only support one-step-ahead inference.
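Before formalizing the problem, the continuous-time mechanism underlying the proposed approach can be illustrated with a toy Euler discretization of a controlled differential equation dz = f(z) dX, where the control path X(t) stacks time with the irregularly observed data. Everything below — the dimensions, the vector field f, and the random path — is a hypothetical sketch, not the learned model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Irregularly spaced observation times; the control path X(t) stacks
# time itself with one observed covariate (a common neural-CDE choice).
t = np.sort(rng.uniform(0.0, 5.0, 20))
obs = np.cumsum(rng.normal(0.0, 0.1, 20))
X = np.column_stack([t, obs])            # shape (20, 2)

# Hypothetical vector field f: latent z in R^3 -> matrix in R^{3x2}.
W = rng.normal(0.0, 0.3, (3, 2))
def f(z):
    return np.tanh(z)[:, None] * W

# Euler discretization of dz = f(z) dX along the observed path;
# irregular spacing enters only through the increments dX.
z = 0.1 * np.ones(3)
for k in range(1, len(t)):
    z = z + f(z) @ (X[k] - X[k - 1])
print(z.shape)  # (3,)
```

Because the latent state is driven by path increments rather than by a fixed step index, no imputation onto a regular grid is needed.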
7.3 Problem Formulation

Consider a multivariate time series {X_s : s ≤ t}, in which X_s ∈ R^m is the random variable of time-dependent covariates, and a sequence of interventions {A_s : s ≤ t}, in which A_s ∈ R^v belongs to one of the possible interventions. The observational data consist of N realizations of the above-mentioned random variables, {T^i}_{i=1}^N, in which each T^i = {x_s^i, a_s^i, y_s^i : s ≤ t}. Given that the N realizations are independent of each other, we omit the superscript i in the following for simplicity.

Let x_{≤t} = {x_s : s ≤ t} be the history of covariates and a_{<t} = {a_s : s < t} be the history of interventions up to time t. Let a_{≥t} = {a_s : s ≥ t} be the future interventions. We would like to infer the potential outcome under future interventions given all historical information, for any potential intervention plan a_{≥t}:

    p(X(a_{≥t}) | a_{<t}, x_{≤t})    (7.1)

On the other hand, we can fit a regression model to estimate p(X | a_{≥t}, a_{<t}, x_{≤t}) from observational data. For cases without hidden confounders, this leads to an unbiased estimation of the potential outcome, p(X(a_{≥t}) | a_{<t}, x_{≤t}) = p(X | a_{≥t}, a_{<t}, x_{≤t}), under certain assumptions including sequential strong ignorability:

    X(a_{≥t}) ⊥⊥ A_t | a_{<t}, x_{≤t}    (7.2)

for all possible intervention plans a_{≥t} [20]. This condition holds if there are no hidden confounders, but it cannot be tested in practice, given that counterfactuals are never observed. In the presence of hidden confounders, the above assumption is no longer valid and

    p(X | a_{≥t}, a_{<t}, x_{≤t}) ≠ p(X(a_{≥t}) | a_{<t}, x_{≤t})    (7.3)

Consequently, existing methods that infer the conditional distribution p(X | a_{≥t}, a_{<t}, x_{≤t}) from observed data will lead to biased estimates of the potential outcome p(X(a_{≥t}) | a_{<t}, x_{≤t}).
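The gap stated in (7.3) can be made concrete with a small simulation (all numbers are hypothetical): a binary hidden confounder u makes treatment a more likely and also raises the outcome x, so the naive regression contrast overstates the true treatment effect of 1.0:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hidden confounder u raises both treatment probability and outcome.
u = rng.binomial(1, 0.5, n)
a = rng.binomial(1, 0.2 + 0.6 * u)              # treated more often when u = 1
x = 1.0 * a + 2.0 * u + rng.normal(0, 0.1, n)   # true treatment effect is 1.0

# Observational (regression) estimate vs. true interventional contrast.
obs_diff = x[a == 1].mean() - x[a == 0].mean()  # biased (absorbs u's effect)
do_diff = 1.0                                   # E[x|do(a=1)] - E[x|do(a=0)]
print(round(obs_diff, 2), do_diff)
```

Here the observational contrast converges to roughly 2.2, not 1.0, because conditioning on a also shifts the distribution of the unobserved u.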
[Figure: schematic of the proposed model, depicting the observed time series data x_1, ..., x_N at irregular times t_1, ..., t_N, auxiliary variables u_1, ..., u_N, the intervention process a(t), the continuous-time latent state z(t) with initial parameters μ_z0, Σ_z0, and the augmented time series x̄_i with hidden representations h_1, ..., h_N.]
Figure 7.1: The ACODE model with CDE-RNN encoder and CDE decoder. The CDE-RNN encoder first runs backwards-in-time to produce an approximate posterior over the initial latent state q(z_0 | {x_i, t_i}_{i=1}^N, a_{≤t}). Given a sample of z_0 and the intervention process a(t), we can generate the latent state at any point of interest, and further generate augmented time series observations.

7.4 Augmented Counterfactual Ordinary Differential Equations

To address the problem, the key is to reduce the inference bias caused by the presence of hidden confounders and to capture the underlying temporal dynamics and intervention effects. We propose a two-step method, called augmented counterfactual ordinary differential equations (ACODE), which first lifts the time series into an augmented space with additional dimensions and then models the augmented time series with neural-network-parameterized counterfactual differential equations.

7.4.1 Augmented time series

The proposed method first lifts the time series observations by introducing k auxiliary variables z_t ∈ R^k into x_t, resulting in the augmented time series \bar{x}_t = [x_t; z_t]. In this augmented space, we can safely assume

p(\bar{x}(a_{>t}) \mid a_{\leq t}, \{\bar{x}_i, t_i\}_{i=1}^{N}) = p(\bar{x}_{>t} \mid a_{>t}, a_{\leq t}, \{\bar{x}_i, t_i\}_{i=1}^{N})   (7.4)

The insight behind introducing auxiliary variables z_t comes from two aspects. The first centers on the bias-variance trade-off: modeling in the augmented space can reduce estimation bias at the cost of higher variance [75].
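Concretely, the lifting step amounts to concatenating k zero-initialized auxiliary dimensions onto each observation. A minimal NumPy sketch (the shapes here are illustrative, not from the thesis):

```python
import numpy as np

def augment(x, k):
    """Lift a time series x of shape (N, d) into the augmented space by
    appending k zero-initialized auxiliary dimensions, as in Section 7.4.1."""
    n = x.shape[0]
    z = np.zeros((n, k))                       # auxiliary variables z_t
    return np.concatenate([x, z], axis=1)      # augmented series [x_t; z_t]

x = np.random.randn(50, 3)                     # 50 observations, 3 covariates
x_aug = augment(x, k=5)                        # shape (50, 8)
```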
The second concerns the dimensionality of the space in which the underlying temporal dynamics operate. We want to approximate the underlying temporal dynamics with learnable mappings. Given the presence of hidden confounders, the true temporal dynamics operate in a space of higher dimensionality than the space of time-series observations. Therefore, additional dimensions can make it easier to approximate the true temporal dynamics. In general, the auxiliary variables z_t serve purely as a mathematical component without interpretable mechanistic meaning and are initialized with all-zero vectors. However, in some cases, the learned auxiliary variables z_t can provide interpretable insights about hidden confounders. We will show this later in the experiment on tumor growth simulation. Further, we may also leverage domain knowledge on hidden confounders and initialize them based on their dependency on observed covariates and interventions. This is especially useful in fields where we have a mechanistic understanding of hidden confounders even though they cannot be directly measured. For example, a patient's cardiac contractility (the heart's ability to squeeze blood), stroke volume, or systemic vascular resistance are not observed but can be inferred with domain knowledge.

7.4.2 Latent Counterfactual Differential Equations

Starting from the augmented time series, the proposed method models the intervention effects with latent counterfactual differential equations. Specifically, we assume there are latent states z_t representing the state of the time series. The latent states evolve under the control of both the baseline progress and the intervention effects:

z(t) = z(t_0) + \underbrace{\int_{t_0}^{t} f_z(z(s); \theta_z)\, ds}_{\text{Baseline progress}} + \underbrace{\int_{t_0}^{t} f_a(z(s), a(s); \theta_a)\, ds}_{\text{Intervention effects}}   (7.5)

x(t) \sim p(x(t) \mid z(t))   (7.6)

Equation 7.5 defines the counterfactual differential equations (CDEs), where both f_z and f_a are neural networks parameterized by θ_z and θ_a, respectively.
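To make Equation (7.5) concrete, the sketch below integrates the two terms with a fixed-step Euler scheme. The toy `f_z`, `f_a`, and step intervention `a_fn` are stand-ins for the learned networks and the real intervention process, and the fixed-step loop stands in for the adaptive solvers normally used with Neural ODEs:

```python
import numpy as np

def solve_cde(z0, f_z, f_a, a_fn, t_grid):
    """Euler discretization of the latent CDE in Eq. (7.5):
    dz/dt = f_z(z) + f_a(z, a(t))."""
    z, traj = z0.copy(), [z0.copy()]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        dt = t1 - t0
        z = z + dt * (f_z(z) + f_a(z, a_fn(t0)))  # baseline + intervention
        traj.append(z.copy())
    return np.stack(traj)

# Toy dynamics in place of the neural networks f_z, f_a (hypothetical):
f_z = lambda z: -0.5 * z                       # baseline decay
f_a = lambda z, a: a * np.ones_like(z)         # additive intervention effect
a_fn = lambda t: 1.0 if t >= 0.5 else 0.0      # step intervention at t = 0.5

t_grid = np.arange(101) / 100.0                # regular grid on [0, 1]
traj = solve_cde(np.zeros(2), f_z, f_a, a_fn, t_grid)
```

Before the intervention switches on, the latent state stays at its equilibrium; afterward it is driven away by the intervention term, which is exactly the additive decomposition in Eq. (7.5).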
Unlike previous methods such as [77], the proposed counterfactual differential equations continuously incorporate incoming interventions without interrupting the differential equation. Therefore, we can solve the CDEs using the same techniques as for Neural ODEs. Given an initial latent state z_0, the generation process for continuous-valued time series is summarized in Algorithm 7.1.

We use the variational autoencoder framework for model training and counterfactual inference. This requires estimating the approximate posterior q(z_0 | {x_i, t_i}_{i=1}^N, a_{≤t}). Inspired by [77], we use an RNN together with the CDE and propose the CDE-RNN model to incorporate the time series observations {x_i, t_i}_{i=1}^N during encoding. The proposed CDE-RNN, summarized in Algorithm 7.2, is an effective way to handle irregularly-sampled time series.

Algorithm 7.1: Generation process of the latent CDE model.
Input: A distribution of the initial latent state p(z_0); timestamps of interest {t_i}_{i=1}^N; continuous-time interventions a_{≤t}.
Output: Time series observations at the desired timestamps {(x_i, t_i)}_{i=1}^N and corresponding latent states {(z_i, t_i)}_{i=1}^N.
1: Sample z_0 ∼ p(z_0).
2: Compute z_1, ..., z_N via ODESolve(f_z, f_a, z_0, a_{≤t}, {t_i}_{i=1}^N)
3: for i = 1, ..., N do
4:     Compute μ_{x_i}, Σ_{x_i} = f_x(z_i; θ_x)
5:     Sample x_i ∼ N(μ_{x_i}, Σ_{x_i})
6: end for
7: return {(x_i, z_i)}_{i=1}^N

Algorithm 7.2: The CDE-RNN encoder for general cases.
Input: Time series observations and their timestamps {(x_i, t_i)}_{i=1}^N; the continuous-time intervention process a_{≤t}.
Output: Hidden states and their timestamps {(h_i, t_i)}_{i=1}^N
1: Set h_0 = 0
2: for i = 1, ..., N do
3:     Update h'_i = ODESolve(g_h, g_a, (t_{i-1}, t_i), h_{i-1}, {a(s) : t_{i-1} ≤ s < t_i})
4:     Update h_i = RNNCell(h'_i, x_i)
5: end for
6: return {h_i}_{i=1}^N

To obtain the approximate posterior of the initial latent state z_0 at time point t_0, we run the CDE-RNN encoder backwards-in-time from t_N to t_0.
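A runnable sketch of the CDE-RNN recursion in Algorithm 7.2: between observations the hidden state follows the controlled ODE dh/dt = g_h(h) + g_a(h, a(t)) (Euler-discretized here), and at each observation an RNN cell folds in x_i. The toy `g_h`, `g_a`, and tanh cell are placeholders for the learned networks:

```python
import numpy as np

def cde_rnn_encode(xs, ts, a_fn, h_dim, n_substeps=10):
    """Sketch of the CDE-RNN encoder (Algorithm 7.2) on an irregular grid."""
    g_h = lambda h: -0.1 * h                       # toy continuous dynamics
    g_a = lambda h, a: 0.1 * a * np.ones_like(h)   # toy intervention term
    def rnn_cell(h, x):                            # minimal stand-in RNN cell
        return np.tanh(h + np.mean(x))
    h, hs = np.zeros(h_dim), []
    for i in range(len(xs)):
        if i > 0:                                  # evolve h over (t_{i-1}, t_i)
            dt = (ts[i] - ts[i - 1]) / n_substeps
            t = ts[i - 1]
            for _ in range(n_substeps):
                h = h + dt * (g_h(h) + g_a(h, a_fn(t)))
                t += dt
            # h now plays the role of h'_i in Algorithm 7.2
        h = rnn_cell(h, xs[i])                     # fold in observation x_i
        hs.append(h.copy())
    return hs

ts = np.array([0.0, 0.3, 1.0, 1.4])                # irregular timestamps
xs = np.random.randn(4, 3)
hs = cde_rnn_encode(xs, ts, a_fn=lambda t: 0.0, h_dim=8)
```

Because the between-observation evolution is an ODE solve rather than a fixed-lag update, the same recursion handles arbitrary gap lengths, which is what makes the encoder suitable for irregular sampling.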
Then we represent the approximate posterior with Gaussian random variables depending on the final hidden state of the CDE-RNN:

q(z_0 \mid \{x_i, t_i\}_{i=1}^{N}, a_{\leq t}) = \mathcal{N}(\mu_{z_0}, \Sigma_{z_0})   (7.7)
where \mu_{z_0}, \Sigma_{z_0} = g_z(\text{CDE-RNN}_{\phi}(\{x_i, t_i\}_{i=1}^{N}, a_{\leq t}))

Here g_z is a neural network mapping the final hidden state of the CDE-RNN encoder into the mean and variance of the approximate posterior of z_0. Following the variational autoencoder framework, we jointly learn both the CDE-RNN encoder and the CDE decoder by maximizing the evidence lower bound (ELBO):

\mathcal{L}_{\text{ELBO}}(\theta, \phi) = \mathbb{E}_{z_0 \sim q(z_0 \mid \{x_i, t_i\}_{i=1}^{N}, a_{\leq t})}[\log p_{\theta}(x_1, ..., x_N)] - \mathrm{KL}[q(z_0 \mid \{x_i, t_i\}_{i=1}^{N}, a_{\leq t}) \,\|\, p(z_0)]   (7.8)

Even with incoming interventions, the whole model remains an ODE-based sequence-to-sequence model. Therefore, we use adjoint-based backpropagation for training [8]. The overall learning procedure of ACODE is summarized in Algorithm 7.3.

7.5 Experiments

We evaluate the proposed method with two experiments: a realistic tumor growth simulation [23] and a real-world large-scale dataset of ICU patients with sepsis, extracted from the MIMIC-III dataset [42]. In both experiments, continuous-time processes govern the time series and the reactions to interventions; what we observe are irregularly-spaced time series and interventions. Through the experiments, we answer the following questions: (1) How does the proposed model perform on counterfactual inference for irregularly-spaced time series, compared to existing state-of-the-art methods? (2) How does the number of auxiliary variables affect performance in the presence of hidden confounders? (3) Do the learned auxiliary variables provide any insight into the underlying time series generation mechanism?

Algorithm 7.3: Learning process of ACODE with variational inference.
Input: A set of time series along with continuous-time intervention processes D; initial parameter values (θ, φ).
1: while not converged do
2:     Choose a time series with timestamps {(x_i, t_i)}_{i=1}^N ∈ D and a continuous-time intervention process a_{≤t} ∈ D.
3:     Initialize auxiliary variables z_i with all-zero vectors for i = 1, 2, ..., N.
4:     Augment the time series: x_i ← [x_i; z_i] for i = 1, 2, ..., N.
5:     Initialize the hidden state h_N with an all-zero vector.
6:     for i = N−1, ..., 0 do
7:         Update h'_i = ODESolve(g_h, g_a, (t_i, t_{i+1}), h_{i+1}, {a(s) : t_i ≤ s < t_{i+1}})
8:         Update h_i = RNNCell(h'_i, x_i)
9:     end for
10:    Compute μ_{z_0}, Σ_{z_0} = g_z(h_0; φ_z)
11:    Sample z_0 ∼ N(μ_{z_0}, Σ_{z_0}).
12:    Compute z_1, ..., z_N = ODESolve(f_z, f_a, z_0, a_{≤t}, {t_i}_{i=1}^N)
13:    for i = 1, ..., N do
14:        Compute μ_{x_i}, Σ_{x_i} = f_x(z_i; θ_x)
15:        Sample x_i ∼ N(μ_{x_i}, Σ_{x_i})
16:    end for
17:    Compute the gradient of L_ELBO(θ, φ) as shown in Equation (7.8).
18:    Update (θ, φ) with the Adam optimizer.
19: end while

Baselines  We compare the proposed method with three competitive baselines on the time series counterfactual inference task: Counterfactual Gaussian Process (CGP) [84], Recurrent Marginal Structural Network (RMSN) [54], and Time Series Deconfounder with RMSN (TSD-RMSN) [4]. To demonstrate the effectiveness of the augmented space in ACODE, we also remove all auxiliary variables (k = 0) from ACODE for an ablation comparison.

Performance Criterion  We compute the root mean square error (RMSE) and normalized root mean square error (NRMSE) for each time series, averaged across the inference time horizon. For each experimental setting, we repeat 20 times and compute the standard deviation as a measure of inference variance.

Implementation Details  For baselines that only work with discrete-time interventions, we discretize the continuous-time intervention process at the same timestamps as the time series observations {(x_i, t_i)}_{i=1}^N, i.e., {a_i : a_i = a(t_i)}_{i=1}^N.
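For reference, the two evaluation metrics can be computed as below; the normalization by the ground-truth range is one common NRMSE convention (the thesis does not state which variant it uses):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error between a ground-truth and a predicted series."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def nrmse(y_true, y_pred):
    """RMSE normalized by the range of the ground truth (one common variant;
    normalization by the mean or standard deviation is also used)."""
    return rmse(y_true, y_pred) / (y_true.max() - y_true.min())

y_true = np.array([1.0, 2.0, 3.0, 4.0])
y_pred = np.array([1.0, 2.0, 3.0, 6.0])
```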
For baselines designed for regularly-spaced time series, such as RMSN and TSD-RMSN, we use Gaussian kernel interpolation as a bridge to transfer the irregularly-sampled time series to a regularly-spaced counterpart. We choose the best kernel parameters via cross-validation. We use Gaussian distributions with diagonal covariance for the distributions of the latent state z and the time-series observation x. All neural network mappings are parametrized with 3-layer MLPs and ReLU activations. For all neural-network-based methods, we use a similar number of parameters for a fair comparison. We randomly split each dataset into training/validation/test sets, and choose hyperparameters, e.g., the number of hidden factors of TSD-RMSN, based on the validation set.

7.5.1 Tumor Growth Simulation

To show the effectiveness of the proposed method, we first evaluate it in a simulated environment: the pharmacokinetic-pharmacodynamic (PK-PD) model of tumor growth under the effects of chemotherapy and radiotherapy proposed by Geng et al. [23].

Figure 7.2: Normalized RMSE curve of counterfactual inference for treatment response on tumor growth, over a time horizon of 1-5 days. Dashed lines represent the three baselines, while solid lines represent ACODEs with different numbers of auxiliary variables (k ∈ {0, 1, 3, 5}). Although ACODEs support continuous-time prediction, we evaluate all methods at discrete timestamps for a fair comparison.

The tumor volume after t days since diagnosis is modeled as follows:

\dot{V}(t) = \underbrace{\rho \log\!\left(\frac{K}{V(t)}\right)}_{\text{Tumor growth}} - \underbrace{\beta_c C(t)}_{\text{Chemotherapy}} - \underbrace{\left(\alpha_r d(t) + \beta_r d(t)^2\right)}_{\text{Radiotherapy}} + \underbrace{e_t}_{\text{Noise}}   (7.9)

where the parameters {K, ρ, β_c, α_r, β_r, e_t} are sampled as described in Geng et al. [23].
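A forward simulation of Equation (7.9) with a daily Euler step; the parameter values below are illustrative placeholders rather than the calibrated priors of Geng et al. [23]:

```python
import numpy as np

def simulate_tumor(V0, days, c_fn, d_fn, K=30.0, rho=7e-5, beta_c=0.03,
                   alpha_r=0.03, beta_r=0.003, noise_sd=0.0, dt=1.0, seed=0):
    """Euler simulation of the PK-PD tumor growth model in Eq. (7.9).
    c_fn(t) and d_fn(t) give chemotherapy concentration and radiation dose."""
    rng = np.random.default_rng(seed)
    V, traj = V0, [V0]
    for t in range(days):
        c, d = c_fn(t), d_fn(t)
        dV = (rho * np.log(K / V)                 # growth toward capacity K
              - beta_c * c                        # chemotherapy
              - (alpha_r * d + beta_r * d ** 2)   # radiotherapy
              + noise_sd * rng.standard_normal()) # e_t
        V = max(V + dt * dV, 1e-3)                # keep volume positive
        traj.append(V)
    return np.array(traj)

no_treat = simulate_tumor(1.0, 60, c_fn=lambda t: 0.0, d_fn=lambda t: 0.0)
treated  = simulate_tumor(1.0, 60, c_fn=lambda t: 1.0, d_fn=lambda t: 2.0)
```

Running the same dynamics under two treatment plans is precisely the counterfactual comparison the experiment evaluates: the untreated trajectory grows toward the carrying capacity while the treated one shrinks.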
Radiotherapy and chemotherapy prescriptions are modeled as Bernoulli random variables that depend on the tumor size V(t). Specifically, we assume the chemotherapy and radiotherapy prescriptions have probabilities p_c(t) and p_d(t), respectively, that are functions of the tumor diameter:

p_c(t) = \sigma\!\left(\frac{\gamma_c}{D_{\max}}\left(D(t) - \theta_c\right)\right)   (7.10)
p_d(t) = \sigma\!\left(\frac{\gamma_d}{D_{\max}}\left(D(t) - \theta_d\right)\right)   (7.11)

where D(t) is the average tumor diameter over the past 15 days, σ(·) is the sigmoid activation function, θ_c and θ_d are constant parameters, and γ controls the degree of time-dependent confounding.

Figure 7.3: t-SNE visualization of the learned auxiliary variable sequences z_t for all three patient types. As shown in the figure, the learned auxiliary variables can be clustered into three groups corresponding to the three patient types.

In the previous study on tumor growth [54], the prior means of β_c and α_r are adjusted according to three patient types, accounting for patient heterogeneity due to genetic features. Each patient group corresponds to a different parameter setting and thus affects the tumor growth and, subsequently, the treatment plan. In this experiment, we treat the tumor size as the time series observation x(t), the patient type as the hidden confounder, and the treatment plan of chemotherapy and radiotherapy as the intervention a(t). Our task is to predict the tumor growth progress under various treatment plans, without any information about patient types. To generate irregularly-spaced time series, we randomly sampled observation intervals between 1 and 5 days. We simulated data with 10000 patients for training, 1000 for validation, and 1000 for testing. We set the number of auxiliary variables in the augmented space to k ∈ {0, 1, 3, 5}, and evaluate treatment response inference on tumor size with NRMSE and standard deviations. Figure 7.2 shows the counterfactual inference accuracy on tumor size across a time horizon of 5 days.
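The confounded assignment mechanism of Equations (7.10)-(7.11) in code; `D_max = 13.0` and the other constants are illustrative placeholders, not the values used in the thesis:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def treatment_prob(D_bar, gamma, theta, D_max=13.0):
    """Time-dependent confounded treatment assignment, Eqs. (7.10)-(7.11):
    p(t) = sigmoid(gamma / D_max * (D_bar - theta)), where D_bar is the
    average tumor diameter over the past 15 days."""
    return sigmoid(gamma / D_max * (D_bar - theta))

# Larger average diameter -> higher chance of being treated:
p_small = treatment_prob(D_bar=2.0, gamma=10.0, theta=6.5)
p_large = treatment_prob(D_bar=12.0, gamma=10.0, theta=6.5)
assign = np.random.default_rng(0).random() < p_large   # Bernoulli draw
```

Because sicker patients (larger D_bar) are more likely to receive treatment, naive regression confounds treatment effect with disease severity; γ directly tunes how strong that confounding is.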
We observe that methods considering hidden confounders, including the proposed ACODE (k > 0) and TSD-RMSN, outperform baselines that assume all confounders are observed, such as CGP, RMSN, and ACODE (k = 0). Further, with the representation power of neural networks and differential equations, ACODE outperforms all other competitive baselines in terms of inference accuracy, especially for long-term inference. Although the auxiliary variables in ACODE can effectively reduce the inference bias caused by hidden confounders, too many auxiliary variables may introduce unnecessary variance without much gain in inference accuracy. As we can see, ACODE (k = 5) has accuracy similar to ACODE (k = 3) but suffers from higher variance.

So far we have been treating the auxiliary variables z_t as a purely mathematical component. Since we force the model to learn the system mechanism in the augmented space \bar{x}_t = [x_t; z_t], we would like to know whether any insight is learned by the auxiliary variables z_t. Therefore, we randomly choose 1500 patients and project the time series of auxiliary variables learned by ACODE (k = 3) onto a two-dimensional plane using t-SNE [59], as shown in Figure 7.3. As we can see, the learned auxiliary variables can be clustered into three groups corresponding to the three patient types. This pattern demonstrates the potential of ACODE to provide insights into hidden confounders via the learned auxiliary variables z_t.

7.5.2 Intensive Care of Patients with Sepsis

Next, we evaluate the proposed method in a real-world scenario without a full understanding of the hidden confounders: electronic health records of sepsis patients in the ICU with three treatment options: antibiotics, vasopressors, and mechanical ventilation. We use the electronic health records extracted from the Medical Information Mart for Intensive Care (MIMIC-III) database [42]. Following a similar pipeline to [4], we extracted 25 patient covariates consisting of lab tests and vital signs for each patient.
In this experiment, we would like to infer the effects of antibiotics, vasopressors, and mechanical ventilation on three patient covariates: white blood cell count, blood pressure, and oxygen saturation. Specifically, we extract patient records of up to 50 days from the MIMIC-III database for training and testing, and infer treatment responses within 24 hours. The hidden confounders include comorbidities and lab tests that are recorded in the MIMIC-III database but not used by the counterfactual inference models. There might be other hidden confounders, given that it is a real-world scenario. Since the records in MIMIC-III are irregularly-spaced, we applied interpolation to the time series for the RNN-based baselines, RMSN [54] and TSD-RMSN [4], as described in the previous section.

Table 7.1: Average RMSE×10^2 and standard error over 10 runs for the inference of sepsis patients on white blood cell count (WBC), blood pressure (BP), and oxygen saturation (OS).

Methods         WBC           BP             OS
CGP             2.55 ± 0.05   9.41 ± 0.06    1.32 ± 0.04
RMSN            3.52 ± 0.07   11.43 ± 0.14   1.98 ± 0.03
TSD-RMSN        2.87 ± 0.08   9.86 ± 0.12    1.42 ± 0.09
ACODE (k = 0)   2.54 ± 0.05   9.36 ± 0.04    1.15 ± 0.03
ACODE (k = 1)   2.36 ± 0.04   9.28 ± 0.06    1.08 ± 0.05
ACODE (k = 5)   2.26 ± 0.05   8.91 ± 0.06    1.01 ± 0.04
ACODE (k = 10)  2.37 ± 0.10   9.11 ± 0.14    0.98 ± 0.09

As we can see in Table 7.1, the proposed ACODE model outperforms the other competitive baselines. Also, there is a clear performance gap between methods considering hidden confounders, like ACODE (k > 0) and TSD-RMSN, and methods ignoring them, like CGP and RMSN. Among the ACODE variants, as the number of auxiliary variables increases, ACODE achieves lower RMSE at the cost of higher variance. This pattern is consistent with what we observed in the tumor growth simulation. The proper number of auxiliary variables k* varies across applications and needs to be tuned as a hyperparameter.
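The "mean ± standard error over 10 runs" entries in Table 7.1 can be reproduced from per-run RMSEs as follows, assuming the usual standard error std/sqrt(n); the run values below are made up for illustration:

```python
import numpy as np

def mean_and_stderr(run_rmses):
    """Aggregate per-run RMSEs into the 'mean ± standard error' format of
    Table 7.1, with standard error = sample std / sqrt(#runs)."""
    r = np.asarray(run_rmses, dtype=float)
    return r.mean(), r.std(ddof=1) / np.sqrt(len(r))

# Hypothetical RMSE x 10^2 values from 10 independent runs:
runs = [2.2, 2.3, 2.4, 2.2, 2.3, 2.3, 2.2, 2.4, 2.3, 2.2]
m, se = mean_and_stderr(runs)
```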
7.6 Summary

We proposed augmented counterfactual ordinary differential equations (ACODEs), a novel neural-network-based model for time series counterfactual inference. The proposed method introduces auxiliary variables and lifts the time series into an augmented space to reduce the inference bias caused by hidden confounders. With the representation power of neural networks and differential equations, it can effectively capture the underlying temporal dynamics and intervention effects from observational data. Empirically, we showed that ACODEs outperform existing methods in inference accuracy in both simulations and real-world applications, and demonstrated their potential to provide insights into time series generation mechanisms via the auxiliary variables. With the increasing amount of data we collect every day, ACODE can empower us to answer "what if" questions regarding time series and discover insights into the underlying mechanisms behind time-series observations and possible interventions. For future work, it would be worth exploring the use of ACODE as an interface between machine learning and the dominant modeling paradigm expressed in differential equations, incorporating well-understood domain knowledge into time series counterfactual inference.

Chapter 8 Conclusion

Increasing amounts of data and compute resources have in many ways ignited the interest and ongoing progress in deep neural network research on time series. The contributions in this thesis merge the data-driven deep learning approach with the probabilistic graphical model framework, targeting time series counterfactual inference. The contributions can be broadly grouped as follows:

• Introduce deep generative models for time series counterfactual inference, including a mathematical formulation of the problem and learning and inference pipelines.

• Develop deep generative models for time series with various structures, including time series with multi-agent interactions and with mixed sampling rates.
Special components and mechanisms are proposed to leverage the complex temporal structure.

• Connect differential equations with deep generative models, which supports continuous-time time series counterfactual inference.

Our methods have limitations. First, with neural networks, the proposed methods usually end up as complicated or heavy models; especially for less structured time series, the proposed deep generative models tend to overfit the data. Second, compared with Gaussian Process based methods, deep generative models cannot provide a solid uncertainty measurement for their predictions. This causes trustworthiness issues for real-world deployment in certain areas such as medical applications. Third, the validity of the time series deep generative model is conditioned upon a set of assumptions, such as the latent state assumption. In general, these assumptions are not testable. The reliability of approaches using counterfactual models therefore critically depends on the plausibility of those assumptions in light of domain knowledge.

Although deep generative models hold great potential for advancing time series modeling, the field has proven more challenging and less developed than other domains. We hope that the above contributions can be used in future research advancing the field further. On the other hand, such models are not yet ready to be deployed in real-world scenarios such as counterfactual inference in the medical domain. Major concerns come from the lack of safety, accountability, and transparency, which poses new challenges in inference and learning as well as opening new avenues for future research. One promising direction is to leverage domain knowledge. Currently, most existing work on time series counterfactual inference is purely data-driven without leveraging any domain knowledge. On the other hand, domain knowledge discovered by domain experts is valuable information that is complementary to data-driven methods.
There are huge potential benefits in incorporating domain knowledge into data-driven time series counterfactual inference. For example, one may leverage the mechanistic understanding of the cardiovascular system discovered by doctors and biologists in treatment response estimation for heart disease. Although promising, how to effectively incorporate domain knowledge into data-driven models remains an active research field in artificial intelligence. Several attempts include:

• For neural networks, tailor the architecture to the task. If possible, incorporate known properties, such as symmetry, into the networks. Logic, equations, and temporal structure can each be reflected in the structure of networks by combining them with special architectures or with symbolic AI.

• Augment data to incorporate invariant properties or knowledge-based models. Augmentation can be done by transforming, manipulating, or synthesizing the original data. Simulations built upon knowledge-based models can be used to generate data; machines then do not need to be trained from scratch. Instead, pre-training with massive amounts of simulated data can teach data-driven models about domain knowledge.

• Design algorithms that include humans in the loop. The interaction between machine and environment can be modeled and optimized with reinforcement learning. Humans can be asked to label data or to provide distributions. In addition, this also helps humans understand machine learning results and adjust models during or after training.

Currently, the knowledge integrated into machine learning is relatively concrete and mostly at the instance level, e.g., expressing each theorem as a constraint. Despite efforts and achievements in making knowledge generic and broad, we have not yet seen a model that successfully grasps abstract concepts or systematic theories.
We believe integrating higher-level features is an essential path toward strong artificial intelligence; it would change the paradigm for integrating knowledge and thus bring counterfactual inference to the next level.

In this thesis, we focused on developing deep generative models for reliable time series counterfactual inference. It is our belief that the counterfactual framework is general and can be extended to support a wide range of decision-making problems. We are looking forward to seeing future progress in this area.

Reference List

[1] Alahi, A., Goel, K., Ramanathan, V., Robicquet, A., Fei-Fei, L., and Savarese, S. (2016). Social lstm: Human trajectory prediction in crowded spaces. In CVPR.
[2] Armesto, L., Tornero, J., and Vincze, M. (2008). On multi-rate fusion for non-linear sampled-data systems: Application to a 6d tracking system. Robotics and Autonomous Systems, 56(8):706–715.
[3] Battaglia, P., Pascanu, R., Lai, M., Rezende, D. J., et al. (2016). Interaction networks for learning about objects, relations and physics. In NIPS, pages 4502–4510.
[4] Bica, I., Alaa, A. M., and van der Schaar, M. (2019). Time series deconfounder: Estimating treatment effects over time in the presence of hidden confounders. arXiv preprint arXiv:1902.00450.
[5] Box, G. E., Jenkins, G. M., Reinsel, G. C., and Ljung, G. M. (2015). Time series analysis: forecasting and control. John Wiley & Sons.
[6] Che, Z., Purushotham, S., Cho, K., Sontag, D., and Liu, Y. (2016). Recurrent neural networks for multivariate time series with missing values. arXiv preprint arXiv:1606.01865.
[7] Che, Z., Purushotham, S., Li, G., Jiang, B., and Liu, Y. (2018). Hierarchical deep generative models for multi-rate multivariate time series. In ICML, pages 783–792.
[8] Chen, T. Q., Rubanova, Y., Bettencourt, J., and Duvenaud, D. K. (2018). Neural ordinary differential equations. In Advances in Neural Information Processing Systems, pages 6571–6583.
[9] Chipman, H. A., George, E.
I., McCulloch, R. E., et al. (2010). Bart: Bayesian additive regression trees. The Annals of Applied Statistics, 4(1):266–298.
[10] Chung, J., Ahn, S., and Bengio, Y. (2016). Hierarchical multiscale recurrent neural networks. arXiv preprint arXiv:1609.01704.
[11] Chung, J., Gulcehre, C., Cho, K., and Bengio, Y. (2014). Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv preprint arXiv:1412.3555.
[12] Chung, J., Kastner, K., Dinh, L., Goel, K., Courville, A. C., and Bengio, Y. (2015). A recurrent latent variable model for sequential data. In Advances in Neural Information Processing Systems, pages 2980–2988.
[13] Cole, S. R. and Frangakis, C. E. (2009). The consistency statement in causal inference: a definition or an assumption? Epidemiology, 20(1):3–5.
[14] De Boor, C., De Boor, C., Mathématicien, E.-U., De Boor, C., and De Boor, C. (1978). A practical guide to splines, volume 27. Springer-Verlag New York.
[15] Dehejia, R. H. and Wahba, S. (2002). Propensity score-matching methods for nonexperimental causal studies. Review of Economics and Statistics, 84(1):151–161.
[16] Drolet, L., Michaud, F., and Côté, J. (2000). Adaptable sensor fusion using multiple kalman filters. In Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2000), volume 2, pages 1434–1439. IEEE.
[17] Duckworth, D. (2013). pykalman, an implementation of the kalman filter, kalman smoother, and em algorithm in python. https://pykalman.github.com.
[18] Durbin, J. and Koopman, S. J. (2012). Time series analysis by state space methods, volume 38. OUP Oxford.
[19] El Hihi, S. and Bengio, Y. (1995). Hierarchical recurrent neural networks for long-term dependencies. In NIPS.
[20] Fitzmaurice, G., Davidian, M., Verbeke, G., and Molenberghs, G. (2008). Longitudinal data analysis. CRC Press.
[21] Fraccaro, M., Sønderby, S. K., Paquet, U., and Winther, O. (2016). Sequential neural models with stochastic layers. In NIPS.
[22] Gan, Z., Li, C., Henao, R., Carlson, D. E., and Carin, L. (2015). Deep temporal sigmoid belief networks for sequence modeling. In NIPS.
[23] Geng, C., Paganetti, H., and Grassberger, C. (2017). Prediction of treatment response for combined chemo- and radiation therapy for non-small cell lung cancer patients using a bio-mathematical model. Scientific Reports, 7(1):1–12.
[24] Goodfellow, I., Bengio, Y., and Courville, A. (2016). Deep learning. MIT Press.
[25] Greff, K., Srivastava, R. K., Koutník, J., Steunebrink, B. R., and Schmidhuber, J. (2016). Lstm: A search space odyssey. IEEE Transactions on Neural Networks and Learning Systems, 28(10):2222–2232.
[26] Grover, A., Al-Shedivat, M., Gupta, J. K., Burda, Y., and Edwards, H. (2018). Learning policy representations in multiagent systems. arXiv preprint arXiv:1806.06464.
[27] Gupta, A., Johnson, J., Fei-Fei, L., Savarese, S., and Alahi, A. (2018). Social gan: Socially acceptable trajectories with generative adversarial networks. In CVPR.
[28] Hatjispyros, S. J. and Merkatas, C. (2015). Bayesian nonparametric reconstruction and prediction of nonlinear dynamic systems with geometric stick breaking noise. arXiv preprint arXiv:1511.00154.
[29] Helbing, D. and Molnar, P. (1995). Social force model for pedestrian dynamics. Physical Review E, 51(5):4282.
[30] Hill, J. L. (2011). Bayesian nonparametric modeling for causal inference. Journal of Computational and Graphical Statistics, 20(1):217–240.
[31] Hirano, K. and Imbens, G. W. (2001). Estimation of causal effects using propensity score weighting: An application to data on right heart catheterization. Health Services and Outcomes Research Methodology, 2(3-4):259–278.
[32] Hochreiter, S. and Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9(8):1735–1780.
[33] Holland, P. W. (1986). Statistics and causal inference. Journal of the American Statistical Association, 81(396):945–960.
[34] Hoover, K. D. (2006). Causality in economics and econometrics.
Available at SSRN 930739. [35] Horvitz, D. G. and Thompson, D. J. (1952). A generalization of sampling without replacement from a finite universe. Journal of the American statistical Association, 47(260):663–685. [36] Hoshen, Y. (2017). Vain: Attentional multi-agent predictive modeling. In NIPS, pages 2701–2711. [37] Houtekamer, P. L. and Mitchell, H. L. (2001). A sequential ensemble kalman filter for atmospheric data assimilation. Monthly Weather Review, 129(1):123–137. 116 [38] Imbens, G. W. and Rubin, D. B. (2015). Causal inference in statistics, social, and biomedical sciences. Cambridge University Press. [39] Ivanovic, B., Schmerling, E., Leung, K., and Pavone, M. (2018). Generative modeling of multimodal multi-human behavior. arXiv:1803.02015. [40] Johansson, F., Shalit, U., and Sontag, D. (2016). Learning representations for counterfactual inference. In International conference on machine learning, pages 3020–3029. [41] Johnson, A., Pollard, T., Shen, L., Lehman, L., Feng, M., Ghassemi, M., Moody, B., Szolovits, P., Celi, L., and Mark, R. (2016a). Mimic-iii, a freely accessible critical care database. Scientific Data. [42] Johnson, A. E., Pollard, T. J., Shen, L., Li-Wei, H. L., Feng, M., Ghassemi, M., Moody, B., Szolovits, P., Celi, L. A., and Mark, R. G. (2016b). Mimic-iii, a freely accessible critical care database. Scientific data, 3(1):1–9. [43] Jordan, M. I. (1998). Learning in graphical models, volume 89. Springer Science & Business Media. [44] Kalman, R. E. et al. (1960). A new approach to linear filtering and prediction problems. Journal of basic Engineering, 82(1):35–45. [45] Kingma, D. and Ba, J. (2014). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980. [46] Kingma, D. P. and Welling, M. (2013). Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114. [47] Kipf, T., Fetaya, E., Wang, K.-C., Welling, M., and Zemel, R. (2018). Neural relational inference for interacting systems. arXiv:1802.04687. [48] Krishnan, R. 
G., Shalit, U., and Sontag, D. (2015). Deep kalman filters. arXiv preprint arXiv:1511.05121. [49] Krishnan, R.G., Shalit, U., andSontag, D.(2016). Structuredinferencenetworks for nonlinear state space models. arXiv preprint arXiv:1609.09869. [50] Kroschinsky, F., Stölzel, F., von Bonin, S., Beutel, G., Kochanek, M., Kiehl, M., Schellongowski, P., et al. (2017). New drugs, new toxicities: severe side effects of modern targeted and immunotherapy of cancer and their management. Critical Care, 21(1):89. 117 [51] Le, H. M., Yue, Y., Carr, P., and Lucey, P. (2017). Coordinated multi-agent imitation learning. In Proceedings of the 34th International Conference on Machine Learning-Volume 70, pages 1995–2003. JMLR. org. [52] Lee, N., Choi, W., Vernaza, P., Choy, C. B., Torr, P. H., and Chandraker, M. (2017). Desire: Distant future prediction in dynamic scenes with interacting agents. In CVPR, pages 336–345. [53] Li, G., Jiang, B., Zhu, H., Che, Z., and Liu, Y. (2020). Generative attention networks for multi-agent behavioral modeling. In Thirty-Fourth AAAI Conference on Artificial Intelligence. [54] Lim, B. (2018). Forecasting treatment responses over time using recurrent marginal structural networks. In Advances in Neural Information Processing Systems, pages 7483–7493. [55] Linou, K. (2016). NBA dataset from raw SportVU logs. https://github.com/ linouk23/NBA-Player-Movements. [56] Liu, Y., Yu, R., Zheng, S., and Yue, Y. (2018). Long range sequence generation via multiresolution adversarial training. [57] Lok, J. J. et al. (2008). Statistical modeling of causal effects in continuous time. The Annals of Statistics, 36(3):1464–1507. [58] Maaten, L. v. d. (2008). t-distributed stochastic neighbor embed- ding in scikit-learn. https://scikit-learn.org/stable/modules/generated/ sklearn.manifold.TSNE.html. [59] Maaten, L. v. d. and Hinton, G. (2008). Visualizing data using t-sne. Journal of machine learning research, 9(Nov):2579–2605. [60] Manh, H. and Alaghband, G. (2018). 
Scene-lstm: A model for human trajectory prediction. arXiv:1808.04018. [61] Mazumder, R., Hastie, T., and Tibshirani, R. (2010). Spectral regularization algorithms for learning large incomplete matrices. Journal of machine learning research, 11(Aug):2287–2322. [62] McCaffrey, D. F., Ridgeway, G., and Morral, A. R. (2004). Propensity score estimation with boosted regression for evaluating causal effects in observational studies. Psychological methods, 9(4):403. [63] Mehran, R., Oyama, A., and Shah, M. (2009). Abnormal crowd behavior detection using social force model. In CVPR, pages 935–942. IEEE. 118 [64] Menne, M., Williams Jr, C., and Vose, R. (2010). Long-term daily and monthly climate records from stations across the contiguous united states. [65] Merkatas, C. and Särkkä, S. (2021). System identification using bayesian neural networks with nonparametric noise models. arXiv preprint arXiv:2104.12119. [66] Nair, V. and Hinton, G. E. (2010). Rectified linear units improve restricted boltzmann machines. In ICML. [67] Negenborn, R. (2003). Robot localization and Kalman filters. PhD thesis, Utrecht University. [68] Neil, D., Pfeiffer, M., and Liu, S.-C. (2016). Phased lstm: Accelerating recur- rent network training for long or event-based sequences. In Advances in Neural Information Processing Systems, pages 3882–3890. [69] Pearl, J. et al. (2009). Causal inference in statistics: An overview. Statistics surveys, 3:96–146. [70] Rabiner, L. R. (1989). A tutorial on hidden markov models and selected applications in speech recognition. Proceedings of the IEEE, 77(2):257–286. [71] Rezende, D. J., Mohamed, S., and Wierstra, D. (2014). Stochastic backprop- agation and approximate inference in deep generative models. arXiv preprint arXiv:1401.4082. [72] Robicquet, A., Sadeghian, A., Alahi, A., and Savarese, S. (2016). Learning social etiquette: Human trajectory understanding in crowded scenes. In ECCV. [73] Robins, J. M. (1994). 
Correcting for non-compliance in randomized trials using structural nested mean models. Communications in Statistics-Theory and methods, 23(8):2379–2412. [74] Robins, J. M., Hernan, M. A., and Brumback, B. (2000a). Marginal structural models and causal inference in epidemiology. [75] Robins, J. M., Rotnitzky, A., and Scharfstein, D. O. (2000b). Sensitivity analysis for selection bias and unmeasured confounding in missing data and causal inference models. In Statistical models in epidemiology, the environment, and clinical trials, pages 1–94. Springer. [76] Rosenbaum, P. R. and Rubin, D. B. (1983). The central role of the propensity score in observational studies for causal effects. Biometrika, 70(1):41–55. 119 [77] Rubanova, Y., Chen, T. Q., and Duvenaud, D. K. (2019). Latent ordinary differential equations for irregularly-sampled time series. In Advances in Neural Information Processing Systems, pages 5321–5331. [78] Rubin, D. B. (1974). Estimating causal effects of treatments in randomized and nonrandomized studies. Journal of educational Psychology, 66(5):688. [79] Rubin, D. B. (1978). Bayesian inference for causal effects: The role of random- ization. The Annals of statistics, pages 34–58. [80] Rubin, D. B. (1980). Randomization analysis of experimental data: The fisher randomization test comment. Journal of the American Statistical Association, 75(371):591–593. [81] Rubin, D. B. (1986). Comment: Which ifs have causal answers. Journal of the American statistical association, 81(396):961–962. [82] Rubin, D. B. (2005). Causal inference using potential outcomes: Design, model- ing, decisions. Journal of the American Statistical Association, 100(469):322–331. [83] Safari, S., Shabani, F., and Simon, D. (2014). Multirate multisensor data fusion for linear systems using kalman filters and a neural network. Aerospace Science and Technology, 39:465–471. [84] Schulam, P. and Saria, S. (2017). Reliable decision support using counterfactual models. 
In Advances in Neural Information Processing Systems, pages 1697–1708. [85] Seabold, S. and Perktold, J. (2010). Statsmodels: Econometric and statistical modeling with python. In Proceedings of the 9th Python in Science Conference. [86] Shadish, W. R., Cook, T. D., Campbell, D. T., et al. (2002). Experimental and quasi-experimental designs for generalized causal inference/William R. Shedish, Thomas D. Cook, Donald T. Campbell. Boston: Houghton Mifflin,. [87] Sjöberg, J., Hjalmarsson, H., and Ljung, L. (1994). Neural networks in system identification. IFAC Proceedings Volumes, 27(8):359–382. [88] Sjöberg, J., Zhang, Q., Ljung, L., Benveniste, A., Delyon, B., Glorennec, P.-Y., Hjalmarsson, H., and Juditsky, A. (1995). Nonlinear black-box modeling in system identification: a unified overview. Automatica, 31(12):1691–1724. [89] Soleimani, H., Subbaswamy, A., andSaria, S.(2017). Treatment-response models for counterfactual reasoning with continuous-time, continuous-valued interventions. arXiv preprint arXiv:1704.02038. 120 [90] Splawa-Neyman, J., Dabrowska, D. M., and Speed, T. (1990). On the application of probability theory to agricultural experiments. essay on principles. section 9. Statistical Science, pages 465–472. [91] Stekhoven, D. J. and Bühlmann, P. (2011). Missforest—non-parametric missing value imputation for mixed-type data. Bioinformatics, 28(1):112–118. [92] Sukhbaatar, S., Fergus, R., et al. (2016). Learning multiagent communication with backpropagation. In NIPS. [93] VanderWeele, T. J. (2009). Concerning the consistency assumption in causal inference. Epidemiology, 20(6):880–883. [94] Vemula, A., Muelling, K., and Oh, J. (2018). Social attention: Modeling attention in human crowds. In ICRA, pages 1–7. IEEE. [95] Vlachostergios, P. J. and Faltas, B. M. (2018). Treatment resistance in urothe- lial carcinoma: an evolutionary perspective. Nature Reviews Clinical Oncology, 15(8):495–509. [96] Wang, Y. and Blei, D. M. (2019). The blessings of multiple causes. 
Journal of the American Statistical Association, 114(528):1574–1596. [97] White, I. R., Royston, P., and Wood, A. M. (2011). Multiple imputation using chained equations: issues and guidance for practice. Statistics in medicine, 30(4):377–399. [98] Xu, Y., Piao, Z., and Gao, S. (2018). Encoding crowd interaction with deep neural network for pedestrian trajectory prediction. In CVPR, pages 5275–5284. [99] Yeh, R. A., Schwing, A. G., Huang, J., and Murphy, K. (2019). Diverse generation for multi-agent sports games. In CVPR, pages 4610–4619. [100] Zhan, E., Zheng, S., Yue, Y., and Lucey, P. (2018a). Generative multi-agent behavioral cloning. arXiv:1803.07612. [101] Zhan, E., Zheng, S., Yue, Y., Sha, L., and Lucey, P. (2018b). Generating multi-agent trajectories using programmatic weak supervision. arXiv preprint arXiv:1803.07612. [102] Zhu, F. (2019). On Causal Discovery and Inference from Observational Data. PhD thesis. [103] Zhu, F., Lin, A., Zhang, G., and Lu, J. (2018). Counterfactual inference with hidden confounders using implicit generative models. In Australasian Joint Conference on Artificial Intelligence, pages 519–530. Springer. 121
Abstract
Decision-makers want to know how to produce desired outcomes and act accordingly, which requires an understanding of cause and effect. The last decades have seen an explosion in the availability of time series data and computational resources, which has spurred great interest in data-driven algorithms capable of counterfactual inference.

As Richard Feynman said, "What I cannot create, I do not understand." A classical route to counterfactual inference is therefore to understand the underlying working mechanism of the system. Deep neural networks have recently been fundamental in pushing the machine learning field forward, with remarkable results in image classification, speech analysis, and machine translation; they also hold great potential for learning the data-generating mechanisms behind time series and thus for supporting counterfactual inference.

This thesis explores deep generative models of time series data, with a focus on time series counterfactual inference. Given a system of interest and its time series observations, our goal is to learn the underlying data-generating mechanism with deep generative models and to predict potential outcomes under various circumstances. One of the most interesting developments in generative modeling is the family of models that merge the powerful function approximators provided by deep neural networks with the principled probabilistic approach of graphical models; the Variational Autoencoder (VAE) is a seminal contribution to this emerging field. In this thesis, we develop extensions and improvements to the VAE framework to handle more complex time series and to perform effective counterfactual inference.
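The VAE objective the abstract refers to can be illustrated with a short numerical sketch. This is a toy calculation with made-up encoder outputs, not any of the thesis models: it shows the KL term of the evidence lower bound (ELBO), which keeps the approximate posterior q(z|x) close to the prior p(z) = N(0, I), computed both in closed form and via the reparameterization trick that makes the estimate differentiable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoder outputs for a 2-dimensional latent z.
mu = np.array([0.5, -0.3])
log_var = np.array([-1.0, 0.2])

# Analytic KL divergence between N(mu, diag(exp(log_var))) and N(0, I).
kl_analytic = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

# Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I), gives a
# Monte Carlo estimate of the same KL term that gradients can flow through.
eps = rng.standard_normal((100_000, 2))
z = mu + np.exp(0.5 * log_var) * eps
log_q = -0.5 * np.sum(log_var + (z - mu)**2 / np.exp(log_var)
                      + np.log(2 * np.pi), axis=1)
log_p = -0.5 * np.sum(z**2 + np.log(2 * np.pi), axis=1)
kl_mc = np.mean(log_q - log_p)

print(f"analytic KL = {kl_analytic:.4f}, Monte Carlo KL = {kl_mc:.4f}")
```

In a full VAE this KL penalty is added to a reconstruction log-likelihood term and both are maximized jointly over encoder and decoder parameters; the closed-form and sampled estimates above should agree to within Monte Carlo noise.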
Asset Metadata
Creator: Li, Guangyu (author)
Core Title: Deep generative models for time series counterfactual inference
School: Viterbi School of Engineering
Degree: Doctor of Philosophy
Degree Program: Electrical Engineering
Degree Conferral Date: 2021-12
Publication Date: 10/01/2021
Defense Date: 07/26/2021
Publisher: University of Southern California (original); University of Southern California. Libraries (digital)
Tag: counterfactual inference; generative models; neural networks; OAI-PMH Harvest; time series analysis
Format: application/pdf (imt)
Language: English
Contributor: Electronically uploaded by the author (provenance)
Advisor: Liu, Yan (committee chair); Deshmukh, Jyotirmoy (committee member); Jain, Rahul (committee member); Kuo, C-C. Jay (committee member)
Creator Email: fatboygy@outlook.com; guangyul@usc.edu
Permanent Link (DOI): https://doi.org/10.25549/usctheses-oUC16021759
Unique Identifier: UC16021759
Legacy Identifier: etd-LiGuangyu-10130
Document Type: Dissertation
Rights: Li, Guangyu
Type: texts
Source: University of Southern California (contributing entity); University of Southern California Dissertations and Theses (collection)
Access Conditions: The author retains rights to his/her dissertation, thesis or other graduate work according to U.S. copyright law. Electronic access is being provided by the USC Libraries in agreement with the author, as the original true and official version of the work, but does not grant the reader permission to use the work if the desired use is covered by copyright. It is the author, as rights holder, who must provide use permission if such use is covered by copyright. The original signature page accompanying the original submission of the work to the USC Libraries is retained by the USC Libraries and a copy of it may be obtained by authorized requesters contacting the repository e-mail address given.
Repository Name: University of Southern California Digital Library
Repository Location: USC Digital Library, University of Southern California, University Park Campus MC 2810, 3434 South Grand Avenue, 2nd Floor, Los Angeles, California 90089-2810, USA
Repository Email: cisadmin@lib.usc.edu