Research
My main research interests target the design and analysis of wireless networked control systems, in which the control loops are closed over a communication network. As a result, their study necessitates an interplay between dynamical systems and their communication aspects. My research is therefore interdisciplinary, combining tools from control and estimation theory, communications, and convex optimization. Specifically, it spans the areas of: i) information theory, and ii) stochastic control and estimation theory.
Properties of information measures in systems with memory and feedback
In this research we investigate functional and topological properties of directed information and its variants, which are known to be handy information measures quantifying the information rate in systems with memory and feedback. Directed information from an input process to an output process captures the reduction in uncertainty of the latter due to causal knowledge of the former. In information theory, directed information and its variants are used to characterize the capacity of channels with memory and feedback and the lossy data compression of causal and zero-delay codes. Moreover, it can be used in network communication systems as a metric for evaluating the capacity of special types of networks, such as the two-way channel and the multiple access channel. Furthermore, directed information has found use in a variety of problems subject to causality constraints, such as gambling, portfolio theory, data compression, and hypothesis testing; in biology, as an alternative to Granger's measure of causality; and in communication for networked control systems.
Contribution: We derive several functional and topological properties of directed information, defined on general abstract alphabets (complete separable metric spaces), using the topology of weak convergence of probability measures. These include convexity and concavity of directed information, lower semicontinuity, and, under certain conditions, continuity. Finally, we derive variational equalities for directed information, including sequential versions. These may be viewed as analogues of the variational equalities of mutual information utilized in Blahut-Arimoto algorithms to numerically evaluate the capacity and the rate distortion function of channels and sources. In summary, we extend the basic functional and topological properties of mutual information to directed information. These properties are fundamental in the sense that they are necessary to investigate the extremum problems of directed information, which are the optimization problems of interest in communication systems with memory and feedback and in networked control systems.
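As an illustration of the kind of alternating optimization these variational equalities enable, the following is a minimal sketch of the classical Blahut-Arimoto iteration for the capacity of a discrete memoryless channel (the memoryless special case of the problems above; the function name, tolerance, and iteration cap are illustrative choices, not part of the published framework):

```python
import numpy as np

def blahut_arimoto(P, tol=1e-9, max_iter=1000):
    """Blahut-Arimoto iteration for the capacity of a DMC.

    P : channel transition matrix, P[x, y] = P(y | x).
    Returns (capacity in bits, capacity-achieving input distribution).
    """
    n_in = P.shape[0]
    p = np.full(n_in, 1.0 / n_in)           # start from the uniform input law
    for _ in range(max_iter):
        q = p @ P                            # induced output distribution q(y)
        # D(P(.|x) || q) for each input x, in nats
        with np.errstate(divide="ignore", invalid="ignore"):
            ratio = np.where(P > 0, P * np.log(P / q), 0.0)
        d = ratio.sum(axis=1)
        r = p * np.exp(d)                    # unnormalized multiplicative update
        p_new = r / r.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # at convergence, log(sum_x p(x) exp(D_x)) equals the capacity (in nats)
    return np.log(r.sum()) / np.log(2), p
```

For a binary symmetric channel with crossover 0.1 this recovers the uniform input distribution and capacity 1 - H(0.1) bits.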
Selected Publications  
Book Chapters:
Journal papers:
Conference papers:

Causal and zero-delay processing of information in systems with memory and feedback
In this research we investigate the rate performance and the coding aspects of causal and zero-delay codes (i.e., a zero-delay code is a causal code but not the other way around) in systems with memory and feedback. The information measure used to assess these classes of codes is the so-called information nonanticipative or sequential rate distortion function, which is a lower bound to the operational causal rate distortion function. A natural question is why we use a lower bound, rather than the operational causal rate distortion function itself, to measure rate performance and construct efficient coding schemes that convey information reliably. The answer is that, by definition, the operational causal rate distortion function is cast as a non-convex optimization problem and is therefore extremely difficult to solve explicitly. The information nonanticipative rate distortion function, in contrast, is a convex optimization problem and can be solved explicitly for a variety of systems with memory and feedback. In such systems the question of interest is then: using this information measure, how close can we get to the operational causal rate distortion function? This research aims to provide insights into this question.
Contribution: We investigate applications of the information nonanticipative rate distortion function in causal and zero-delay source coding problems using analog or uncoded transmission, based on average and excess distortion probability. We consider two application examples of sources with memory: the binary symmetric Markov source with parameter p, BSMS(p), and the vector Gauss-Markov source. For the BSMS(p) with Hamming distance distortion, we obtain the closed-form expression of the information nonanticipative rate distortion function, and we show achievability using joint source-channel coding theory. For the vector Gauss-Markov source with average mean squared error distortion, we obtain the parametric solution of the information nonanticipative rate distortion function and show achievability in the context of joint source-channel coding applications. By realizing the optimal test channel of the Gaussian information nonanticipative rate distortion function, we show achievability via a noiseless coding scheme that makes use of uniform scalar quantization with subtractive dither. Notably, using quantization with dithering, it is possible to show that for infinite-dimensional Gaussian sources the information nonanticipative rate distortion function and the operational causal (and zero-delay) rate distortion functions coincide.
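The quantization step underlying the achievability scheme can be illustrated with a minimal sketch of uniform scalar quantization with subtractive dither; the helper name, step size, and sample count below are illustrative, not the coding scheme itself. The key property is that, with subtractive dither shared between encoder and decoder, the reconstruction error is uniform on [-Δ/2, Δ/2] and independent of the source:

```python
import numpy as np

rng = np.random.default_rng(0)

def dithered_quantize(x, delta, dither):
    """Uniform scalar quantizer with subtractive dither.
    The encoder quantizes x + dither; the decoder subtracts the shared dither."""
    q = delta * np.round((x + dither) / delta)   # quantized (transmitted) value
    return q - dither                             # reconstruction at the decoder

n = 100_000
delta = 0.5
x = rng.normal(size=n)                            # Gaussian source samples
dither = rng.uniform(-delta / 2, delta / 2, size=n)  # shared common randomness
x_hat = dithered_quantize(x, delta, dither)
err = x_hat - x
# err is uniform on [-delta/2, delta/2]: mean ~ 0, variance ~ delta**2 / 12,
# regardless of the source distribution.
```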
Selected Publications  
Book Chapters:
Conference papers:

Low-delay joint source-channel coding
In this research we investigate the rate performance and the coding aspects of the ultimate communication scenario: merging the source dynamics and the channel dynamics in a unified framework. In such joint source-channel communication systems we aim to find source-channel codes in which the source and channel probability distributions, the distortion measure of the source, and the input cost function of the channel are favorably matched. Of particular interest in recent technological advancements, such as 5G communication systems, is the requirement to transmit information reliably in short data packets with low delay. This is another target of this research.
Contribution: We investigate applications of the information causal rate distortion function in zero-delay joint source-channel coding design using uncoded transmission, based on average and excess distortion probability. These applications are described through two examples of stationary ergodic sources with memory: a binary Markov source transmitted over a channel with unit memory, with or without feedback, and a vector Gauss-Markov source transmitted over a memoryless vector AWGN channel. We also extend the Gaussian source-channel communication scenario to time-varying processes. In these examples we assume that the sources and channels operate with a matching number of samples/uses (i.e., no bandwidth constraints). Among other results, for the time-varying Gaussian setting, we inspect the information rate performance from the viewpoint of resource allocation when the Gauss-Markov source is probabilistically matched to the noisy channel. Toward this end, we provide an iterative algorithm, which can be seen as a joint source-channel resource allocation algorithm that performs the matching over time and space. This algorithm ensures that, for each transmission, the end-to-end system operates at the minimum achievable distortion, and it shows that the probabilistic matching of the source and the channel prevents the source from transmitting data at rates higher than those allowed by the nominal capacity of the noisy channel. As a performance measure of the end-to-end dynamical system, we compute the excess distortion probability to upper bound the probability of error at the decoder, showing that it decays exponentially. Moreover, we prove a converse theorem for this dynamical source-channel communication scheme. These operational results ensure that the proposed communication system operates reliably after a finite number of transmissions.
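The spatial part of such a distortion allocation can be illustrated by the classical reverse water-filling solution for independent Gaussian components; the sketch below (function name and bisection depth are illustrative) is the textbook building block, not the full joint source-channel matching algorithm described above:

```python
import numpy as np

def reverse_waterfill(variances, D):
    """Reverse water-filling for independent Gaussian components with
    (strictly positive) variances sigma_i^2 and average distortion budget D.
    Each component gets distortion d_i = min(theta, sigma_i^2), where the
    water level theta is chosen so the mean distortion equals D; the rate is
    R = 0.5 * sum_i log2(sigma_i^2 / d_i) bits."""
    variances = np.asarray(variances, dtype=float)
    lo, hi = 0.0, variances.max()
    for _ in range(100):                      # bisection on the water level
        theta = 0.5 * (lo + hi)
        if np.minimum(theta, variances).mean() < D:
            lo = theta                        # level too low: distortion under budget
        else:
            hi = theta
    d = np.minimum(theta, variances)
    rate = 0.5 * np.sum(np.log2(variances / d))
    return d, rate
```

For example, two unit-variance components with budget D = 0.25 each receive distortion 0.25, for a total rate of 2 bits.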
Selected Publications  
Book Chapters:
Conference papers:

Feedback capacity of channels with memory
In this research we investigate the feedback capacity of a class of channels with memory, with or without transmission cost constraints. Computing the feedback capacity of any class of channel distributions with memory, computing the optimal strategies that achieve it, and determining whether feedback increases capacity have been fundamental and challenging open problems in information and communication theory for half a century. Here we aim to determine closed-form expressions for the optimal strategies for a class of channels with memory, without imposing a priori assumptions of stationarity or ergodicity on the given channel.
Contribution: We derive sequential necessary and sufficient conditions for any channel input probability distribution to maximize directed information, for channels and cost functions with memory on the previous channel output symbols. To achieve this goal we apply a finite-time horizon dynamic programming algorithm together with certain known information structures of the capacity-achieving distributions. To demonstrate the validity of our results, we apply our framework to several channels with memory, with or without transmission cost, and we provide closed-form expressions for the capacity-achieving distributions and the corresponding feedback capacity. For certain channels we observe that feedback does not increase capacity.
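A toy version of such a dynamic program can be sketched for a binary channel whose memory is the previous output symbol, with brute-force grid search over the input distribution conditioned on that symbol. The channel model, horizon, and grid below are illustrative assumptions for the sketch, not the general framework or its closed-form solutions:

```python
import numpy as np

def feedback_capacity_dp(P, horizon=50, grid=2001):
    """Value-iteration sketch of the per-use feedback capacity of a binary
    channel with unit output memory and strictly positive transitions.
    P[s, x, y] = P(Y_t = y | X_t = x, Y_{t-1} = s).
    Stage reward: I(X_t; Y_t | Y_{t-1} = s) in bits, maximized over
    p = P(X_t = 1 | Y_{t-1} = s) by grid search."""
    p = np.linspace(0.0, 1.0, grid)
    px = np.stack([1 - p, p], axis=1)            # (grid, 2) candidate input laws
    V = np.zeros(2)                              # value indexed by previous output
    for _ in range(horizon):
        V_new = np.zeros(2)
        for s in (0, 1):
            qy = px @ P[s]                       # (grid, 2) induced output laws
            with np.errstate(divide="ignore", invalid="ignore"):
                logratio = np.where(P[s] > 0,
                                    np.log2(P[s] / qy[:, None, :]), 0.0)
            # I(X; Y | s) for each candidate input law on the grid
            reward = np.einsum("gx,xy,gxy->g", px, P[s], logratio)
            V_new[s] = np.max(reward + qy @ V)   # Bellman update
        V = V_new
    return V.mean() / horizon                    # per-use rate estimate
```

As a sanity check, when both memory states see the same binary symmetric channel the memory is irrelevant, feedback does not help, and the sketch recovers the memoryless capacity 1 - H(eps).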
Selected Publications  
Journal papers:
Conference papers:

Communication and Coding for Networked Control Systems
In this research we investigate the communication aspects of a simple networked control system. In particular, we are interested in the source coding aspects under the realistic possibility that the channel connecting the observer to the controller may be either noiseless or noisy. In the case where the controller is separated from the purely communication part of the closed-loop system, we wish to design a low-delay optimal communication strategy so that, at the output of this system, the estimated process obtained by an optimal linear least squares estimator (Kalman filter) satisfies an end-to-end average fidelity or distortion criterion.
Contributions: (1) We develop finite-time horizon causal filters using the information causal rate distortion function. We apply the developed framework to design optimal filters for unstable Gaussian processes modeled in state space form with discrete recursions, subject to a mean squared error fidelity constraint. We show that such filters are equivalent to the design of an optimal encoder-channel-decoder triplet that ensures the error satisfies the fidelity constraint. Unlike classical Kalman filters, the filter developed here is characterized by a reverse water-filling algorithm, which ensures that the fidelity constraint is satisfied. (2) We derive bounds on the minimum data rate required to achieve a prescribed closed-loop performance level in a simple networked control system (NCS). In this NCS the plant is assumed to have a single output measurement and a single control input. From a communication point of view, the NCS exchanges data between the encoder and the decoder via a noiseless channel with some fixed delay. The performance of the system is assessed using a linear source coding scheme based on entropy-coded dithered quantizers followed by instantaneous coding.
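The estimator on which this design builds can be illustrated with a minimal scalar Kalman filter recursion for an unstable state space model; the parameter values below are illustrative, and the sketch deliberately omits the quantization and reverse water-filling layers described above:

```python
import numpy as np

def kalman_filter_1d(y, a, c, q, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter for the state space model
        x_{t+1} = a x_t + w_t,   y_t = c x_t + v_t,
    with w_t ~ N(0, q) and v_t ~ N(0, r); |a| > 1 models an unstable plant.
    Returns the filtered estimates and their error variances."""
    x_hat, p = x0, p0
    estimates, variances = [], []
    for yt in y:
        # measurement update
        k = p * c / (c * c * p + r)          # Kalman gain
        x_hat = x_hat + k * (yt - c * x_hat)
        p = (1 - k * c) * p                  # filtered error variance
        estimates.append(x_hat)
        variances.append(p)
        # time update (prediction for the next step)
        x_hat = a * x_hat
        p = a * a * p + q
    return np.array(estimates), np.array(variances)
```

Even for an unstable plant (a = 1.2), the error variance converges to the fixed point of the associated Riccati recursion, which is the steady-state distortion the rate bounds are measured against.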
Selected Publications  
Book Chapters:
Journals:
Conference papers:
