In my presentation, I will focus on testing conditional independence between two discrete random variables, X and Y, given a third discrete variable Z, using information-theoretic measures, namely Conditional Mutual Information (CMI) and its approximations. I will discuss both asymptotic and non-asymptotic perspectives on conditional independence testing based on CMI, together with some resampling techniques. I will then examine the benefits and drawbacks of using approximations of CMI instead of the exact measure. Additionally, I will showcase interesting applications of particular CMI approximations to testing compound hypotheses of independence between discrete random variables. The results presented will mostly come from my PhD thesis.
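
As a reminder, the central quantity here is the standard conditional mutual information of discrete X and Y given Z,

$$
I(X; Y \mid Z) \;=\; \sum_{x,\,y,\,z} p(x, y, z)\, \log \frac{p(x, y \mid z)}{p(x \mid z)\, p(y \mid z)},
$$

which is nonnegative and equals zero if and only if X and Y are conditionally independent given Z, making it a natural test statistic for conditional independence.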