Determining independence of a joint continuous probability distribution
I'm confused about determining whether a joint pdf is independent in its two variables $X$ and $Y$.

The theorem given to me is that $X$ and $Y$ are independent if and only if $f(x,y) = f_X(x)\,f_Y(y)$, which is easy enough to understand. But an alternative criterion was also given: $f(x,y) = g(x)\,h(y)$, where $g(x)$ and $h(y)$ are non-negative functions of only $x$ and only $y$, respectively.

How do I determine $g(x)$ and $h(y)$? Is it just a factored form of $f(x,y)$? And if that's the case, is independence determined simply by whether or not I can factor $f(x,y)$?

An easy example: $f(x,y) = 2y$... so $g(x) = 2$, $h(y) = y$? Likewise, is $g(x) = 0.5$ and $h(y) = 4y$ viable (although probably unnecessary)?
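To illustrate the first criterion concretely, here is a minimal numerical sketch (my own, not part of the question) using the example pdf $f(x,y) = 2y$, assuming a support of the unit square $[0,1]\times[0,1]$ so that it integrates to 1. It computes both marginals by numerical integration and spot-checks whether the joint pdf equals their product on a grid:

```python
import numpy as np
from scipy import integrate

# Example pdf from the question; support assumed to be [0, 1] x [0, 1].
def f(x, y):
    return 2 * y

# Marginal of X: integrate the joint pdf over y.
def f_x(x):
    return integrate.quad(lambda y: f(x, y), 0, 1)[0]

# Marginal of Y: integrate the joint pdf over x.
def f_y(y):
    return integrate.quad(lambda x: f(x, y), 0, 1)[0]

# Spot-check f(x, y) == f_X(x) * f_Y(y) on a grid of interior points;
# agreement (up to numerical error) everywhere is consistent with independence.
pts = np.linspace(0.05, 0.95, 10)
independent = all(
    abs(f(x, y) - f_x(x) * f_y(y)) < 1e-8 for x in pts for y in pts
)
print(independent)
```

For this pdf the marginals come out as $f_X(x) = 1$ and $f_Y(y) = 2y$, whose product is exactly $2y = f(x,y)$, so the check succeeds. Note this numeric spot-check can only support independence on the sampled grid; the factorization argument is what proves it.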
 
probability statistics probability-distributions
     ...