This paper considers the role of entropy and other information-theoretic concepts in describing and forming joint distributions for m random variables. The discussion includes a review of methods for constructing discrete and continuous joint distributions from their component marginal distributions. We then propose a minimum cross-entropy approach that recovers a continuous joint distribution from the joint and marginal moments together with the marginal densities. The large-sample properties of the associated estimator are outlined, and a simple demonstration problem is presented that highlights the advantages and drawbacks of the proposed method.
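To illustrate the general idea behind minimum cross-entropy recovery (a generic sketch, not the paper's formulation), the following toy example tilts a discrete independence prior to satisfy one joint moment constraint. The solution has the standard exponential-tilting form p(z) ∝ q(z) exp(λ f(z)), with λ found by bisection; the cell labels, moment value, and function names are hypothetical.

```python
import math

def min_cross_entropy_tilt(prior, feature, target, lo=-20.0, hi=20.0):
    """Minimum cross-entropy update of a discrete prior subject to one
    moment constraint E_p[feature] = target.  The minimizer has the
    exponential-tilting form p(z) proportional to prior(z)*exp(lam*f(z));
    lam is found by bisection, since E_lam[feature] is monotone in lam."""
    def moment(lam):
        w = [q * math.exp(lam * f) for q, f in zip(prior, feature)]
        s = sum(w)
        return sum(wi * fi for wi, fi in zip(w, feature)) / s

    for _ in range(200):                 # bisection on lam
        mid = 0.5 * (lo + hi)
        if moment(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [q * math.exp(lam * f) for q, f in zip(prior, feature)]
    s = sum(w)
    return [wi / s for wi in w], lam

# Hypothetical toy problem: X, Y binary with uniform marginals; the prior
# is the independence product, and we impose the joint moment E[XY] = 0.3
# (independence would give 0.25).
cells = [(0, 0), (0, 1), (1, 0), (1, 1)]
prior = [0.25] * 4                      # product of the two marginals
feature = [x * y for x, y in cells]     # the constrained joint moment
p, lam = min_cross_entropy_tilt(prior, feature, 0.3)
```

In this binary case the constraint can be checked by hand: only the (1,1) cell is tilted, so E[XY] = 0.25 e^λ / (0.75 + 0.25 e^λ) = 0.3 gives λ = ln(9/7). The paper's continuous version replaces the discrete prior with the product of marginal densities and imposes several joint and marginal moment constraints at once.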