Differential privacy is a recent notion of data privacy protection whose guarantee holds even when an attacker possesses arbitrary background knowledge in advance. Consequently, it is viewed as a reliable protection mechanism for sensitive information. Differential privacy adds Laplace noise to hide the true values in a dataset while preserving its statistical properties. However, the amount of Laplace noise added to a dataset is determined by the scale parameter of the Laplace distribution. The privacy parameter ϵ in differential privacy has a clear theoretical interpretation, but its implication for the risk of data disclosure (called RoD for short) in practice has not yet been studied. Moreover, choosing an appropriate value for ϵ is not an easy task, since it significantly impacts the level of privacy in a dataset. In this paper, we define and evaluate the RoD of a dataset with either numerical or binary attributes, for numerical or counting queries over multiple attributes, based on noise estimation. Through the confidence probability of the noise estimate, we give a simple way to choose the privacy parameter ϵ. Finally, we show the relation between the RoD and the privacy parameter ϵ in experimental results. To the best of our knowledge, this is the first work to use noise estimation to practically evaluate the RoD for multiple attributes (both numerical and binary data).
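To make the role of the scale parameter concrete, the following is a minimal sketch of the standard Laplace mechanism that the abstract refers to: a query answer is perturbed with noise of scale sensitivity/ϵ, so a smaller ϵ means more noise and stronger privacy. The function names and the inverse-CDF sampling routine are illustrative choices, not the paper's own estimation method.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via the inverse CDF."""
    u = random.random() - 0.5          # uniform on [-0.5, 0.5)
    sign = -1.0 if u < 0 else 1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Release true_answer perturbed with Laplace noise of scale sensitivity/epsilon.

    Smaller epsilon -> larger noise scale -> stronger privacy but less accuracy,
    which is why choosing epsilon is the practical difficulty the paper studies.
    """
    return true_answer + laplace_noise(sensitivity / epsilon)
```

For example, a counting query (sensitivity 1) released at ϵ = 0.1 receives noise of scale 10, while at ϵ = 1 the scale drops to 1.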