We propose and analyze a class of adaptive sampling algorithms for multimodal distributions on a bounded domain, which structurally resemble the classic overdamped Langevin dynamics. This work makes two key contributions. First, we demonstrate that a class of linear dynamics with adaptive diffusion coefficients can be interpreted and analyzed as weighted Wasserstein gradient flows of the Kullback--Leibler (KL) divergence between the current distribution and the target Gibbs distribution. We establish exponential convergence of both the KL and $\chi^2$ divergences, with rates depending on the weighted Wasserstein metric and the Gibbs potential. Notably, this class contains both the overdamped Langevin dynamics and a derivative-free dynamics as specific instances. Our second contribution is to show that sampling can be performed without access to the gradient of the Gibbs potential and that, for Gibbs distributions with nonconvex potentials, this approach converges significantly faster than the overdamped Langevin dynamics. A comparison of the mean transition times between local minima of a nonconvex Gibbs potential further highlights the superior sampling efficiency of the derivative-free dynamics. The resulting numerical sampling scheme requires neither density estimation nor the normalizing constant of the target distribution.
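For concreteness, the following display is an illustrative sketch of the common structure; the exact form of the weight $a$ is our assumption, as the abstract does not specify it. The dynamics can be read as the Fokker--Planck equation
\[
\partial_t \rho_t \;=\; \nabla \cdot \Big( a(x)\, \rho_t\, \nabla \log \frac{\rho_t}{\rho_\infty} \Big),
\qquad \rho_\infty \propto e^{-V},
\]
which is the gradient flow of $\mathrm{KL}(\rho_t \,\|\, \rho_\infty)$ in the Wasserstein metric weighted by $a$. Taking $a \equiv I$ recovers the overdamped Langevin dynamics $\mathrm{d}X_t = -\nabla V(X_t)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}W_t$, while the choice $a = e^{V} I$ (our assumed form of the derivative-free instance) yields $\partial_t \rho_t = \Delta\big(e^{V} \rho_t\big)$, i.e.\ $\mathrm{d}X_t = \sqrt{2\, e^{V(X_t)}}\,\mathrm{d}W_t$ with reflecting boundary conditions on the bounded domain. The latter evaluates only pointwise values of $V$, never $\nabla V$, and an additive shift of $V$ (equivalently, the normalizing constant of $\rho_\infty$) merely rescales time, consistent with the claim that neither density estimation nor the normalizing constant is required.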