Sum-of-squares (SOS) optimization problems are commonly formulated and solved as semidefinite programs (SDPs). While largely satisfying in theory, the conventional SDP formulation has two undesirable properties: it requires a quadratic number of auxiliary variables (with detrimental consequences for the complexity of the problem), and it is ill-conditioned. In the first part of the talk, we will discuss alternatives based on nonsymmetric conic optimization methods, which resolve the first problem. We then show that this technique can be combined with polynomial interpolation, which improves conditioning and further reduces computational complexity. In the second part of the talk, we show that although this approach circumvents the SDP formulation, explicit SOS decompositions (usually computed from the optimal solution of the SDP) can still be obtained with little additional effort, and that even rigorous (rational) nonnegativity certificates can be computed easily with this approach, directly from the results of the numerical calculations.
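As a minimal illustration of the Gram-matrix idea behind the SDP formulation (a hypothetical toy example, not taken from the talk): a univariate polynomial p is SOS exactly when p(x) = v(x)^T Q v(x) for some positive semidefinite matrix Q, with v(x) a monomial basis vector. The free entries of Q are the auxiliary variables mentioned above; a Cholesky factorization of any feasible Q yields an explicit SOS decomposition.

```python
import numpy as np

# Toy example: certify that p(x) = x^4 + 2x^3 + 3x^2 + 2x + 1 is SOS
# by exhibiting a positive semidefinite Gram matrix Q with
#   p(x) = v(x)^T Q v(x),   v(x) = [1, x, x^2].

# One admissible Gram matrix; the off-diagonal entry Q[0, 2] (paired with
# Q[1, 1]) is the kind of free auxiliary variable an SDP solver searches over.
Q = np.array([[1.0, 1.0, 0.5],
              [1.0, 2.0, 1.0],
              [0.5, 1.0, 1.0]])

# Cholesky succeeds because Q is positive definite; Q = L @ L.T then gives
# the explicit decomposition p(x) = || L.T @ v(x) ||^2, a sum of squares.
L = np.linalg.cholesky(Q)

# Recover the coefficients of v^T Q v (degrees 0..4) and compare with p.
coeffs = np.zeros(5)
for i in range(3):
    for j in range(3):
        coeffs[i + j] += Q[i, j]
print(coeffs)  # [1. 2. 3. 2. 1.] -> matches x^4 + 2x^3 + 3x^2 + 2x + 1
```

For a degree-2d polynomial in n variables the basis v has O(n^d) entries, so Q carries a quadratic number of unknowns in the basis size; this is the scaling issue the nonsymmetric conic approach avoids.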