Since you're a beginner, I'd say it's much better to start by implementing your regression models and any necessary helper functions (train/test split, etc.) yourself. Learn the necessary linear algebra and quadratic programming, and try to implement linear regression, logistic regression, and SVMs using only numpy and cvxpy (see the sketch below).
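To make that concrete, here's a rough sketch of what "doing it yourself" could look like: a hand-rolled train/test split and normal-equation linear regression in numpy, plus a soft-margin linear SVM posed as a quadratic program in cvxpy. The function names and toy data are just illustrative, not any standard API.

```python
import numpy as np
import cvxpy as cp

def train_test_split(X, y, test_frac=0.2, seed=0):
    # Shuffle indices and carve off a test fraction (illustrative helper).
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_test = int(len(X) * test_frac)
    test, train = idx[:n_test], idx[n_test:]
    return X[train], X[test], y[train], y[test]

def fit_linear_regression(X, y):
    # Append a bias column and solve the least-squares normal equations.
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w  # last entry is the intercept

def fit_linear_svm(X, y, C=1.0):
    # Soft-margin SVM primal as a QP; labels y must be in {-1, +1}.
    n, d = X.shape
    w, b, xi = cp.Variable(d), cp.Variable(), cp.Variable(n)
    objective = cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi))
    constraints = [cp.multiply(y, X @ w + b) >= 1 - xi, xi >= 0]
    cp.Problem(objective, constraints).solve()
    return w.value, b.value

# Toy usage on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 3.0 + 0.1 * rng.normal(size=200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y)
w = fit_linear_regression(X_tr, y_tr)
```

Writing these out forces you to actually look at the math (normal equations, margins, slack variables) instead of just calling a fit method.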
Once you get the hang of it, you can jump straight into sklearn, confident that you roughly understand what those "black boxes" are actually doing, which will also help you a lot with troubleshooting.
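A nice sanity check at that point (again just a sketch, with made-up toy data) is to verify that sklearn recovers essentially the same coefficients as your own implementation:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 3.0 + 0.1 * rng.normal(size=200)

# sklearn's "black box"...
model = LinearRegression().fit(X, y)

# ...versus the normal equations solved by hand with numpy.
Xb = np.hstack([X, np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

print(model.coef_, model.intercept_)  # should closely match...
print(w[:-1], w[-1])                  # ...the hand-rolled solution
```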
For neural networks and deep learning, pytorch is establishing itself as the industry standard right now. Look up "adjoint automatic differentiation" ("backpropagation" doesn't do it justice, since pytorch actually implements a very general, dynamic form of AAD) and you'll understand the "magic" behind the gradients pytorch gives you. Karpathy's YouTube tutorials are a really good introduction to AAD/autodiff in the context of deep learning.
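If you want a tiny taste of what that autodiff looks like in practice, here's a minimal sketch (toy data and variable names are just for illustration): you build a computation out of tensors, call backward(), and pytorch hands you the gradients without you writing any derivative code.

```python
import torch

# Toy data and parameters; requires_grad tells pytorch to track them.
x = torch.randn(10, 3)
y = x @ torch.tensor([1.5, -2.0, 0.5])
w = torch.randn(3, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

# Mean squared error of a linear model.
loss = ((x @ w + b - y) ** 2).mean()

# Reverse-mode AD through the recorded computation graph.
loss.backward()

print(w.grad, b.grad)  # d(loss)/dw and d(loss)/db, computed for you
```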