this post was submitted on 06 Nov 2024
133 points (97.8% liked)

Programming

So I'm no expert, but I've been a hobbyist C and Rust dev for a while now, and I've installed tons of programs from GitHub and elsewhere that required manual compilation or other hoops to jump through. Yet I'm constantly befuddled when installing Python apps. They always seem to need a very specific (often outdated) version of Python, require a bunch of venv nonsense, googling turns up tons of outdated info that no longer works, and they generally seem incredibly unportable. As someone who doesn't work in Python, its ecosystem seems more obtuse than any other language's. Why is it like this?

(page 2) 14 comments
[–] Balinares@pawb.social 3 points 17 hours ago

It... depends. There is some great tooling for Python -- this was less true only a few years ago, mind you -- but the landscape is very much in flux, and usage of the modern stuff is not yet widespread. And a lot of the legacy stuff has a whole host of pitfalls.

Things are broadly progressing in the right direction, and I'd say I'm cautiously optimistic, although if you have to deal with anything related to conda then for the time being: good luck, and sorry.

[–] DarkThoughts@fedia.io 2 points 17 hours ago (3 children)

Tried to install Automatic1111 for Stable Diffusion in an Arch distrobox, and despite editing the .sh file to point to the older tarballed Python version as advised on GitHub, it still tells me it uses the most up-to-date one that's installed system-wide and thus can't install PyTorch. And that's pretty much where my personal knowledge ends, and apparently that of those (i.e. that one person) on GitHub. ¯\_(ツ)_/¯

Always funny when people urge you to ask for help but no one ends up actually helping.

[–] tal@lemmy.today 1 points 17 hours ago* (last edited 16 hours ago)

despite editing the .sh file to point to the older tarballed Python version as advised on GitHub, it still tells me it uses the most up-to-date one that's installed system-wide and thus can't install PyTorch.

Can you paste your commands and output?

If you want, maybe on !imageai@sh.itjust.works, since I think that people seeing how to get Automatic1111 set up might help others.

I've set it up myself, and I don't mind taking a stab at getting it working, especially if it might help get others over the hump to a local Automatic1111 installation.
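If it helps in the meantime: on a standard stable-diffusion-webui checkout, I believe the variable you want is python_cmd in webui-user.sh (assuming that's the .sh file you edited), since webui.sh uses it when it creates the venv. A sketch, with a placeholder path:

    # in webui-user.sh
    python_cmd="/path/to/old/python3.10"    # placeholder; point it at the tarballed interpreter
    # webui.sh then runs "${python_cmd}" -m venv, so the venv picks up that version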

[–] it_depends_man@lemmy.world 1 points 16 hours ago (1 children)

The difficulty with python tooling is that you have to learn which tools you can and should completely ignore.

Unless you are a 100x engineer managing 500 projects with conflicting versions, build systems, docker, websites, and AAAH...

  • you don't really need venvs
  • you should not use more than one package manager (I recommend pip) and you should cling to it with all your might and never switch. Mixing package managers, e.g. conda with Linux system installers like apt, is the problem; just using one is fine (see the sketch after this list)
  • you don't *need* any other tools. They are bonuses that you should learn and adopt exactly when you need them, and not before (type-hint checkers, linters, test runners, etc.)
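A minimal sketch of that one-package-manager workflow (the package name is just an example):

    # one tool, per-user installs, no sudo, no venv
    python -m pip install --user requests
    python -c "import requests; print(requests.__version__)"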

Why is it like this?

Isolation for reliability, because it costs the businesses real $$$ when stuff goes down.

venvs exist to prevent the case where "project 1" and "project 2" use the same library "foobar". Except "project 1" is old, its maintainer is held up and can't update as fast, while "project 2" is a cutting-edge startup that always uses the newest tech.

When Python imports a library, it uses "the library" that is installed. If project 2 uses foobar version 15.9, which changed functionality, and project 1 uses foobar version 1.0, you always get a bug in either project 1 or project 2. Venvs solve this by providing project-specific sets of libraries and interpreters.
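A minimal sketch of that isolation (foobar stands in for any real package):

    # each project gets its own interpreter and site-packages
    cd project1 && python -m venv .venv && .venv/bin/pip install 'foobar==1.0'
    cd ../project2 && python -m venv .venv && .venv/bin/pip install 'foobar==15.9'
    # each project's .venv/bin/python now sees only its own foobar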

In practice, for many if not most users, this is meaningless, because if you're making e.g. a plot with matplotlib, that API won't change under you. But people have "best practices", so they do this stuff even when they don't need it.

It is a tradeoff between being fine with breakage and fixing it when it occurs and not being fine with breakage. The two approaches won't mix.

very specific (often outdated) version of python,

They are giving you the version that they know worked. Often you can just remove the specific version pin and it will work fine because, again, things don't actually change that much. But still, the project that's online reflects the last known working state.
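For example, a pinned requirements.txt entry (foobar again hypothetical) can often be loosened:

    # as published (the exact version the author tested):
    foobar==1.0.3
    # loosened (accept newer releases within the same major version):
    foobar>=1.0,<2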

[–] ebc@lemmy.ca 1 points 16 hours ago (2 children)

Coming at this from the JS world... Why the heck would 2 projects share the same library? Seems like a pretty stupid idea that opens you up to a ton of issues, and for what, saving 200kb on your hard drive?

[–] it_depends_man@lemmy.world 3 points 15 hours ago* (last edited 15 hours ago)

Why the heck would 2 projects share the same library?

Coming from the olden days, with good package management, infrequent updates, and the idea that you did indeed want to save that x number of bytes on disk and in memory, installing only one copy was the way to go.

Python also wasn't exactly a highbrow academic effort to brainstorm the next big thing; it was built to be a simple tool, and that meant just fetching some library from your system was good enough. It only ended up being popular because it's very easy to get your feet wet and do something quick.

[–] jacksilver@lemmy.world 3 points 16 hours ago

Yeah, not sure I would listen to this guy. Setting up a venv for each project is the bare minimum on all the teams I've worked on.

That being said, Python envs can be GBs in size (especially when doing data science).

[–] ebc@lemmy.ca 1 points 17 hours ago (1 children)

I'm no Python expert either, and yeah, from an outsider's perspective it seems needlessly confusing. easy_install that's never been easy, pip that should absolutely be put on a Performance Improvement Plan, and now this venv nonsense.

You can criticize javascript's ridiculous dependencies all you want (left-pad?), but one thing that they absolutely got right is how to manage them. Everything's in node_modules and that's it. Yeah, you might get eleven copies of left-pad on your system, but you know what you NEVER get? Version conflicts between projects you're working on.
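The contrast, roughly (foo and the paths are purely illustrative):

    # JS: every project vendors its own dependency tree
    project1/node_modules/foo/   -> 1.0.0
    project2/node_modules/foo/   -> 2.0.0
    # classic global Python: one shared site-packages, last install wins
    /usr/lib/python3.x/site-packages/foo/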

[–] moreeni@lemm.ee 1 points 16 hours ago

Seriously. Those are EXACTLY the thoughts I had after I was forced to deal with Python after a ton of time writing projects in JS.

[–] onlinepersona@programming.dev -1 points 14 hours ago* (last edited 14 hours ago)

Difficult? How so? I find compiling C and C++ stuff much more difficult than anything Python. It never works on the first try, whereas with Python the chances are much, much higher.

What is so difficult to understand about virtual envs? You have global Python packages, you can also have per-user Python packages, and you can create virtual environments to install packages into. Why do people struggle to understand this?

The global packages are found thanks to default locations, which can be overridden with environment variables. Virtual environments set those environment variables to point to different locations.

python -m venv .venv/ means Python will execute the module venv and tell it to create a virtual environment in the .venv folder in the current directory. As mentioned above, the environment variables have to be set to actually use it; that's where source .venv/bin/activate comes into play (there are variants of the script for zsh and fish). Now you can run pip install $package and then run the package's command, if it has one.
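The whole flow, end to end ($package being whatever you're installing):

    python -m venv .venv          # create the environment
    source .venv/bin/activate     # point PATH and VIRTUAL_ENV at it
    which python                  # now resolves to .venv/bin/python
    pip install $package          # installs into .venv, not system-wide
    deactivate                    # restore the previous environment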

It's that simple. If you want to, you can make it difficult by doing sudo pip install $package and fucking up your global packages by possibly updating a dependency of another package - just like updating glibc from 1.2 to 1.3 and breaking every application that depends on 1.2, because glibc doesn't fucking follow goddamn semver.

As for old versions of Python, bro, give me a break. There's pyenv for that, if whatever old-ass package you're installing depends on an ancient 10-year-old Python version. You really think building a C++ package from 10 years ago will go more smoothly than Python? Have fun tracking down all the unlocked dependency versions that "Worked On My Machine™" at the start of the century.
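pyenv boils down to (the version number is just an example):

    pyenv install 3.8.18     # build and install the old interpreter
    pyenv local 3.8.18       # pin this directory to it (writes .python-version)
    python --version         # -> Python 3.8.18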

The only Python packages I have trouble installing are those with C/C++ dependencies, which have to be compiled at install time.

Y'all have got to be meme'ing.

Anti Commercial-AI license

[–] ravhall@discuss.online -3 points 17 hours ago (9 children)

This isn’t the answer you want, but Go(lang) is super easy to learn and is a ton faster than Python. Yes, it’s more difficult, but once you understand it, it’s got a lot going for it.
