It's because `unicode` was really broken, and a lot of the obvious breakage was when people mixed the two. So they did fix some of the obvious breakage, but they left a lot of the subtle breakage (in addition to breaking a lot of existing correct code, and introducing a completely nonsensical `bytes` class).
o11c
Python 2 had one mostly-working `str` class, and a mostly-broken `unicode` class.

Python 3, for some reason, got rid of the one that mostly worked, leaving no replacement. The closest you can get is to spam `surrogateescape` everywhere, which is both incorrect and has a significant performance cost - and that still leaves several APIs unavailable.
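To make the `surrogateescape` point concrete, here's a small sketch (my own illustration, not code from the thread) of the round-trip it provides: arbitrary non-UTF-8 bytes survive decode/encode, but the resulting `str` contains lone surrogates that other APIs choke on.

```python
# Latin-1 bytes that are invalid as UTF-8:
raw = b"caf\xe9"

# surrogateescape smuggles the bad byte through as a lone surrogate:
s = raw.decode("utf-8", errors="surrogateescape")
assert s == "caf\udce9"          # U+DC00 + 0xE9

# It round-trips losslessly back to the original bytes...
assert s.encode("utf-8", errors="surrogateescape") == raw

# ...but any API that encodes without the handler blows up:
try:
    s.encode("utf-8")
except UnicodeEncodeError:
    pass  # lone surrogates are not encodable
```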
Simply removing `str` indexing would've fixed the common user mistake, if that was really desirable. It's not like `unicode` indexing is meaningful either, and now large amounts of historical data can no longer be accessed from Python.
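A quick illustration (mine, not from the thread) of why code-point indexing isn't meaningful either: one user-perceived character can span several code points.

```python
import unicodedata

s = "e\u0301"           # 'e' + COMBINING ACUTE ACCENT, renders as "é"
assert len(s) == 2      # two code points, one grapheme
assert s[0] == "e"      # indexing splits the accent off

# The precomposed form is a *different* code-point sequence:
assert s != "\u00e9"
# Only after normalization do they compare equal:
assert unicodedata.normalize("NFC", s) == "\u00e9"
```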
The problem is that there's a severe hole in the ABCs: there is no distinction between "container whose elements are mutable" and "container whose elements and size are mutable".
(Relatedly, there's no distinction for supporting slice operations or not, e.g. `deque`.)
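The `deque` case can be demonstrated directly: it registers as a `MutableSequence`, so nothing in the type system warns you that slicing isn't actually supported.

```python
from collections import deque
from collections.abc import MutableSequence

d = deque([1, 2, 3, 4])
assert isinstance(d, MutableSequence)   # claims the full interface...

try:
    d[1:3]                              # ...but slice access raises
except TypeError:
    pass                                # CPython: "sequence index must be integer"
else:
    raise AssertionError("deque unexpectedly supports slicing")
```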
True, speed does matter somewhat. But even if `xterm` isn't the ultimate in speed, it's pretty good. It starts up instantly (the benefit of no extraneous libraries); the worst you can say is that it's occasionally limited to the framerate for certain output patterns, and if there's a clog you can always minimize it for a moment.
Speed is far from the only thing that matters in terminal emulators though. Correctness is critical.
The only terminals in whose correctness I have any confidence are `xterm` and `pangoterm`. And I suppose technically the BEL-for-ST extension is incorrect even there, but we have to live with that, and a workaround is available.
A lot of terminal emulators end up hard-coding a handful of common sequences, and fail to correctly ignore sequences they don't implement. And worse, many go on to implement sequences that cannot be correctly handled.
One simple example that usually fails: `\e!!F`. Nastier, however, are the ones that ignore the intermediate bytes and execute some unrelated command instead.
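A minimal sketch (my own, not taken from any particular emulator) of the ECMA-48 parsing rule these emulators get wrong, shown for CSI sequences: parameter bytes (0x30-0x3F) come first, then intermediate bytes (0x20-0x2F), then one final byte (0x40-0x7E), and dispatch must key on the *exact* intermediates-plus-final pair - anything unrecognized is consumed and ignored, never routed to a lookalike command.

```python
def parse_csi(stream):
    """Consume one CSI body (the bytes after ESC [).

    Returns (params, intermediates, final) or None if malformed.
    """
    params = bytearray()
    intermediates = bytearray()
    for b in stream:
        if 0x30 <= b <= 0x3F and not intermediates:
            params.append(b)        # parameter bytes, only before intermediates
        elif 0x20 <= b <= 0x2F:
            intermediates.append(b)  # intermediate bytes
        elif 0x40 <= b <= 0x7E:
            return bytes(params), bytes(intermediates), b  # final byte
        else:
            return None             # malformed; abort the sequence
    return None                     # ran out of input

def dispatch(seq):
    params, inter, final = seq
    # Dispatch only on the exact (intermediates, final) pair:
    if (inter, final) == (b"", ord("m")):
        return "SGR"
    # Unknown final, or a known final with unexpected intermediates,
    # is ignored -- not executed as some unrelated command.
    return "ignored"
```

For example, `dispatch(parse_csi(b"1;31m"))` is `"SGR"`, while `b"1 m"` (an intermediate space before the `m`) is correctly ignored rather than treated as SGR.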
I can't be bothered to pick apart specific terminals anymore. Most don't even know what an IR is.
and I already explained that `Union` is a thing.
That still doesn't explain why duck typing is ever a thing beyond "I'm too lazy to write `extends BaseClass`". There's simply no reason to want it.
Then - ignoring dunders that have weird rules - what, pray tell, is the point of protocols, other than backward compatibility with historical fragile ducks (at the cost of future backwards compatibility)? Why are people afraid of using real base classes?
The fact that it is possible to subclass a `Protocol` is useless, since you can't enforce subclassing - which is necessary for maintainable software refactoring - unless it's a purely internal interface (in which case the `Union` approach is probably still better).
That PEP link includes broken examples so it's really not worth much as a reference.
(For that matter, the `Sequence` interface is also broken in Python, in case you need another historical example of why protocols are a bad idea.)
`chunks: [AtomicPtr<_>; 64],` appears before the explanation of why 64 works, and was confusing at first glance, since this is completely different from the previous use of `64`, which was arbitrary. I was expecting a variable-size array of fixed-size arrays at first (using something like an rwlock you can copy/grow the internal vector without blocking - if there was a writer, the last reader of the old allocation frees it).
Instead of separate flags, what about a single (fixed-size, if the chunks are) atomic bitset? This would increase contention slightly, but that only happens briefly during growth, not during accesses. Many architectures actually have dedicated atomic bit operations, though sadly it's hard to get compilers to generate them.
The obvious API addition is for a single thread to push several elements at once, which can be done more efficiently.
Aside: note that `requests` is sloppy there; it should use either `raise ... from e` to make the cause explicit, or `raise ... from None` to hide it. Default propagation is supposed to imply that the second exception was unexpected.
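A sketch of the three chaining behaviours the aside distinguishes (the function names here are illustrative, not `requests`' actual code):

```python
def parse(data):
    try:
        return int(data)
    except ValueError as e:
        # Explicit cause: traceback says "The above exception was the
        # direct cause of..." -- the right choice when deliberately wrapping.
        raise RuntimeError("bad payload") from e

def parse_hiding(data):
    try:
        return int(data)
    except ValueError:
        # Suppress the context entirely; the original is hidden.
        raise RuntimeError("bad payload") from None

def parse_sloppy(data):
    try:
        return int(data)
    except ValueError:
        # Default propagation: "During handling of the above exception,
        # another exception occurred" -- which is supposed to signal an
        # *unexpected* second error, not deliberate wrapping.
        raise RuntimeError("bad payload")
```

The difference is visible on the exception objects: `parse` sets `__cause__`, `parse_hiding` sets `__suppress_context__`, and `parse_sloppy` leaves only the implicit `__context__`.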
In practice, `Protocol`s are a way to make "superclasses" that you can never add features to (for example, `readinto`, despite being critical for performance, is utterly broken in Python). This should normally be avoided at almost all costs, but for some reason people hate real base classes?
If you really want to do something like the original article, where there's a C-implemented class that you can't change, you're best off using a (named) `Union` of two similar types, not a `Protocol`.
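A minimal sketch of that named-`Union` approach, using `io.FileIO` as a stand-in for the unchangeable C-implemented class (the `MemoryReader`/`Reader` names are mine, for illustration):

```python
from typing import Union
import io

class MemoryReader:
    """Our own class exposing the same read() shape."""
    def __init__(self, data: bytes) -> None:
        self._buf = io.BytesIO(data)

    def read(self, n: int = -1) -> bytes:
        return self._buf.read(n)

# A *named* union: the checker sees exactly two concrete types, and
# adding a third member later is an explicit, reviewable change --
# unlike a Protocol, which anything can silently match.
Reader = Union[io.FileIO, MemoryReader]

def read_all(r: Reader) -> bytes:
    return r.read()
```

Usage: `read_all(MemoryReader(b"hi"))` returns `b"hi"`, and a real `io.FileIO` works the same way; any other type is a type error at the call site.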
I suppose they are useful for operator overloading, but that's about it. And I'm not sure type checkers actually implement that properly anyway; overloading is really nasty in a dynamically-typed language.
From my experience, Cinnamon is definitely immature compared to KDE. Very poor support for virtual desktops is what jumped out at me most. There were also some problems regarding shortcuts and/or keyboard layout, I think, and probably others, but I only played with it for a couple of weeks while limited to a LiveCD.