gh-104719: Restore Tokenize module constants #104722
Conversation
I was wondering if you could do something like this:

```python
_deprecated_objects = {
    "group": lambda *choices: '(' + '|'.join(choices) + ')',
    "any": lambda *choices: group(*choices) + '*',
    "maybe": lambda *choices: group(*choices) + '?',
    "Whitespace": r'[ \f\t]*',
    # etc.
}
```

And then at the end of the file have:

```python
def __getattr__(name):
    if name in _deprecated_objects:
        import warnings
        warnings._deprecated(
            f"tokenize.{name}",
            message=(
                "{name} is untested, undocumented, and deprecated. "
                "It will be removed in Python {remove}."
            ),
            remove=(3, 14)
        )
        obj = globals()[name] = _deprecated_objects[name]
        return obj
    raise AttributeError(f"module 'tokenize' has no attribute '{name}'")
```

But I will defer to you and @pablogsal as to whether that's a good idea or not.
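For reference, here is a standalone, runnable sketch of the same module-level `__getattr__` deprecation pattern (PEP 562), using the public `warnings.warn` rather than the private `warnings._deprecated` helper; the module layout and the constant shown are illustrative, not tokenize's actual code:

```python
# demo_getattr.py -- illustrative sketch of a PEP 562 module __getattr__
# that emits a DeprecationWarning on first access to a legacy name.
import sys
import warnings

_deprecated_objects = {
    "Whitespace": r'[ \f\t]*',
}

def __getattr__(name):
    if name in _deprecated_objects:
        warnings.warn(
            f"{__name__}.{name} is deprecated and will be removed "
            "in a future release.",
            DeprecationWarning,
            stacklevel=2,
        )
        # Cache the value in the module dict so the warning only
        # fires on the first access.
        obj = globals()[name] = _deprecated_objects[name]
        return obj
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

if __name__ == "__main__":
    this_module = sys.modules[__name__]
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        print(getattr(this_module, "Whitespace"))   # [ \f\t]*
    print(caught[0].category.__name__)              # DeprecationWarning
```

Because the looked-up value is written back into `globals()`, only the first attribute access goes through `__getattr__`; later lookups hit the module dict directly and emit no warning.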
Looks like this does fix the immediate issue, though -- IDLE is once again able to open .py files.
For the time being, let's just scope this PR to restoring the constants and keep the discussion open in the main issue. Also, let's wait until @Yhg1s says what he would prefer. Among other things, I can imagine an outcome in which we promote these to the public interface.
SGTM.
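For context, "restoring the constants" amounts to putting the old helper functions and regex fragments back into `Lib/tokenize.py` as plain module-level names; a minimal sketch based only on the definitions quoted in the suggestion above (only a few of the restored names shown):

```python
# Module-level helpers and regex fragments restored in tokenize.py.
def group(*choices):
    return '(' + '|'.join(choices) + ')'

def any(*choices):           # intentionally shadows the builtin any()
    return group(*choices) + '*'

def maybe(*choices):
    return group(*choices) + '?'

Whitespace = r'[ \f\t]*'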