Fix: allow newer dukpy versions to be used #85

Open

peircej wants to merge 1 commit into metapensiero:master from peircej:master

Conversation

@peircej commented Jul 30, 2025

Newer Python installs don't have wheels for old dukpy (<0.2.3), which then needs compiling, so installing metapensiero generally fails. The only thing causing the metapensiero tests to fail with newer dukpy was just the assignment `this.console = {log: print}`, which it seems we can live without. Simply removing it gets us back up and running.

See https://gitlab.com/metapensiero/metapensiero.pj/-/issues/38

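For context, the pattern the PR removes looks roughly like the sketch below: a test helper injects a `console` shim into dukpy's JavaScript context so JS code can call `console.log`, but newer dukpy works fine as long as the code under test never touches `console`. This is a hypothetical reconstruction, not the project's exact code, and it guards the third-party `dukpy` import so it degrades gracefully when the library isn't installed.

```python
# Hypothetical sketch of the shim the PR removes. Assumes only the
# public dukpy.evaljs() entry point; pip install dukpy to run it.
try:
    import dukpy  # third-party JavaScript interpreter bindings
except ImportError:
    dukpy = None


def eval_js(src):
    """Evaluate a JS snippet and return its result (None if dukpy is absent)."""
    if dukpy is None:
        return None
    # Newer dukpy releases need no console shim: simply evaluate the
    # snippet and let the last expression's value come back to Python.
    return dukpy.evaljs(src)


print(eval_js("1 + 2"))
```

The removed shim only mattered for tests that printed from JS; plain expression evaluation, as above, never needed it.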
@peircej (Author) commented Jul 30, 2025

@azazel75 I submitted the same request to gitlab.com as I'm not sure which is your preferred repo. It would be great if you could pull this in; the current situation is making it hard for my users to install metapensiero right now.

@peircej (Author) commented Jul 30, 2025

You can see the test suite pass here.

@azazel75 (Collaborator) commented

Hi @peircej, I can publish a new release with your PR this evening or tomorrow.

@azazel75 (Collaborator) commented

I was a bit naive about this, I'm sorry. The tests passed only because the test suite used Python versions that are now EOL ("python37", "python38", "python39"). I've updated the environment to use current Python versions (3.10, 3.11, 3.12, 3.13), but the testing explodes. For one, the library "meta", which is a dependency, has not been updated recently, and this causes various test failures. I don't yet know how to proceed: the choices are to open a PR updating the meta package, or to move its code into metapensiero.pj and drop the dependency. Either way it's much more effort to fix things.

@peircej (Author) commented Aug 29, 2025

Oh no! How annoying. I think I ran the tests on 3.10 (macOS) and it was fine, but I'll look again.

@peircej (Author) commented Oct 10, 2025

I just tried py3.10 on a local machine and the tests passed fine. It's true that py3.11 failed one test, due to a KeyError in meta.

So could you maybe just limit support to py3.10, which is not EOL but doesn't have the breaking changes of later versions?

For our project I'll certainly make it a priority to move away from metapensiero, though, anticipating that it won't be maintained further. Thanks for your help along the way 😁 🙏
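One way to express that cap in packaging metadata would be a `python_requires` bound. This is just a sketch assuming a setuptools `setup.cfg`; metapensiero.pj's actual build configuration may differ, and the lower bound here is illustrative.

```ini
[options]
# Hypothetical: cap supported interpreters below 3.11 until the
# "meta" dependency handles the 3.11+ bytecode/opcode changes.
python_requires = >=3.7, <3.11
```

With this in place, pip on Python 3.11+ would refuse to install the release rather than installing a version whose dependency breaks at import time.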

Test details

With a py3.10 venv, pytest gave me:

(metapensiero.pj):tests/ (master) $ uv pip install meta pytest .                                        
Using Python 3.10.17 environment at: /Users/lpzjwp/code/metapensiero.pj/.venv
Resolved 1 package in 227ms
Installed 1 package in 55ms
 + meta==1.0.2
(metapensiero.pj):tests/ (master) $ pytest                                                                                                          
========================================================================= test session starts =========================================================================
platform darwin -- Python 3.10.17, pytest-8.4.2, pluggy-1.6.0
rootdir: metapensiero.pj/tests
collected 103 items                                                                                                                                                   

test_ast.py s.s.s.s.s.ss.s.s.s.ss.ss...                                                                                                                         [ 26%]
test_evaljs.py .........X...................................                                                                                                    [ 69%]
test_various.py ...............................                                                                                                                 [100%]

============================================================== 88 passed, 14 skipped, 1 xpassed in 4.18s ==============================================================

With a py3.11 venv I get:

========================================================================= test session starts =========================================================================
platform darwin -- Python 3.11.12, pytest-8.4.2, pluggy-1.6.0
rootdir: metapensiero.pj/tests
collected 103 items                                                                                                                                                   

test_ast.py sFs.s.s.s.ss.s.s.s.ss.ss...                                                                                                                         [ 26%]
test_evaljs.py .........X...................................                                                                                                    [ 69%]
test_various.py ...............................                                                                                                                 [100%]

============================================================================== FAILURES ===============================================================================
_______________________________________________________________ TestASTFromFS.test_ast_dump[all_3.8.py] _______________________________________________________________

self = <test_ast.TestASTFromFS object at 0x1018982d0>, name = 'all_3.8.py'
py_code = <code object <module> at 0x1018f4440, file "/Users/lpzjwp/code/metapensiero.pj/tests/test_ast/test_ast_dump/all_3.8.py", line 1>
py_src = "def func():\n\n    __all__ = ['foo', 'bar']\n\n    __all__ = ('foo', 'bar')\n", options = {}
expected = "FunctionDef(args=arguments(args=[],\n                           defaults=[],\n                           kw_defaults=...,\n            decorator_list=[],\n            name='func',\n            returns=None,\n            type_comment=None)"
astdump = <function ast_dump_object at 0x101626d40>

    def test_ast_dump(self, name, py_code, py_src, options, expected, astdump):
>       node, dump = astdump(py_code, **options)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^

test_ast.py:15: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../.venv/lib/python3.11/site-packages/metapensiero/pj/testing.py:29: in ast_dump_object
    from meta.asttools import str_ast
../.venv/lib/python3.11/site-packages/meta/__init__.py:2: in <module>
    from meta.decompiler.instructions import make_module
../.venv/lib/python3.11/site-packages/meta/decompiler/__init__.py:7: in <module>
    from meta.decompiler.instructions import make_module, make_function
../.venv/lib/python3.11/site-packages/meta/decompiler/instructions.py:9: in <module>
    from meta.decompiler.control_flow_instructions import CtrlFlowInstructions
../.venv/lib/python3.11/site-packages/meta/decompiler/control_flow_instructions.py:15: in <module>
    JUMP_OPS = [opcode.opmap[name] for name in JUMPS]
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

.0 = <list_iterator object at 0x101b9d030>

>   JUMP_OPS = [opcode.opmap[name] for name in JUMPS]
                ^^^^^^^^^^^^^^^^^^
E   KeyError: 'POP_JUMP_IF_FALSE'

../.venv/lib/python3.11/site-packages/meta/decompiler/control_flow_instructions.py:15: KeyError
======================================================================= short test summary info =======================================================================
FAILED test_ast.py::TestASTFromFS::test_ast_dump[all_3.8.py] - KeyError: 'POP_JUMP_IF_FALSE'
========================================================= 1 failed, 87 passed, 14 skipped, 1 xpassed in 3.75s =========================================================
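For context on the traceback above: meta builds its jump-opcode table with a direct `opcode.opmap[name]` lookup, and Python 3.11 renamed several jump opcodes (for example, `POP_JUMP_IF_FALSE` was split into forward/backward variants), so the lookup raises `KeyError` at import time. A defensive sketch of a version-tolerant lookup (illustrative only, not meta's actual code) would simply skip names missing from the current interpreter:

```python
import opcode

# Illustrative subset of the jump-opcode names meta looks up.
JUMPS = ["POP_JUMP_IF_FALSE", "POP_JUMP_IF_TRUE", "JUMP_FORWARD"]

# A bare opcode.opmap[name] raises KeyError on interpreters where a
# name was renamed or removed; filtering keeps the table buildable
# across Python versions (at the cost of silently dropping opcodes).
JUMP_OPS = [opcode.opmap[name] for name in JUMPS if name in opcode.opmap]

print(JUMP_OPS)
```

A real fix in meta would need to map the renamed 3.11+ opcodes to their handlers rather than just skipping them, which is presumably why updating the dependency is non-trivial.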
