Unpythonic: Supercharge your Python with parts of Lisp and Haskell
In the spirit of toolz, we provide missing features for Python, mainly from the list processing tradition, but with some Haskellisms mixed in. We extend the language with a set of syntactic macros. We also provide an in-process, background REPL server for live inspection and hot-patching. The emphasis is on clear, pythonic syntax, making features work together, and obsessive correctness.
Some hypertext features of this README, such as local links to detailed documentation, and expandable example highlights, are not supported when viewed on PyPI; view on GitHub to have those work properly.
Dependencies: none required.
mcpyrate is optional; it enables the syntactic macro layer, an interactive macro REPL, and some example dialects.
As of v0.15.3, unpythonic runs on CPython 3.8, 3.9, 3.10, 3.11 and 3.12, and on PyPy3 (language versions 3.8, 3.9 and 3.10); the CI process verifies that the tests pass on those platforms. New Python versions are added, and old ones removed, following the Long-term support roadmap.
This depends on the purpose of each feature, as well as ease-of-use considerations. See the design notes for more information.
A small, limited-space overview of the overall flavor. There is a lot more that does not fit here, especially in the pure-Python feature set. The examples given here are simple, and not necessarily of the most general form the constructs support. See the full documentation and unit tests for more examples.
Or if you just want to take this for a test run, start the built-in demo app:
python3 -m unpythonic.net.server
Once a server is running, to connect:
python3 -m unpythonic.net.client 127.0.0.1
This gives you a REPL, inside your live process, with all the power of Python. You can importlib.reload any module, and through sys.modules, inspect or overwrite any name at the top level of any module. You can pickle.dump your data. Or do anything you want with/to the live state of your app.
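The hot-patching part of this is plain stdlib Python. A minimal sketch of what a live session can do (the module `json` and the attribute `MY_PATCH` are stand-ins for illustration only):

```python
import importlib
import sys

# Any module the process has already imported is reachable through sys.modules.
mod = importlib.import_module("json")
assert sys.modules["json"] is mod

# Hot-patch: add (or overwrite) a top-level name in the live module.
# `MY_PATCH` is a hypothetical name used only for this illustration.
mod.MY_PATCH = "hello"
assert sys.modules["json"].MY_PATCH == "hello"  # every importer sees the change

# importlib.reload re-executes the module's code in the *same* namespace:
# freshly defined names are updated, and extra names persist.
importlib.reload(mod)
assert mod.MY_PATCH == "hello"
```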
You can have multiple REPL sessions connected simultaneously. When your app exits (for any reason), the server automatically shuts down, closing all connections if any remain. But exiting the client leaves the server running, so you can connect again later - that's the whole point.
Optionally, if you have mcpyrate, the REPL sessions support importing, invoking and defining macros.
We bind arguments to parameters like Python itself does, so it does not matter whether arguments are passed by position or by name during currying. We support @generic multiple-dispatch functions.
We also feature a Haskell-inspired passthrough system: any args and kwargs that are not accepted by the call signature will be passed through. This is useful when a curried function returns a new function, which is then the target for the passthrough. See the docs for details.
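To make "binds arguments like Python itself does" concrete, here is a minimal currying sketch in plain Python, using `inspect.signature` to decide when a call is complete. This is an illustration only, not unpythonic's actual implementation (which also handles the passthrough):

```python
import inspect
from functools import partial

def curry(f):
    """Curry f: accumulate arguments until the call signature is satisfied."""
    sig = inspect.signature(f)
    def curried(*args, **kwargs):
        try:
            sig.bind(*args, **kwargs)  # complete call? then invoke
        except TypeError:
            # Not enough arguments yet; keep accumulating.
            return curry(partial(f, *args, **kwargs))
        return f(*args, **kwargs)
    return curried

@curry
def add3(a, b, c):
    return a + b + c

assert add3(1)(2)(3) == 6
assert add3(1, 2)(3) == 6
assert add3(c=3)(1)(2) == 6  # arguments may also be passed by name
```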
```python
from unpythonic import generic

@generic
def my_range(stop: int):  # create the generic function and the first multimethod
    return my_range(0, 1, stop)

@generic
def my_range(start: int, stop: int):  # further registrations add more multimethods
    return my_range(start, 1, stop)

@generic
def my_range(start: int, step: int, stop: int):
    return start, step, stop
```
This is a purely run-time implementation, so it does not give performance benefits, but it can make code more readable, and makes it modular to add support for new input types (or different call signatures) to an existing function later.
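For comparison, the stdlib offers run-time dispatch on the first argument only; this `functools.singledispatch` sketch shows the same "add support for new input types later" modularity in plain Python:

```python
from functools import singledispatch

@singledispatch
def describe(x):  # default implementation
    return f"something: {x!r}"

@describe.register
def _(x: int):  # registered later, without touching the original function
    return f"integer: {x}"

@describe.register
def _(x: str):
    return f"string: {x!r}"

assert describe(42) == "integer: 42"
assert describe("hi") == "string: 'hi'"
assert describe(3.14) == "something: 3.14"
```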
```python
import typing
from unpythonic import generic, augment

class FunninessTrait:
    pass
class IsFunny(FunninessTrait):
    pass
class IsNotFunny(FunninessTrait):
    pass

@generic
def funny(x: typing.Any):  # default
    raise NotImplementedError(f"`funny` trait not registered for anything matching {type(x)}")

@augment(funny)
def funny(x: str):  # noqa: F811
    return IsFunny()

@augment(funny)
def funny(x: int):  # noqa: F811
    return IsNotFunny()

@generic
def laugh(x: typing.Any):
    return laugh(funny(x), x)

@augment(laugh)
def laugh(traitvalue: IsFunny, x: typing.Any):
    return f"Ha ha ha, {x} is funny!"

@augment(laugh)
def laugh(traitvalue: IsNotFunny, x: typing.Any):
    return f"{x} is not funny."

assert laugh("that") == "Ha ha ha, that is funny!"
assert laugh(42) == "42 is not funny."
```
Conditions: resumable, modular error handling, like in Common Lisp.
```python
from unpythonic import error, restarts, handlers, invoke, use_value, unbox

class MyError(ValueError):
    def __init__(self, value):  # We want to act on the value, so save it.
        self.value = value

def lowlevel(lst):
    _drop = object()  # gensym/nonce
    out = []
    for k in lst:
        # Provide several different error recovery strategies.
        with restarts(use_value=(lambda x: x),
                      halve=(lambda x: x // 2),
                      drop=(lambda: _drop)) as result:
            if k > 9000:
                error(MyError(k))
            # This is reached when no error occurs.
            # `result` is a box; send k into it.
            result << k
        # Now the result box contains either k,
        # or the return value of one of the restarts.
        r = unbox(result)  # get the value from the box
        if r is not _drop:
            out.append(r)
    return out

def highlevel():
    # Choose which error recovery strategy to use...
    with handlers((MyError, lambda c: use_value(c.value))):
        assert lowlevel([17, 10000, 23, 42]) == [17, 10000, 23, 42]

    # ...on a per-use-site basis...
    with handlers((MyError, lambda c: invoke("halve", c.value))):
        assert lowlevel([17, 10000, 23, 42]) == [17, 5000, 23, 42]

    # ...without changing the low-level code.
    with handlers((MyError, lambda: invoke("drop"))):
        assert lowlevel([17, 10000, 23, 42]) == [17, 23, 42]

highlevel()
```
Conditions only shine in larger systems, with restarts set up at multiple levels of the call stack; this example is too small to demonstrate that. The single-level case here could be implemented as an error-handling mode parameter for the example's only low-level function.
With multiple levels, it becomes apparent that this mode parameter must be threaded through the API at each level, unless it is stored as a dynamic variable (see unpythonic.dyn). But then, there can be several types of errors, and the error-handling mode parameters - one for each error type - have to be shepherded in an intricate manner. A stack is needed, so that an inner level may temporarily override the handler for a particular error type...
The condition system is the clean, general solution to this problem. It automatically scopes handlers to their dynamic extent, and manages the handler stack automatically. In other words, it dynamically binds error-handling modes (for several types of errors, if desired) in a controlled, easily understood manner. The local programmability (i.e. the fact that a handler is not just a restart name, but an arbitrary function) is a bonus for additional flexibility.
If this sounds a lot like an exception system, that's because conditions are the supercharged sister of exceptions. The condition model cleanly separates mechanism from policy, while otherwise remaining similar to the exception model.
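To make the mechanism concrete, here is a toy sketch of the handler/restart idea in plain Python. This is not unpythonic's API; it only illustrates how dynamically scoped handlers let the high level choose a recovery strategy that runs at the low-level signaling site:

```python
# Toy sketch: handlers form a dynamically scoped stack; signaling walks the
# stack, and the chosen handler picks a recovery strategy by naming a restart.
_handlers = []

class _InvokeRestart(Exception):
    def __init__(self, name, *args):
        self.name, self.args = name, args

def signal(condition):
    for handler in reversed(_handlers):
        handler(condition)  # a handler acts by raising _InvokeRestart

def with_restarts(body, **restarts):
    try:
        return body()
    except _InvokeRestart as r:
        return restarts[r.name](*r.args)

def parse(text):  # low level: offers a recovery strategy, does not choose it
    def body():
        if not text.isdigit():
            signal(ValueError(text))
        return int(text)
    return with_restarts(body, use_value=lambda x: x)

def use_zero(condition):  # high level: chooses the strategy
    raise _InvokeRestart("use_value", 0)

_handlers.append(use_zero)
assert parse("42") == 42
assert parse("bogus") == 0  # the handler chose the `use_value` restart
```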
A gensym is a guaranteed-unique string, which is useful as a nonce value. It's similar to the pythonic idiom nonce = object(), but with a nice repr, and object-identity-preserving pickle support.
```python
from unpythonic import sym  # lispy symbol

sandwich = sym("sandwich")
hamburger = sym("sandwich")  # a symbol's identity is determined by its name, only
assert hamburger is sandwich

assert str(sandwich) == "sandwich"          # symbols have a nice str()
assert repr(sandwich) == 'sym("sandwich")'  # and eval-able repr()
assert eval(repr(sandwich)) is sandwich

from pickle import dumps, loads
pickled_sandwich = dumps(sandwich)
unpickled_sandwich = loads(pickled_sandwich)
assert unpickled_sandwich is sandwich  # symbols survive a pickle roundtrip

from unpythonic import gensym  # gensym: make a new uninterned symbol

tabby = gensym("cat")
scottishfold = gensym("cat")
assert tabby is not scottishfold

pickled_tabby = dumps(tabby)
unpickled_tabby = loads(pickled_tabby)
assert unpickled_tabby is tabby  # also gensyms survive a pickle roundtrip
```
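The `nonce = object()` idiom mentioned above, in its common use as a can't-collide sentinel, looks like this in plain Python:

```python
# A fresh object() is unique by identity, so it cannot collide with any user
# data; that makes it a safe "no value given" marker.
NOTHING = object()

def get(d, key, default=NOTHING):
    value = d.get(key, NOTHING)
    if value is NOTHING:
        if default is NOTHING:
            raise KeyError(key)
        return default
    return value

assert get({"a": 1}, "a") == 1
assert get({"a": 1}, "b", default=0) == 0
```

Unlike a gensym, an `object()` sentinel has an unhelpful repr and loses its identity when pickled.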
```python
from unpythonic import withself, namelambda

fact = withself(lambda self, n: n * self(n - 1) if n > 1 else 1)  # see @trampolined to do this with TCO
assert fact(5) == 120

square = namelambda("square")(lambda x: x**2)
assert square.__name__ == "square"
assert square.__qualname__ == "square"      # or e.g. "somefunc.<locals>.square" if inside a function
assert square.__code__.co_name == "square"  # used by stack traces
```
```python
from itertools import count, takewhile
from unpythonic import memoize, gmemoize, islice

ncalls = 0
@memoize  # <-- important part
def square(x):
    global ncalls
    ncalls += 1
    return x**2
assert square(2) == 4
assert ncalls == 1
assert square(3) == 9
assert ncalls == 2
assert square(3) == 9
assert ncalls == 2  # called only once for each unique set of arguments

# "memoize lambda": classic evaluate-at-most-once thunk
thunk = memoize(lambda: print("hi from thunk"))
thunk()  # the message is printed only the first time
thunk()

@gmemoize  # <-- important part
def primes():  # FP sieve of Eratosthenes
    yield 2
    for n in count(start=3, step=2):
        if not any(n % p == 0 for p in takewhile(lambda x: x * x <= n, primes())):
            yield n

assert tuple(islice(primes())[:10]) == (2, 3, 5, 7, 11, 13, 17, 19, 23, 29)
```
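For comparison, the stdlib can express the nullary evaluate-at-most-once case with `functools.lru_cache`:

```python
from functools import lru_cache

calls = []  # record how many times the body runs

@lru_cache(maxsize=None)
def thunk():
    calls.append("ran")
    return 42

assert thunk() == 42
assert thunk() == 42
assert calls == ["ran"]  # the body executed only once
```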
```python
from unpythonic.syntax import macros, test, test_raises, fail, error, warn, the
from unpythonic.test.fixtures import session, testset, terminate, returns_normally

def f():
    raise RuntimeError("argh!")
def g(a, b):
    return a * b
    fail["this line should be unreachable"]

count = 0
def counter():
    global count
    count += 1
    return count

with session("simple framework demo"):
    with testset():
        test[2 + 2 == 4]
        test_raises[RuntimeError, f()]
        test[returns_normally(g(2, 3))]
        test[g(2, 3) == 6]
        # Use `the[]` (or several) in a `test[]` to declare what you want to inspect if the test fails.
        # Implicit `the[]`: in a comparison, the LHS; otherwise the whole expression. Used if no explicit `the[]`.
        test[the[counter()] < the[counter()]]

    with testset("outer"):
        with testset("inner 1"):
            test[g(6, 7) == 42]
        with testset("inner 2"):
            test[None is None]
        with testset("inner 3"):  # an empty testset is considered 100% passed.
            pass
        with testset("inner 4"):
            warn["This testset not implemented yet"]

    with testset("integration"):
        try:
            import blargly
        except ImportError:
            error["blargly not installed, cannot test integration with it."]
        else:
            ...  # blargly integration tests go here

    with testset(postproc=terminate):
        test[2 * 2 == 5]  # fails, terminating the nearest dynamically enclosing `with session`
        test[2 * 2 == 4]  # not reached
```
We provide the low-level syntactic constructs test[], test_raises[] and test_signals[], with the usual meanings. The last one is for testing code that uses conditions and restarts; see unpythonic.conditions.
The test macros also come in block variants, with test, with test_raises, with test_signals.
As usual in test frameworks, the testing constructs behave somewhat like assert, with the difference that a failure or error will not abort the whole unit (unless explicitly asked to do so).
```python
from unpythonic.syntax import macros, let, letseq, letrec

x = let[[a := 1, b := 2] in a + b]
y = letseq[[c := 1,  # LET SEQuential, like Scheme's let*
            c := 2 * c,
            c := 2 * c] in c]
z = letrec[[evenp := (lambda x: (x == 0) or oddp(x - 1)),  # LET mutually RECursive, like in Scheme
            oddp := (lambda x: (x != 0) and evenp(x - 1))]
           in evenp(42)]
```
```python
from unpythonic.syntax import macros, dlet

# In Python 3.8, use `@dlet(x << 0)` instead; in Python 3.9, use `@dlet(x := 0)`
@dlet[x := 0]  # let-over-lambda for Python
def count():
    return (x := x + 1)  # `name := value` rebinds in the let env
assert count() == 1
assert count() == 2
```
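Without macros, the same let-over-lambda effect can be had with a closure over a `nonlocal` variable, at the cost of an explicit factory function:

```python
def make_counter():
    x = 0  # private, persistent state, closed over by count
    def count():
        nonlocal x
        x += 1
        return x
    return count

count = make_counter()
assert count() == 1
assert count() == 2
```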
```python
from unpythonic.syntax import macros, lazify

with lazify:
    def my_if(p, a, b):
        if p:
            return a  # b never evaluated in this code path
        else:
            return b  # a never evaluated in this code path
    assert my_if(True, 23, 1/0) == 23
    assert my_if(False, 1/0, 42) == 42
```
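Without the macro, the same call-by-need effect requires wrapping each argument in an explicit thunk at every call site, which is exactly the boilerplate lazify automates away:

```python
def my_if(p, a, b):
    # a and b are zero-argument callables; only the chosen branch is forced.
    return a() if p else b()

assert my_if(True, lambda: 23, lambda: 1 / 0) == 23
assert my_if(False, lambda: 1 / 0, lambda: 42) == 42
```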
The Lispython dialect also comes with automatically named, multi-expression lambdas.
```python
from unpythonic.dialects import dialects, Lispython  # noqa: F401

def factorial(n):
    def f(k, acc):
        if k == 1:
            return acc
        f(k - 1, k * acc)  # tail call, eliminated by the dialect's implicit TCO
    f(n, acc=1)
assert factorial(4) == 24
factorial(5000)  # no crash

square = lambda x: x**2
assert square(3) == 9
assert square.__name__ == "square"

# - brackets denote a multiple-expression lambda body
#   (if you want to have one expression that is a literal list,
#    double the brackets: `lambda x: [[5 * x]]`)
# - local[name := value] makes an expression-local variable
g = lambda x: [local[y := 2 * x],
               y + 1]
assert g(10) == 21
```
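The "no crash" comes from tail-call optimization. A minimal trampoline sketch in plain Python shows roughly how TCO-by-trampolining works; this is only an illustration of the technique, not unpythonic's @trampolined implementation:

```python
def trampoline(f, *args, **kwargs):
    """Run f; as long as it returns a thunk (callable), keep bouncing."""
    result = f(*args, **kwargs)
    while callable(result):
        result = result()  # each bounce replaces a stack frame with a loop step
    return result

def fact(k, acc=1):
    if k <= 1:
        return acc
    return lambda: fact(k - 1, k * acc)  # tail call, represented as a thunk

assert trampoline(fact, 5) == 120
assert trampoline(fact, 5000) > 0  # deep "recursion" without stack overflow
```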
Pytkell: Automatic currying and implicitly lazy functions.
Clone the repo from GitHub. Then, navigate to it in a terminal, and:
pip install . --no-compile
If you intend to use the macro layer of unpythonic, the --no-compile flag is important. It prevents an incorrect precompilation, without macro support, that pip install would otherwise do at its bdist_wheel step.
For most Python projects such precompilation is just fine - it's just macro-enabled projects that shouldn't be precompiled with standard tools.
If --no-compile is NOT used, the precompiled bytecode cache may cause errors such as ImportError: cannot import name 'macros' from 'mcpyrate.quotes', when you try to e.g. from unpythonic.syntax import macros, let. In-tree, it might work, but against an installed copy, it will fail. It has happened that my CI setup did not detect this kind of failure.
This is a common issue when using macro expanders in Python.
Development mode (for developing unpythonic itself)
Starting with v0.15.5, unpythonic uses PDM to manage its dependencies. This allows easy installation of a development copy into an isolated venv (virtual environment), allowing you to break things without breaking anything else on your system (including apps and libraries that use an installed copy of unpythonic).
Install PDM in your Python environment
To develop unpythonic, if your Python environment does not have PDM, you will need to install it first:
python -m pip install pdm
Don't worry; it won't break pip, poetry, or other similar tools.
We will also need a Python for PDM venvs. This Python is independent of the Python that PDM itself runs on. It is the version of Python you would like to use for developing unpythonic.
For example, we can make Python 3.10 available with the command:
Specifying just a version number defaults to CPython (the usual Python implementation). If you want PyPy instead, you can use e.g. [email protected].
Install the isolated venv
Now, we will auto-create the development venv, and install unpythonic's dependencies into it. In a terminal that sees your Python environment, navigate to the unpythonic folder, and issue the command:
This creates the development venv into the .venv hidden subfolder of the unpythonic folder.
If you are a seasoned pythonista, note that there is no requirements.txt; the dependency list lives in pyproject.toml.
Upgrade dependencies (later)
To upgrade dependencies to latest available versions compatible with the specifications in pyproject.toml:
To activate the development venv, in a terminal that sees your Python environment, navigate to the unpythonic folder, and issue the command:

$(pdm venv activate)
Note the Bash command substitution syntax $(...); the command pdm venv activate just prints the actual internal activation command, and the substitution then executes it.
Not working as advertised? Missing a feature? Documentation needs improvement?
While unpythonic is intended as a serious tool for improving productivity as well as for teaching, right now my work priorities mean that it's developed and maintained on whatever time I can spare for it. Thus getting a response may take a while, depending on which project I happen to be working on.
All original code is released under the 2-clause BSD license.
For sources and licenses of fragments originally seen on the internet, see AUTHORS.
Thanks to TUT for letting me teach RAK-19006 in spring term 2018; early versions of parts of this library were originally developed as teaching examples for that course. Thanks to @AgenttiX for early feedback.
Links to blog posts, online articles and papers on topics relevant in the context of unpythonic have been collected to a separate document.
If you like both FP and numerics, we have some examples based on various internet sources.