Version 0.15.0
0.15.0 (22 June 2021) - "We say 'howdy' around these parts" edition:
Besides introducing dialects (a.k.a. whole-module code transforms), this edition concentrates on upgrading our dependencies, namely the macro expander and the Python language itself, to ensure `unpythonic` keeps working for the next few years. This introduces some breaking changes, so we have also taken the opportunity to apply those breaking changes that were previously scheduled.
We have sneaked in some upgrades for other subsystems, too. In particular, `curry`, the multiple-dispatch system (`@generic`), and the integration between the two have been improved significantly.
IMPORTANT:
- Minimum Python language version is now 3.6.
- We support 3.6, 3.7, 3.8, 3.9 and PyPy3 (language versions 3.6 and 3.7).
- For future plans, see our Python language version support status.
- The optional macro expander is now `mcpyrate`.
If you still need `unpythonic` for Python 3.4 or 3.5, use version 0.14.3, which is the final version of `unpythonic` that supports those language versions.
The same applies if you need the macro parts of `unpythonic` (i.e. import anything from `unpythonic.syntax`) in your own project that uses MacroPy. Version 0.14.3 of `unpythonic` works up to Python 3.7.
New:
- Dialects! New module `unpythonic.dialects`, providing some example dialects that demonstrate what can be done with a dialects system (i.e. a full-module code transformer) together with a kitchen-sink language-extension macro package such as `unpythonic`.
- Improved robustness: several auxiliary syntactic constructs now detect at macro expansion time if they appear outside any valid lexical context, and raise `SyntaxError` (with a descriptive message) if so.
  - The full list is:
    - `call_cc[]`, for `with continuations`
    - `it`, for `aif[]`
    - `local[]`/`delete[]`, for `do[]`
    - `q`/`u`/`kw`, for `with prefix`
    - `where`, for `let[body, where(k0=v0, ...)]` (also for `letseq`, `letrec`, `let_syntax`, `abbrev`)
    - `with expr`/`with block`, for `with let_syntax`/`with abbrev`
  - Previously these constructs could only raise an error at run time, and not all of them could detect the error even then.
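
  For example (an illustrative sketch; the misuse is now caught while the module is being macro-expanded, before any of it runs):

  ```python
  from unpythonic.syntax import macros, do, local

  y = do[local[x << 21], 2 * x]   # fine: inside do[], local[] declares a do-local variable; y == 42

  local[x << 21]                  # outside any do[]: SyntaxError at macro expansion time,
                                  # with a descriptive message
  ```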
- Syntactic consistency: allow env-assignment notation and brackets to declare bindings in the `let` family of macros. The preferred syntaxes for the `let` macro are now:

  ```python
  let[x << 42, y << 9001][...]           # lispy expr
  let[[x << 42, y << 9001] in ...]       # haskelly let-in
  let[..., where[x << 42, y << 9001]]    # haskelly let-where
  ```
  If there is just one binding, these become:

  ```python
  let[x << 42][...]
  let[[x << 42] in ...]
  let[..., where[x << 42]]
  ```
  Similarly for `letseq`, `letrec`, and the decorator versions; and for the expr forms of `let_syntax` and `abbrev`. The reason for preferring this notation is that it is consistent with both `unpythonic`'s env-assignments (`let` bindings live in an `env`) and the use of brackets to denote macro invocations. To ease backwards compatibility, we still accept the syntax used up to v0.14.3, too.
  Also, from symmetry and usability viewpoints, if a mix of brackets and parentheses is used, it hardly makes sense to require some specific mix, so the syntax has been extended so that the choice of delimiter doesn't matter. All of the following are also accepted, with exactly the same meaning as above:
  ```python
  let[[x, 42], [y, 9001]][...]            # best visual consistency
  let[(x, 42), (y, 9001)][...]
  let([x, 42], [y, 9001])[...]
  let((x, 42), (y, 9001))[...]            # like up to v0.14.3

  let[[[x, 42], [y, 9001]] in ...]        # best visual consistency
  let[[(x, 42), (y, 9001)] in ...]
  let[([x, 42], [y, 9001]) in ...]
  let[((x, 42), (y, 9001)) in ...]        # like up to v0.14.3
  let[(x << 42, y << 9001) in ...]

  let[..., where[[x, 42], [y, 9001]]]     # best visual consistency
  let[..., where[(x, 42), (y, 9001)]]
  let[..., where([x, 42], [y, 9001])]
  let[..., where((x, 42), (y, 9001))]     # like up to v0.14.3
  let[..., where(x << 42, y << 9001)]
  ```
  For a single binding, these are also accepted:

  ```python
  let[x, 42][...]
  let(x, 42)[...]                # like up to v0.14.3

  let[[x, 42] in ...]
  let[(x, 42) in ...]            # like up to v0.14.3
  let[(x << 42) in ...]

  let[..., where[x, 42]]
  let[..., where(x, 42)]         # like up to v0.14.3
  let[..., where(x << 42)]
  ```
  These alternate syntaxes will be supported at least as long as we accept parentheses to pass macro arguments; but in new code, please use the preferred syntaxes.
- Miscellaneous.
  - `with namedlambda` now understands the walrus operator, too. In the construct `f := lambda ...: ...`, the lambda will get the name `f`. (Python 3.8 and later.)
  - `with namedlambda` now auto-names lambdas that don't have a name candidate, using their source location info, if present. This makes it easy to see in a stack trace where some particular lambda was defined.
  - Multiple-dispatch system `unpythonic.dispatch`:
    - Use consistent terminology:
      - The function that supports multiple call signatures is a *generic function*.
      - Its individual implementations are *multimethods*.
    - Add decorator `@augment`: add a multimethod to a generic function defined elsewhere.
    - Add function `isgeneric` to detect whether a callable has been declared `@generic`.
    - Add function `methods`: display a list of multimethods of a generic function.
    - It is now possible to dispatch on a homogeneous type of contents collected by a `**kwargs` parameter.
    - `curry` now supports `@generic` functions. This feature is experimental; semantics may still change.
    - The utilities `arities`, `required_kwargs`, and `optional_kwargs` now support `@generic` functions. This feature is experimental; semantics may still change.
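
    A sketch of the new dispatch API (illustrative; the exact registration details and the output format of `methods` are as per the library documentation):

    ```python
    from unpythonic.dispatch import generic, augment, isgeneric, methods

    @generic
    def describe(x: int):        # the annotations select which multimethod is called
        return f"integer {x}"

    @augment(describe)           # add another multimethod to `describe`, from elsewhere
    def _(x: str):
        return f"string {x!r}"

    assert isgeneric(describe)   # `describe` has been declared @generic
    assert describe(42) == "integer 42"
    assert describe("foo") == "string 'foo'"
    methods(describe)            # display the list of registered multimethods
    ```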
  - `curry` now errors out immediately on argument type mismatch.
  - Add `partial`, a type-checking wrapper for `functools.partial`, that errors out immediately on argument type mismatch.
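
    A sketch (illustrative; this assumes the check is driven by the target function's type annotations, and that `partial` is exported at the top level like most `unpythonic` utilities):

    ```python
    from unpythonic import partial

    def add(x: int, y: int):
        return x + y

    add2 = partial(add, 2)    # OK, the argument matches the annotation
    assert add2(3) == 5
    partial(add, "oops")      # errors out immediately, at partial-application time
    ```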
  - Add `unpythonic.excutil.reraise_in` (expr form) and `unpythonic.excutil.reraise` (block form): conveniently remap library exception types to application exception types. Idea from Alexis King (2016): *Four months with Haskell*.
  - Add variants of the above for the conditions-and-restarts system: `unpythonic.conditions.resignal_in`, `unpythonic.conditions.resignal`. The new signal is sent using the same error-handling protocol as the original signal, so that e.g. an `error` remains an `error` even if re-signaling changes its type.
  - Add `resolve_bindings_partial`, useful for analyzing partial application.
  - Add `triangular`, to generate the triangular numbers (1, 3, 6, 10, ...).
  - Add `partition_int_triangular` to answer a timeless question concerning stackable plushies.
  - Add `partition_int_custom` to answer unanticipated similar questions.
  - All documentation files now have a quick navigation section to skip to another part of the docs. (For all except the README, it's at the top.)
  - Python 3.8 and 3.9 support added.
Non-breaking changes:
- Changes to how some macros expand.
  - Some macros, notably `letseq`, `do0`, and `lazyrec`, now expand into hygienic macro captures of other macros. The `continuations` macro also outputs a hygienically captured `aif` when transforming an `or` expression that occurs in tail position.
    - This allows `mcpyrate.debug.step_expansion` to show the intermediate result, and brings the implementation closer to the natural explanation of how these macros are defined. (Zen of Python: if the implementation is easy to explain, it may be a good idea.)
    - The implicit do (extra bracket syntax) also expands as a hygienically captured `do`, but e.g. in `let[]` it will then expand immediately (due to `let`'s inside-out expansion order) before control returns to the macro stepper. If you want to see the implicit `do[]` invocation, use the `"detailed"` mode of the stepper, which shows individual macro invocations even when expanding inside-out: `step_expansion["detailed"][...]`, `with step_expansion["detailed"]:`.
  - The `do[]` and `do0[]` macros now expand outside-in. The main differences from a user perspective are:
    - Any source code captures (such as those performed by `test[]`) show the expanded output of `do` and `do0`, because that's what they receive. (For tests, you may want to use the macro `with expand_testing_macros_first`, which see.)
    - `mcpyrate.debug.step_expansion` is able to show the intermediate result after the `do` or `do0` has expanded, but before anything else has been done to the tree.
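
  For example, a sketch of inspecting an expansion with the stepper (run in a macro-enabled module; the `let` expression here is just an illustration):

  ```python
  from mcpyrate.debug import macros, step_expansion
  from unpythonic.syntax import macros, let  # multiple macro-imports are fine in mcpyrate

  with step_expansion["detailed"]:
      # the "detailed" mode shows each macro invocation as it expands;
      # afterwards, the expanded code runs normally
      answer = let[[x << 21] in 2 * x]
  assert answer == 42
  ```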
- Miscellaneous.
  - Resolve issue #61: `curry` now supports kwargs properly.
    - We now analyze parameter bindings like Python itself does, so it should no longer matter whether arguments are passed by position or by name.
    - Positional passthrough works as before. Named passthrough added.
    - Any remaining arguments (that cannot be accepted by the initial call) are passed through to a callable intermediate result (if any), and then outward on the curry context stack as a `Values`. Since `curry` in this role is essentially a function-composition utility, the receiving curried function instance unpacks the `Values` into args and kwargs.
    - If any extra arguments (positional or named) remain when the top-level curry context exits, then by default, `TypeError` is raised. To override, use `with dyn.let(curry_context=["whatever"])`, just like before. Then you'll get a `Values` object.
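
    A sketch of the by-name behavior (illustrative):

    ```python
    from unpythonic import curry

    def f(a, b, c):
        return (a, b, c)

    # arguments may be passed by position or by name; the call triggers once
    # the parameter bindings are complete, just as in a regular Python call
    assert curry(f, 1)(b=2)(c=3) == (1, 2, 3)
    assert curry(f, c=3)(1, 2) == (1, 2, 3)
    ```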
  - The generator instances created by the gfuncs returned by `gmemoize`, `imemoize`, and `fimemoize` now support the `__len__` and `__getitem__` methods, to access the already-yielded, memoized part. Asking for the `len` returns the current length of the memo. For subscripting, both a single `int` index and a slice are accepted. Note that memoized generators do not support all of the `collections.abc.Sequence` API, because e.g. `__contains__` and `__reversed__` are missing, on purpose.
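
    A sketch of the new memo access (illustrative; we consume some items first, since `__getitem__` accesses the already-yielded part):

    ```python
    from itertools import count
    from unpythonic import gmemoize, take

    @gmemoize
    def naturals():
        yield from count(0)    # an infinite generator

    g = naturals()
    for _ in take(6, g):       # yield (and memoize) the first six items, 0..5
        pass
    print(len(g))              # → 6, the current length of the memo
    print(g[5])                # → 5, a single int index into the memoized part
    print(g[2:5])              # slices of the memoized part are accepted, too
    ```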
  - `fup`/`fupdate`/`ShadowedSequence` can now walk the start of a memoized infinite replacement backwards. (Use `imemoize` on the original iterable, instantiate the generator, and use that generator instance as the replacement.)
  - When using the `autoreturn` macro, if the item in tail position is a function definition or class definition, return the thing that was defined.
  - The `nb` macro now works together with `autoreturn`.
  - `unpythonic.conditions.signal`, when the signal goes unhandled, now returns the canonized input `condition`, with a nice traceback attached. This feature is intended for implementing custom error protocols on top of `signal`; `error` already uses it to produce a nice-looking error report.
  - The internal exception types `unpythonic.conditions.InvokeRestart` and `unpythonic.ec.Escape` now inherit from `BaseException`, so that they are not inadvertently caught by `except Exception` handlers.
  - The modules `unpythonic.dispatch` and `unpythonic.typecheck`, which provide the `@generic` and `@typed` decorators and the `isoftype` function, are no longer considered experimental. From this release on, they receive the same semantic versioning guarantees as the rest of `unpythonic`.
  - CI: Automated tests now run on Python 3.6, 3.7, 3.8, 3.9, and PyPy3 (language versions 3.6, 3.7).
  - CI: Test coverage improved to 94%.
  - Full update pass for the user manual written in Markdown.
    - Things added or changed in 0.14.2 and later are still mentioned as such, and have not necessarily been folded into the main text. But everything should be at least up to date now.
Breaking changes:
- New macro expander `mcpyrate`; MacroPy support dropped.
  - API differences.
    - Macro arguments are now passed using brackets: `macroname[args][...]`, `with macroname[args]:`, `@macroname[args]`, instead of parentheses.
      - Parentheses are still available as alternative syntax, because up to Python 3.8, decorators cannot have subscripts (so e.g. `@dlet[(x, 42)]` is a syntax error, but `@dlet((x, 42))` is fine). This has been fixed in Python 3.9.
      - If you already run only on Python 3.9 and later, please use brackets; that is the preferred syntax. We plan to eventually drop support for parentheses for passing macro arguments, once Python 3.9 becomes the minimum supported language version for `unpythonic`.
    - If you write your own macros, note that `mcpyrate` is not drop-in compatible with MacroPy or `mcpy`. See its documentation for details.
  - Behavior differences.
    - `mcpyrate` should report test coverage for macro-using code correctly; there is no need for `# pragma: no cover` in block macro invocations or in quasiquoted code.
- Previously scheduled API changes.
  - As promised, names deprecated during 0.14.x have been removed. Old name on the left, new name on the right:
    - `m` → `imathify` (consistency with the rest of `unpythonic`)
    - `mg` → `gmathify` (consistency with the rest of `unpythonic`)
    - `setescape` → `catch` (Lisp family standard name)
    - `escape` → `throw` (Lisp family standard name)
    - `getvalue`, `runpipe` → `exitpipe` (combined into one)
      - CAUTION: `exitpipe` already existed in v0.14.3, but beginning with v0.15.0, it is now an `unpythonic.symbol.sym` (like a Lisp symbol). This is not compatible with existing, pickled `exitpipe` instances; it used to be an instance of the class `Getvalue`, which has been removed. (There's not much reason to pickle an `exitpipe` instance, but we're mentioning this for the sake of completeness.)
  - Drop support for deprecated argument format for `raisef`. Now the usage is `raisef(exc)` or `raisef(exc, cause=...)`. These correspond exactly to `raise exc` and `raise exc from ...`, respectively.
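
    A sketch of the remaining call forms (illustrative):

    ```python
    from unpythonic import raisef

    # raisef is an expression, so it fits where a raise statement cannot, e.g. in a lambda
    validate = lambda x: x if x >= 0 else raisef(ValueError(f"expected non-negative, got {x}"))

    assert validate(42) == 42
    try:
        validate(-1)
    except ValueError as err:    # raisef(exc, cause=other) would be `raise exc from other`
        print("caught:", err)
    ```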
- Other backward-incompatible API changes.
  - Multiple-return-value handling changed. Resolves issue #32.
    - Multiple return values are now denoted as `Values`, available from the top-level namespace of `unpythonic`.
    - The `Values` constructor accepts both positional and named arguments. Passing in named arguments creates named return values. This completes the symmetry between argument passing and returns.
    - Most of the time, it's still fine to return a tuple and destructure that; but in contexts where it is important to distinguish between a single `tuple` return value and multiple return values, it is preferable to use `Values`.
    - In any utilities that deal with function composition, if your intent is multiple return values, it is now mandatory to return a `Values` instead of a `tuple`. This affects:
      - `curry`
      - the `pipe` family
      - the `compose` family
      - `unfold`
      - `iterate`
      - all multiple-return-values in code using the `with continuations` macro. (The continuations system essentially composes continuation functions.)
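
    A sketch of `Values` in a composition chain (illustrative; this assumes `composel` applies its arguments left to right):

    ```python
    from unpythonic import Values, composel

    def step1(x):
        return Values(x, x + 1)    # multiple return values; a bare tuple would be ONE value

    def step2(a, b):
        return a * b               # the composition unpacks the Values into two arguments

    f = composel(step1, step2)
    assert f(3) == 12

    Values(1, 2, key="meta")       # named return values are created by named constructor arguments
    ```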
  - The lazy evaluation tools `lazy`, `Lazy`, and the quick lambda `f` (underscore notation for Python) are now provided by `unpythonic`, as `unpythonic.syntax.lazy`, `unpythonic.lazyutil.Lazy`, and `unpythonic.syntax.fn` (note the name change!), because they used to be provided by `macropy`, and `mcpyrate` does not provide them.
    - API differences.
      - The quick lambda is now named `fn[]` instead of `f[]` (as in MacroPy). This was changed because `f` is often used as a function name in code examples, local temporaries, and similar. Also, `fn[]` is a less ambiguous abbreviation for a syntactic construct that means function, while remaining shorter than the equivalent `lambda`. Compare `fn[_ * 2]` and `lambda x: x * 2`, or `fn[_ * _]` and `lambda x, y: x * y`.
        - Note that in `mcpyrate`, macros can be as-imported, so this change affects just the default name of `fn[]`. But that is exactly what is important: have a sensible default name, to remove the need to as-import so often.
      - The macros `lazy` and `fn` can be imported from the syntax interface module, `unpythonic.syntax`, and the class `Lazy` is available at the top level of `unpythonic`.
      - Unlike `macropy`'s `Lazy`, our `Lazy` does not define `__call__`; instead, it defines the method `force`, which has the same effect (it computes if necessary, and then returns the value of the promise). You can also use the function `unpythonic.force`, which has the extra advantage that it passes through a non-promise input unchanged (so you don't need to care whether `x` is a promise before calling `force(x)`; this is sometimes useful).
      - When you import the macro `quicklambda`, you must also import the macro `fn`.
      - The underscore `_` is no longer a macro on its own. The `fn` macro treats the underscore magically, as before, but anywhere else it is available to be used as a regular variable.
    - Behavior differences.
      - `fn[]` now respects nesting: an invocation of `fn[]` will not descend into another nested `fn[]`.
      - The `with quicklambda` macro is still provided, and used just as before. Now it causes any `fn[]` invocations lexically inside the block to expand before any other macros in that block do.
      - Since in `mcpyrate` macros can be as-imported, you can rename `fn` at import time to have any name you want. The `quicklambda` block macro respects the as-import, by internally querying the expander to determine the name(s) the macro `fn` is currently bound to.
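
    A sketch of the renamed quick lambda (illustrative):

    ```python
    from unpythonic.syntax import macros, fn

    double = fn[_ * 2]    # like lambda x: x * 2; the underscore is magic only inside fn[]
    mul = fn[_ * _]       # like lambda x, y: x * y
    assert double(21) == 42
    assert mul(6, 7) == 42
    ```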
  - For the benefit of code using the `with lazify` macro, laziness is now better respected by the `compose` family, `andf`, and `orf`. The utilities themselves are marked lazy, and arguments will be forced only when a lazy function in the chain actually uses them, or when an eager (not lazy) function is encountered in the chain.
  - Rename the `curry` macro to `autocurry`, to prevent name shadowing of the `curry` function. The new name is also more descriptive.
  - Move the functions `force1` and `force` from `unpythonic.syntax` to `unpythonic`. Make the `Lazy` class (promise implementation) public. (They actually come from `unpythonic.lazyutil`.)
  - Change parameter ordering of `unpythonic.it.window` to make it curry-friendly. Usage is now `window(n, iterable)`.
    - This was an oversight when this function was added; most other functions in `unpythonic.it` have been curry-friendly from the beginning.
  - Change output format of `resolve_bindings` to return an `inspect.BoundArguments` instead of the previous `OrderedDict` that had a custom format. Change the input format of `tuplify_bindings` to match.
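
    A sketch of the new output format (illustrative; the call shape `resolve_bindings(f, *args, **kwargs)` and the top-level import are assumptions here):

    ```python
    import inspect
    from unpythonic import resolve_bindings

    def f(a, b, c=3):
        pass

    ba = resolve_bindings(f, 1, b=2)
    assert isinstance(ba, inspect.BoundArguments)
    print(ba.arguments)    # the standard BoundArguments view of the parameter bindings
    ```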
  - Change parameter name from `l` to `length` in the functions `in_slice` and `index_in_slice` (in the `unpythonic.collections` module).
    - These are mostly used internally, but technically a part of the public API.
    - This change fixes a `flake8` E741 warning, and the new name for the parameter is more descriptive.
- Miscellaneous.
  - Robustness: the `with continuations` macro now raises `SyntaxError` if async constructs (`async def` or `await`) appear lexically inside the block, because interaction of `with continuations` with Python's async subsystem has never been implemented. See issue #4.
  - The functions `raisef`, `tryf`, `equip_with_traceback`, and `async_raise` now live in `unpythonic.excutil`. They are still available in the top-level namespace of `unpythonic`, as usual.
  - The functions `call` and `callwith` now live in `unpythonic.funutil`. They are still available in the top-level namespace of `unpythonic`, as usual.
  - The functions `almosteq`, `fixpoint`, `partition_int`, and `ulp` now live in `unpythonic.numutil`. They are still available in the top-level namespace of `unpythonic`, as usual.
  - Remove the internal utility class `unpythonic.syntax.util.ASTMarker`. We now have `mcpyrate.markers.ASTMarker`, which is designed for data-driven communication between macros that work together. As a bonus, no markers are left in the AST at run time.
  - Rename contribution guidelines to `CONTRIBUTING.md`, which is the modern standard name. The old name was `HACKING.md`, which was correct, but nowadays obscure.
  - Python 3.4 and 3.5 support dropped, as these language versions have officially reached end-of-life.
Fixed:
- Make `unpythonic.misc.callsite_filename` ignore our call helpers. This allows the testing framework to report the source code filename correctly when testing code using macros that make use of these helpers (e.g. `autocurry`, `lazify`).
- In `aif`, `it` is now only valid in the `then` and `otherwise` parts, as it should always have been.
- Fix docstring of `test`: multiple `the[]` marks were already supported in 0.14.3, as the macro documentation already said, but the docstring claimed otherwise.
- Fix bug in `with namedlambda`. Due to incorrect function arguments in the analyzer, already-named lambdas were not detected correctly.
- Fix bug: `fup`/`fupdate`/`ShadowedSequence` now actually accept an infinite-length iterable as a replacement sequence (under the obvious usage limitations), as the documentation has always claimed.
- Fix bug: `memoize` is now thread-safe, even when the same memoized function instance is called concurrently from multiple threads. Exactly one thread will compute the result. If `f` is recursive, the thread that acquired the lock is the one that is allowed to recurse into the memoized `f`.
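
  A sketch of the guarantee (illustrative):

  ```python
  import threading
  from unpythonic import memoize

  body_runs = []

  @memoize
  def expensive(x):
      body_runs.append(x)    # record how many times the body actually runs
      return x * x

  threads = [threading.Thread(target=expensive, args=(21,)) for _ in range(8)]
  for t in threads:
      t.start()
  for t in threads:
      t.join()

  assert expensive(21) == 441
  assert len(body_runs) == 1   # exactly one thread computed the result
  ```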