from Hacker News

PEP 810 – Explicit lazy imports

by azhenley on 10/3/25, 6:24 PM with 240 comments

  • by simonw on 10/3/25, 7:54 PM

    Love this. My https://llm.datasette.io/ CLI tool supports plugins, and people were complaining about really slow start times even for commands like "llm --help" - it turned out there were popular plugins that did things like import pytorch at the base level, so the entire startup was blocked on heavy imports.

    I ended up adding a note to the plugin author docs suggesting lazy loading inside of functions - https://llm.datasette.io/en/stable/plugins/advanced-model-pl... - but having a core Python language feature for this would be really nice.
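    The pattern those plugin docs recommend can be sketched roughly like this (a toy sketch only; the class and method names are made up, and `math` stands in for a heavy dependency such as torch):

```python
class MyModel:
    """Hypothetical plugin model; names here are made up, and `math`
    stands in for a heavy dependency such as torch."""

    def execute(self, prompt):
        # The heavy import happens on first use, not at plugin discovery
        # time, so a bare `llm --help` never pays for it.
        import math
        return math.sqrt(len(prompt))

model = MyModel()
print(model.execute("hello"))
```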

  • by charliermarsh on 10/3/25, 7:35 PM

    Lazy imports have been proposed before, and were rejected most recently back in 2022: https://discuss.python.org/t/pep-690-lazy-imports-again/1966.... If I recall correctly, lazy imports are a feature supported in Cinder, Meta's version of CPython, and the PEP was driven by folks that worked on Cinder. Last time, a lot of the discussion centered around questions like: Should this be opt-in or opt-out? At what level? Should it be a build-flag for CPython itself? Etc. The linked post suggests that the Steering Council ultimately rejected it because of the complexity it would introduce to have two divergent "modes" of importing.

    I hope this proposal succeeds. I would love to use this feature.

  • by comex on 10/3/25, 8:53 PM

    I don't hate it but I don't love it. It sounds like everyone will start writing `lazy` before essentially every single import, with rare exceptions where eager importing is actually needed. That makes Python code visually noisier. And with no plan to ever change the default, the noise will stay forever.

    I would have preferred a system where modules opt in to being lazy-loaded, with no extra syntax on the import side. That would simplify things since only large libraries would have to care about laziness. To be fair, in such a design, the interpreter would have to eagerly look up imports on the filesystem to decide whether they should be lazy-loaded. And there are probably other downsides I'm not thinking of.

  • by wrmsr on 10/3/25, 7:23 PM

    If anyone's interested, I've implemented a fairly user-friendly lazy import mechanism in the form of context managers (auto_proxy_import/init) at https://pypi.org/project/lazyimp/ that I use fairly heavily. Syntactically it's just wrapping otherwise unmodified import statements in a with block, so tools 'just work' and it can be easily disabled or told to import eagerly for debugging. It's powered primarily by swapping out the frame's f_builtins in a C extension (as it needs more power than importlib hooks provide), but it also has a lame attempt at a thread-safe pure-Python version, and a super dumb global-hook version.

    I was skeptical and cautious with it at first but I've since moved large chunks of my codebase to it - it's caused surprisingly few problems (honestly none besides forgetting to handle some import-time registration in some modules) and the speed boost is addictive.

  • by Spivak on 10/3/25, 7:46 PM

    I love the feature, but I really dislike using the word lazy as a new language keyword. It just feels off somehow. I think defer might be a better word. It at least keeps the grammar right, since grammatically the adverb would have to be lazily.

        lazily import package.foo
        vs
        defer import package.foo
    
    Also the grammar is super weird for from imports.

        lazy from package import foo
        vs.
        from package defer import foo

  • by thayne on 10/3/25, 9:56 PM

    One thing the PEP doesn't really talk about, and that I find very annoying, is that many Python linters complain if you don't put all of your imports at the top of the file, so you get lint warnings if you implement lazy imports in the most obvious way.

    And that is actually a problem for more than just performance. In some cases, importing at the top might simply fail: for example, when you need a platform-specific library that only exists on that platform.
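    A common workaround today is a guarded top-level import. A hedged sketch (winreg is a real Windows-only stdlib module; the helper function is made up):

```python
import sys

# Guarded top-level import: winreg exists only on Windows, so importing
# it unconditionally would fail everywhere else.
if sys.platform == "win32":
    import winreg
else:
    winreg = None  # callers must check before use

def read_registry_value(key):  # made-up helper for illustration
    if winreg is None:
        raise RuntimeError("registry access requires Windows")
    ...
```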

  • by nilslindemann on 10/3/25, 10:56 PM

    This is the wrong syntax, comparable to how "u" strings were the wrong syntax and "b" strings are the right syntax.

    They make what should be the default into a special case. Soon, all new code will use "lazy". The long-term effect of such changes is a verbose language syntax.

    They should have had a transition period during which anyone who wants lazy imports writes "from __future__ import lazy_import". After that period, lazy imports become the default, and a new syntax covers old-style immediate imports: "import now foo" and "from foo import now bar".

    All that authors of old code would have to do is run a provided fixer script in the root directory of their code.

  • by zahlman on 10/3/25, 8:34 PM

    > The standard library provides the LazyLoader class to solve some of these inefficiency problems. It permits imports at the module level to work mostly like inline imports do.

    The use of these sorts of Python import internals is highly non-obvious. The Stack Overflow Q&A I found about it (https://stackoverflow.com/questions/42703908/) doesn't result in an especially nice-looking UX.

    So here's a proof of concept in existing Python for getting all imports to be lazy automatically, with no special syntax for the caller:

      import sys
      import threading # needed for python 3.13, at least at the REPL, because reasons
      from importlib.util import LazyLoader # this has to be eagerly imported!
      class LazyPathFinder(sys.meta_path[-1]): # <class '_frozen_importlib_external.PathFinder'>
          @classmethod
          def find_spec(cls, fullname, path=None, target=None):
              base = super().find_spec(fullname, path, target)
              if base is None or base.loader is None: # not found, or a namespace package
                  return base
              base.loader = LazyLoader(base.loader)
              return base
      sys.meta_path[-1] = LazyPathFinder
    
    We've replaced the "meta path finder" (which implements the logic "when the module isn't in sys.modules, look on sys.path for source code and/or bytecode, including bytecode in __pycache__ subfolders, and create a 'spec' for it") with our own wrapper. The "loader" attached to the resulting spec is replaced with an importlib.util.LazyLoader instance, which wraps the base PathFinder's provided loader. When an import statement actually imports the module, the name will actually get bound to a <class 'importlib.util._LazyModule'> instance, rather than an ordinary module. Attempting to access any attribute of this instance will trigger the normal module loading procedure — which even replaces the global name.

    Now we can do:

      import this # nothing shows up
      print(type(this)) # <class 'importlib.util._LazyModule'>
      rot13 = this.s # the module is loaded, printing the Zen
      print(type(this)) # <class 'module'>
    
    That said, I don't know what the PEP means by "mostly" here.
  • by TheMrZZ on 10/3/25, 7:05 PM

    Feels like a good feature, with a simple explanation, real world use cases, and a scoped solution (global only, pretty simple keyword). I like it!
  • by bcoates on 10/4/25, 12:59 AM

    I think they're understating the thread safety risks here. The import is going to wind up happening at a random nondeterministic time, in who knows what thread holding who knows what locks (aside from the importer lock).

    Previously, if you had some thread-hazardous code at module import time, it was highly likely to run only during the single-threaded process startup phase, so it was likely harmless. Lazy loading is going to unearth these errors in the most inconvenient way (as Heisenbugs).

    (Function level import can trigger this as well, but the top of a function is at least a slightly more deterministic place for imports to happen, and an explicit line of syntax triggering the import/bug)

  • by aftbit on 10/3/25, 7:57 PM

    We tend to prefer explicit top-level imports specifically because they reveal dependency problems as soon as the program starts, rather than potentially hours or days later when a specific code path is executed.
  • by f311a on 10/3/25, 7:43 PM

    I don't like the idea of introducing a new keyword; we need a backward-compatible solution. I feel like Python needs some kind of universal annotation syntax, such as Go's comment directives or Rust's attributes. A new keyword means all parsers, LSPs, and editors need to be updated.

    I’m pretty sure there will be new keywords in Python in the future that only solve one thing.

  • by jacquesm on 10/4/25, 12:37 AM

    Lazy imports are a great way to create runtime errors far into the operation of a long lived service. Yes, it gives the superficial benefit of 'fast startup', but that upside is negated by the downside of not being sure that once something runs it will run to completion due to a failed import much further down the line. It also allows for some interesting edge cases with the items that are going to be imported no longer being what is on the tin at the time the program is started.
  • by film42 on 10/3/25, 7:23 PM

    I'm a fan because it's something you can explicitly turn on and off. For my Docker based app, I really want to verify the completeness of imports. Preferably, at build and test time. In fact, most of the time I will likely disable lazy loading outright. But, I would really appreciate a faster loading CLI tool.

    However, there is a pattern in python to raise an error if, say, pandas doesn't have an excel library installed, which is fine. In the future, will maintainers opt to include a bunch of unused libraries since they won't negatively impact startup time? (Think pandas including 3-4 excel parsers by default, since it will only be loaded when called). It's a much better UX, but, now if you opt out of lazy loading, your code will take longer to load than without it.
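    The raise-on-use pattern mentioned here usually looks something like this sketch (openpyxl used as a hypothetical optional backend; the function name is made up):

```python
# Raise-on-use pattern for an optional dependency: the import is attempted
# once at the top, but failure is only reported when the feature is used.
try:
    import openpyxl  # hypothetical optional Excel backend
except ImportError:
    openpyxl = None

def read_excel(path):  # made-up function name for illustration
    if openpyxl is None:
        raise ImportError("Excel support requires openpyxl")
    return openpyxl.load_workbook(path)
```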

  • by jmward01 on 10/3/25, 8:19 PM

    This is needed, but I don't like new keywords. What I would love, for many reasons, is if we could decorate statements. Then:

        import expensive_module

    could be:

        @lazy
        import expensive_module

    or you could do:

        @retry(3)
        x = failure_prone_call(y)

    lazy is needed, but maybe there is a more basic change that could give more power with more organic syntax, rather than creating a new special-purpose keyword (and extending an already special-purpose statement).
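    Statement decorators don't exist in Python, but the nearest existing analogue is an ordinary function decorator. A minimal sketch of the hypothetical @retry(3) idea, applied to a function instead of a statement:

```python
import functools

def retry(times):
    """Plain function decorator: the closest thing today to the
    hypothetical statement-level @retry(3) above."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            for attempt in range(times):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if attempt == times - 1:
                        raise  # out of retries: propagate the last error
        return inner
    return wrap

calls = []

@retry(3)
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise ValueError("transient failure")
    return "ok"

print(flaky())  # succeeds on the third attempt
```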

  • by ajb on 10/4/25, 7:22 AM

    Given all the problems people are mentioning, it seems like this proposal is on the wrong side. There should be an easy way for a module to declare itself to be lazy loaded. The module author, not the user, is the one who knows whether lazy loading will break stuff.
  • by pjjpo on 10/4/25, 9:31 AM

    I wonder if this proposal suffers because of Python's extremely generous support period, and perhaps the ship has sailed.

    - lazy imports are a hugely impactful feature

    - lazy imports are already possible without syntax

    This means any library that would benefit greatly from lazy imports already uses import statements inside functions. Such libraries can't really use the new feature, since 3.14's EoL is _2030_, forever from now. The __lazy_modules__ syntax preserves only compatibility (older versions just import eagerly), not performance, so libraries that need lazy imports can't rely on it until 2030.

    This means that for a long time the primary target is CLI authors, who can pin a stricter Python target (and are mentioned many times in the PEP), plus libraries without broad Python version support (meaning not just works, but works well), which are probably not major libraries.

    Unless the feature gets backported two or more versions, it feels not so compelling. But given that it modifies the interpreter to a considerable degree, I wonder if any backport is even on the table.

  • by Boxxed on 10/3/25, 8:27 PM

    Ugh...I like the idea, but I wish lazy imports were the default. Python allows side effects in the top level though so that would be a breaking change.

    Soooo instead now we're going to be in a situation where you're going to be writing "lazy import ..." 99% of the time: unless you're a barbarian, you basically never have side effects at the top level.

  • by f33d5173 on 10/4/25, 7:06 PM

    It would be interesting if instead you added a syntax whereby a module could declare that it supported lazy importing. Maybe even after running some code with side effects that couldn't be done lazily. For one thing, this would have a much broader performance impact, since it would benefit all users of the library, not just those who explicitly tagged their imports as lazy. For another, it would minimize breakage, since a module author knows best whether, and which parts of, their module can be lazily loaded.

    On the other hand, it would create confusion for users of a library when the performance hit of importing a library was delayed to the site of usage. They might not expect, for example, a lag to occur there. I don't think it would cause outright breakage, but people might not like the way it behaved.

  • by the_mitsuhiko on 10/3/25, 7:48 PM

    I like the approach of ES6 where you pull in bindings that are generally lazily resolved. That is IMO the approach that should be the general strategy for Python.
  • by forrestthewoods on 10/3/25, 7:36 PM

    I don’t want lazy imports. That’s just makes performance shitty later and harder to debug. It’s a hacky workaround.

    What I want is for imports to not suck and be slow. I’ve had projects where it was faster to compile and run C++ than launch and start a Python CLI. It’s so bad.

  • by dataangel on 10/3/25, 11:02 PM

    This would be a huge deal for Python startup time *if* it was applied to all the standard library packages recursively. Right now importing asyncio brings in half the standard library through transitive imports.
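    The transitive cost is easy to observe in-process by diffing sys.modules (counts vary by CPython version; `python -X importtime` gives the full per-module breakdown):

```python
import sys

# Diff sys.modules around a single import to see the transitive pull-in.
before = set(sys.modules)
import asyncio
pulled_in = sorted(set(sys.modules) - before)
print(len(pulled_in), "modules imported transitively")
```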
  • by Too on 10/4/25, 6:46 AM

    This is a great compromise given how much would break if it were the default. Making it the default would be better in the long run, and would also force taming the abuse of side effects in imports: win-win.

    If one could dream, modules should have to explicitly declare whether they have side-effects or not, with a marker at the top of the module. Not declaring this and trying anything except declaring a function or class should lead to type-checker errors. Modules declaring this pure-marker then automatically become lazy. Others should require explicit "import_with_side_effects" keyword.

        __pure__ = True
        import logging 
        import threading
      
        app = Flask()          # << ImpureError
        sys.path.append()      # << ImpureError
        with open(.. )         # << ImpureError
        logging.basicConfig()  # << ImpureError
        if foo:                # << ImpureError  (arguable)
       
        @app.route("/foo")     # Questionable
        def serve():           # OK
            ...
    
        serve()                # << ImpureError
        t = threading.Thread(target=serve) # << ImpureError
    
    
    All of this would be impossible today, given how much Python relies on metaprogramming. Even the standard library exposes functions that create classes on the fly, like Enum and dataclasses, which are difficult to classify as either pure or impure. With more and more of the ecosystem embracing typed Python, this metaprogramming is shrinking to a less dynamic subset. Type checkers and LSPs must have at least some awareness of these modules without executing them as plain Python code.
  • by rplnt on 10/3/25, 8:11 PM

    Remember Mercurial? Me neither. But what I do remember is an article I read about all the hacks they had to do to achieve reasonable startup time for a CLI in Python. And the #1 cause was loading a whole world of code you don't ever need. As I recall, they monkeypatched the import machinery to ignore imports and just remember their existence until a module was actually needed, at which point the import happened. So all the dead paths were simply skipped.
  • by wmichelin on 10/4/25, 9:37 PM

    This seems unnecessary if we can run imports inside functions, right? It feels like a layer of indirection and a potential source of confusion. Rather than implicitly importing a library when a variable is first used, why not just do it explicitly?

    Edit

    > A somewhat common way to delay imports is to move the imports into functions (inline imports), but this practice requires more work to implement and maintain, and can be subverted by a single inadvertent top-level import. Additionally, it obfuscates the full set of dependencies for a module. Analysis of the Python standard library shows that approximately 17% of all imports outside tests (nearly 3500 total imports across 730 files) are already placed inside functions or methods specifically to defer their execution. This demonstrates that developers are already manually implementing lazy imports in performance-sensitive code, but doing so requires scattering imports throughout the codebase and makes the full dependency graph harder to understand at a glance.

    I think this is a really weak foundation for this language feature. We don't need it.

  • by croemer on 10/4/25, 9:29 PM

    The link target should be https://peps.python.org/pep-0810/ not a preview build for a PR. @dang
  • by sedatk on 10/3/25, 7:06 PM

    I wonder how much things would break if all imports were lazy by default.
  • by romanows on 10/3/25, 11:47 PM

    Kinda related, I wish there was an easy way to exclude dependencies at pip-install time and mock them at runtime so an import doesn't cause an exception. Basically a way for me to approximate "extras" when the author isn't motivated to do it for me, even though it'd be super brittle.
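    The brittle approximation described here can be done today by pre-seeding sys.modules with a mock (heavy_missing_dep is a made-up package name):

```python
import sys
from unittest import mock

# Pre-seed sys.modules so importing a package you chose not to install
# succeeds anyway; heavy_missing_dep is a made-up name.
sys.modules["heavy_missing_dep"] = mock.MagicMock()

import heavy_missing_dep                    # no ImportError
result = heavy_missing_dep.train(epochs=3)  # just returns another MagicMock
print(type(result).__name__)
```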
  • by Galanwe on 10/3/25, 7:10 PM

    That seems really great. Moving imports inline for CLI tool startup and test discovery has always been a pain.
  • by throwaway81523 on 10/4/25, 12:27 AM

    We need some kind of unexec revival in Linux so we don't have to resort to crap like this. Maybe CRIU based. Worst case, some Python specific hack. But Python's import system is wacky enough without lazy loading. This sounds like combining the worst of multiple worlds.
  • by jeremyscanvic on 10/3/25, 7:44 PM

    Really excited about this - we've recently been struggling with making imports lazy without completely messing up the code in DeepInverse https://deepinv.github.io/deepinv/
  • by thijsvandien on 10/3/25, 10:36 PM

    Very unfortunate how "from json lazy import dumps" would result in backward compatibility issues. It reads much better and makes it easier to search for lazy imports, especially if in the future something else becomes optionally lazy as well.
  • by hoten on 10/3/25, 7:12 PM

    For posterity, as the submitted link seems temporary: https://github.com/python/peps/pull/4622
  • by gorgoiler on 10/3/25, 9:33 PM

    I don’t really agree with the premise:

    > The dominant convention in Python code is to place all imports at the beginning of the file. This avoids repetition, makes import dependencies clear and minimizes runtime overhead.

    > A somewhat common way to delay imports is to move the imports into functions, but this practice requires more work [and] obfuscates the full set of dependencies for a module.

    The first part is just saying the traditions exist because the traditions have always existed. Traditions are allowed to change!

    The second part is basically saying if you do your own logic-based lazy imports (inline imports in functions) then you’re going against the traditions. Again, traditions are allowed to change!

    The point about the import graph being obfuscated would ring more true if Python didn’t already provide lightning fast static analysis tools like ast. If you care about import graphs at the module level then you’re probably already automating everything with ast anyway, at which point you just walk the whole tree looking for imports rather than the top level.
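    Walking the whole tree with ast does pick up inline imports; a minimal sketch:

```python
import ast

source = """
import os

def f():
    import json
    from collections import OrderedDict
"""

# ast.walk visits nested nodes, so inline imports inside functions are
# found just as easily as top-level ones.
found = []
for node in ast.walk(ast.parse(source)):
    if isinstance(node, ast.Import):
        found.extend(alias.name for alias in node.names)
    elif isinstance(node, ast.ImportFrom):
        found.append(node.module)

print(sorted(found))
```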

    So, really, the whole argument for a new lazy keyword (instead of inlining giant imports where they are needed) is because people like to see import pytorch at the top of the file, and baulk at seeing it — and will refuse to even look for it! — anywhere else? Hmmm.

    What does seem like a pain in the ass is having to do this kind of repetitive crap (which they mention in their intro):

      def f1():
        import pytorch
        …
    
      def f2():
        import pytorch
        …
    
      def f3():
        import pytorch
        …
    
    But perhaps the solution is a pattern where you put all your stuff like that in your own module and it’s that module which is lazy loaded instead?
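    One way to build such a lazily loaded wrapper module today is PEP 562's module-level __getattr__. A sketch that constructs the wrapper in memory (a real project would put the __getattr__ in its own .py file; `math` stands in for the heavy library, and `heavy` is a made-up module name):

```python
import importlib
import sys
import types

# Build the wrapper module in memory for the demo.
wrapper = types.ModuleType("heavy")

def _lazy_getattr(name):
    module = importlib.import_module("math")  # real import on first access
    return getattr(module, name)

wrapper.__getattr__ = _lazy_getattr  # PEP 562 module-level __getattr__
sys.modules["heavy"] = wrapper

import heavy
print(heavy.sqrt(9.0))  # math is imported only when sqrt is first touched
```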
  • by thatxliner on 10/5/25, 12:53 AM

    I was looking at PEP 690 and I saw

    > A world in which Python only supported imports behaving in a lazy manner would likely be great...we do not envision the Python language transitioning to a world where lazy imports are the default...this concept would add complexity to our ecosystem.

    Why can't lazy be the default and instead propose an `eager` syntax? The only argument I can imagine is that there's some API that runs a side effect based on importing, but perhaps making it eager for modules with side effects would be a sufficient temporary fix?

  • by the__alchemist on 10/3/25, 7:25 PM

    Does this fix the circular-import problem that comes up if you don't structure your programs in a hierarchical way?
  • by nodesocket on 10/3/25, 9:42 PM

    Won’t this have a very noticeable performance hit on the first request? Thinking web frameworks like Flask and Django.
  • by d_burfoot on 10/4/25, 12:48 AM

    Looks good to me. I use a tab-completion trick where the tab-completer tool calls the script I'm about to invoke with special arguments, and the script reflects on itself and responds with possible completions. But because of slow imports, it often takes a while for the completion to respond.

    I could, and sometimes do, go through all the imports to figure out which ones are taking a long time to load, but it's a chore.
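    CPython's `-X importtime` flag prints a per-import timing tree, which automates exactly that chore. A rough in-process alternative sketch (first imports only, since cached modules return instantly):

```python
import importlib
import time

# Rough per-module timing of first imports. Run in a fresh interpreter
# for meaningful numbers; `python -X importtime <script>` gives the
# full per-import breakdown, including transitive imports.
for name in ["json", "email", "unicodedata"]:
    start = time.perf_counter()
    importlib.import_module(name)
    print(f"{name}: {(time.perf_counter() - start) * 1000:.2f} ms")
```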

  • by mekoka on 10/3/25, 10:46 PM

    Crossing my fingers that it goes through this time. Been in the top 3 of my Python wishlist for nearly a decade.
  • by shiandow on 10/4/25, 9:35 AM

    I don't quite understand why they forbid this from being used in functions.

    I mean, I get why that makes the most sense in most scenarios, but it's not as if the tradeoff between declaring dependencies up front and deferring expensive imports until needed doesn't also arise inside functions.

    Take, for instance, a function that fails fast because its arguments are incorrect: it might trigger a whole bunch of imports that only make sense for that function but are immediately rendered useless.

    It feels like it is forbidden just because someone thought it wasn't a good coding style but to me there is no obvious reason it couldn't work.

  • by Hendrikto on 10/4/25, 11:20 AM

    Python imports are already way too complicated. I don’t think even more syntax should be added.
  • by waynesonfire on 10/3/25, 8:20 PM

    why does this have to be a syntax feature and not a lazy code loader at the interpreter level?
  • by hhthrowaway1230 on 10/4/25, 4:04 PM

    doesn't this make the language a little unpredictable in terms of loading times, requiring you to exercise all code paths to fully load the app?
  • by behnamoh on 10/3/25, 7:34 PM

    Make Python more like Haskell, LFG!
  • by keeganpoppen on 10/3/25, 7:30 PM

    YES!!!
  • by enriquto on 10/3/25, 9:29 PM

    what is the point of this? you can just import inside function definitions:

        def antislash(A, b):
            from numpy.linalg import solve
            return solve(A, b)
    
    thus numpy.linalg is only imported the first time you call the antislash function. Much cleaner than a global import.

    Ignore wrong traditions. Put all imports in the innermost scopes of your code!

  • by Alir3z4 on 10/3/25, 7:40 PM

    I wish all imports were lazy by default.

    I know/heard there are "some" libraries (which I haven't seen, by the way) that depend on import side effects, but the advantage is much bigger.

    First of all, the circular import problem would go away, especially for type hints. Although there was a PEP or recent addition to make annotations not cause such issues.
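    The addition being half-remembered here is most likely PEP 563 (`from __future__ import annotations`), usually paired with a typing.TYPE_CHECKING guard. A sketch, with decimal standing in for the circularly imported module:

```python
from __future__ import annotations  # PEP 563: annotations are not evaluated

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Seen only by type checkers, never executed at runtime, so a
    # circular import would be avoided; decimal stands in for the
    # problematic module here.
    from decimal import Decimal

def total(x: Decimal) -> Decimal:
    return x + x

print(total(3))  # annotations stay strings, so Decimal is never imported
```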

    Second and most important of all, is the launch time of Python applications. A CLI that uses many different parts of the app has to wait for all the imports to be done.

    The second point becomes a lot more painful when you have a large application, like a Django project where auto reload takes several seconds. Not only does auto reload crawl, the testing cycle is slow as well: every time you want to run the test command, it has to wait several seconds. Painful.

    So far the solution has been to do the lazy import by importing inside the methods where it's required. That is something I never got to like, to be honest.

    Maybe it will be fixed in Python 4, where the JIT uses the type hints as well /s

  • by IshKebab on 10/3/25, 7:13 PM

    Wake me up when we can import a module by relative file path.