
Conversation

@rolfmorel
Contributor

Simple wrapper around upstream's transform_interpreter API. Invokable directly from Python and via the commandline on separate payload and schedule files.


if __name__ == "__main__":
    ArgParser = argparse.ArgumentParser()
    ArgParser.add_argument("schedule")
Member

Please add arg docs for --help.
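For illustration, the requested descriptions could look roughly like the sketch below; the description strings and the parse_args() call are my own assumptions, not the PR's code.

import argparse

# Hypothetical sketch of the requested --help docs; wording is illustrative only.
if __name__ == "__main__":
    ArgParser = argparse.ArgumentParser(
        description="Apply a transform-dialect schedule to an MLIR payload."
    )
    ArgParser.add_argument(
        "schedule", help="Path to the .mlir file containing the transform schedule."
    )
    args = ArgParser.parse_args()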


from mlir import ir

from .. import transform as lh_transform
Member

Why not just from .. import transform?

Contributor Author

I did it this way as transform is (or will be) a very common module name for the bindings to the Transform dialect.

If we are okay with transform being used here for the more general concept of "lowering & optimization", I will turn it into just transform.

Contributor

Btw, may we use an absolute import here? Just to make it clearer what's being imported.

Suggested change:
- from .. import transform as lh_transform
+ import lighthouse.transform as lh_transform

Contributor

Maybe we should use another name for the module instead of transform, as it is indeed a common name in this realm? It could e.g. be schedule, since we are defining and applying transform schedules.

from mlir.dialects import arith, func, linalg, tensor, transform
from mlir.dialects.transform import structured

from lighthouse import transform as lh_transform
Member

Do we always have lighthouse on the PYTHONPATH?

Contributor Author

I think in pretty much all Python files in this repo we want to assume that lighthouse is a package on the PYTHONPATH / available from sys.path.

The README gives instructions on how to get to this point ($ uv sync # Install the mlir-python-bindings and lighthouse packages). If the user does not want to follow these instructions, that's fine, though anyone doing that is expected to bring up an environment with the same packages available -- @dchigarev even added suggestions for how to get just lighthouse into your environment using plain pip.
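As a purely illustrative aside (not part of the PR), a script that wants to fail early when the environment was not set up as described could guard on importability, e.g.:

import importlib.util

# Hypothetical guard: assumes lighthouse was installed via `uv sync` or pip,
# and only checks that the package can be found on sys.path.
if importlib.util.find_spec("lighthouse") is None:
    raise SystemExit("lighthouse is not importable; see the README setup instructions")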

subprocess.run(cmdline)
print(
    f"NOTE: cleaning-up temp files: {payload_file.name}, {schedule_file.name}"
)
Member

Should we also add some checks here? Like using FileCheck to make sure the ops were folded?
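A minimal sketch of such a check (not in the PR; the op name is only an illustrative assumption, and FileCheck over the same captured text would be the more idiomatic route):

import subprocess

# Hypothetical check, reusing the `cmdline` list built in this example.
# "arith.addi" stands in for whichever op the schedule is expected to fold away.
res = subprocess.run(cmdline, capture_output=True, text=True, check=True)
print(res.stdout)
assert "arith.addi" not in res.stdout, "expected the op to be folded away"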


def example_schedule() -> ir.Module:
    schedule = ir.Module.create()
    schedule.operation.attributes["transform.with_named_sequence"] = ir.UnitAttr.get()
Member

It would be better if the transform module provided this as a helper, so that users only need to add the transforms themselves.

Contributor Author

@rolfmorel Nov 11, 2025


Indeed, we should probably do a bit more to facilitate dealing with such details. The thing I would not like is to obscure that we are just constructing IR; i.e., to me this code still reflects what the .mlir will look like. We should find a middle ground somehow. So the helpers should probably look like IR builders as well?
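One possible middle ground, sketched here purely as an illustration (the helper name is made up), is a small builder-style function that only hides the module-plus-attribute boilerplate:

from mlir import ir

def empty_schedule_module() -> ir.Module:
    # Hypothetical helper: create a module and mark it as holding named
    # transform sequences, exactly as example_schedule() does inline above.
    schedule = ir.Module.create()
    schedule.operation.attributes["transform.with_named_sequence"] = ir.UnitAttr.get()
    return schedule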


def example_payload() -> ir.Module:
    payload = ir.Module.create()
    with ir.InsertionPoint(payload.body):
Member

This should also be a helper, eventually part of the MLIR importer module, which should be reused by the schedule too, if we all agree with this design: https://github.com/llvm/lighthouse/wiki/Integrator

@rengolin requested a review from Groverkss, November 11, 2025 13:24
Contributor

@adam-smnk left a comment


Really neat and easy to use 👍

Comment on lines +72 to +76
with tempfile.NamedTemporaryFile(
    "w", prefix="payload_"
) as payload_file, tempfile.NamedTemporaryFile(
    "w", prefix="schedule_"
) as schedule_file:
Contributor

nit: maybe these could be grouped together as:

with (
    temp.. as file1,
    temp.. as file2,
):
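For concreteness, with the names from the reviewed snippet, the grouped (Python 3.10+) form would read roughly as:

import tempfile

# Illustrative only; mirrors the reviewed snippet with parenthesized context managers.
with (
    tempfile.NamedTemporaryFile("w", prefix="payload_") as payload_file,
    tempfile.NamedTemporaryFile("w", prefix="schedule_") as schedule_file,
):
    ...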

Comment on lines +6 to +7
assert schedule.parent and "transform.with_named_sequence" in schedule.parent.attributes
assert "transform.with_named_sequence" in schedule.parent.attributes
Contributor

Aren't these two asserts the same?

    payload_file.name,
]
print("NOTE: output of applying schedule to payload from commandline:", *cmdline)
subprocess.run(cmdline)
Contributor

nit: the print order gets mixed up when I pipe output to a file.
How about an explicit print? Something like:

res = subprocess.run(cmdline, capture_output=True)
print(res.stdout.decode())

Contributor

A general question, so pardon my Python ignorance 🙂
Does this file have to be named main.py? I take it that __main__.py already serves as the entry point for CLI invocation.

If the name can be arbitrary, then maybe something more unique/expressive?

Contributor Author

This file can indeed be named anything. In essence, the apply function could even live inside __init__.py, though I personally often assume that __init__.py holds little more than re-exports.

I would like there to be a better name for the file, though I am drawing a bit of a blank. So ... suggestions welcome! 😄

Contributor

Good to know 👍
My first thought about __init__.py is the same. Feels cleaner to keep it as a "module summary".

Name-wise, hmm... it's hard to judge how it will evolve; main is a decent placeholder.
Maybe interpreter? As in, helpers for managing and executing schedules. Although perhaps a bit too broad.
I'm always in the catch-all utils camp - lh.transform.utils as helpers for managing transforms. Some people hate it 😜

If nothing particularly resonates with you, let's leave it as is. It'll be easier to rename it once we add a few more helpers.

Contributor

I have a mild preference for the mymodule/mymodule.py naming convention if there's only one source file.
