
Add torch.log10 INT support to ExecuTorch Arm backend (#18671)

Open
JorickvdHoeven wants to merge 1 commit into main from export-D99177468

Conversation


@JorickvdHoeven JorickvdHoeven commented Apr 2, 2026

Summary:

Adds quantized (INT) support for `torch.log10` in the ExecuTorch Arm backend using the lookup table (LUT) path. TOSA has no native LOG10 op, so for quantized inference the op is handled via a precomputed table in the `InsertTableOpsPass`.
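For illustration, the LUT idea can be sketched in plain Python: every representable int8 input is dequantized, passed through the float `log10` reference, and the requantized result is stored in the table, so inference reduces to a single lookup. This is a minimal sketch of the concept, not the actual `InsertTableOpsPass` implementation; the quantization parameters below are hypothetical.

```python
import math

def build_log10_lut(in_scale, in_zp, out_scale, out_zp, qmin=-128, qmax=127):
    """Precompute an int8 lookup table approximating log10.

    For each possible quantized input value: dequantize, apply the float
    reference log10, then requantize and clamp the result. Inputs that
    dequantize to x <= 0 (where log10 is undefined) saturate to qmin.
    """
    table = []
    for q in range(qmin, qmax + 1):
        x = (q - in_zp) * in_scale            # dequantize
        if x <= 0.0:
            table.append(qmin)                # undefined domain: saturate
            continue
        q_out = round(math.log10(x) / out_scale) + out_zp   # requantize
        table.append(max(qmin, min(qmax, q_out)))           # clamp to int8
    return table

def lut_log10(q_in, table, qmin=-128):
    """At inference, log10 becomes a single table lookup."""
    return table[q_in - qmin]
```

With `in_scale=0.1, in_zp=-128`, an input of `q=-28` dequantizes to 10.0, so the lookup returns the quantized encoding of `log10(10.0) = 1.0`.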

Changes:

  • Registered `log10.default` in the `unary_table_ops` dict in `insert_table_ops.py` for the LUT quantized path
  • Added `log10.default` to `_one_to_one` in `quantization_annotator.py` to enable quantization annotation
  • Added `log10.default` to `TOSA_PRO_INT_SupportList` in `tosa_profile_supported_op_lists.py` so the partitioner delegates it to the Arm backend
  • Wrote `test/ops/test_log10.py` with TOSA INT, U55 INT, and U85 INT test cases
  • Registered the test in `test/targets.bzl`
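The first registration step amounts to mapping the op to the float reference function from which its table is precomputed. A hypothetical sketch of what such a dispatch dict might look like (the real `unary_table_ops` in `insert_table_ops.py` keys on edge-dialect ops, so the string keys here are simplified stand-ins):

```python
import math

# Illustrative mapping from unary table ops to the float reference function
# used to precompute each lookup table. Keys are simplified stand-ins for
# the edge-dialect ops used by the real pass.
unary_table_ops = {
    "log.default": math.log,
    "log2.default": math.log2,
    "log10.default": math.log10,  # the newly registered op
    "exp.default": math.exp,
}

def reference_fn(op_name):
    """Look up the float reference implementation for a table op."""
    return unary_table_ops[op_name]
```

Adding an op to a table-driven pass like this is a one-line change, which is why the bulk of the PR is wiring (annotation, support list) and tests.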

Reviewed By: 3l1

Differential Revision: D99177468


pytorch-bot bot commented Apr 2, 2026

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/18671

Note: Links to docs will display an error until the docs builds have been completed.

❌ 3 New Failures, 1 Cancelled Job, 3 Unrelated Failures

As of commit 2eef568 with merge base 28f3cf3:

NEW FAILURES - The following jobs have failed:

CANCELLED JOB - The following job was cancelled. Please retry:

BROKEN TRUNK - The following jobs failed but were present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Apr 2, 2026

github-actions bot commented Apr 2, 2026

This PR needs a release notes: label

If your change should be included in the release notes (i.e. would users of this library care about this change?), please use a label starting with release notes:. This helps us keep track and include your important work in the next release notes.

To add a label, you can comment to pytorchbot, for example
@pytorchbot label "release notes: none"

For more information, see
https://github.com/pytorch/pytorch/wiki/PyTorch-AutoLabel-Bot#why-categorize-for-release-notes-and-how-does-it-work.


meta-codesync bot commented Apr 2, 2026

@JorickvdHoeven has exported this pull request. If you are a Meta employee, you can view the originating Diff in D99177468.

@3l1 3l1 self-requested a review April 3, 2026 17:16
@meta-codesync meta-codesync bot changed the title Add torch.log10 INT support to ExecuTorch Arm backend Add torch.log10 INT support to ExecuTorch Arm backend (#18671) Apr 3, 2026
meta-codesync bot pushed a commit that referenced this pull request Apr 3, 2026
@meta-codesync meta-codesync bot force-pushed the export-D99177468 branch from 5c6f96a to cc1c74f Compare April 3, 2026 17:27
JorickvdHoeven added a commit that referenced this pull request Apr 3, 2026
JorickvdHoeven added a commit that referenced this pull request Apr 3, 2026
meta-codesync bot pushed a commit that referenced this pull request Apr 3, 2026
@meta-codesync meta-codesync bot force-pushed the export-D99177468 branch from 1c90c12 to 3d43271 Compare April 3, 2026 18:24
JorickvdHoeven added a commit that referenced this pull request Apr 3, 2026
@3l1 3l1 added partner: arm For backend delegation, kernels, demo, etc. from the 3rd-party partner, Arm ciflow/trunk module: arm Issues related to arm backend labels Apr 3, 2026
@meta-codesync meta-codesync bot force-pushed the export-D99177468 branch from 24d54e6 to 2eef568 Compare April 3, 2026 18:32

Labels

ciflow/trunk CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. fb-exported meta-exported module: arm Issues related to arm backend partner: arm For backend delegation, kernels, demo, etc. from the 3rd-party partner, Arm


2 participants