XLA
XLA (Accelerated Linear Algebra) is an open-source machine learning (ML) compiler for GPUs, CPUs, and ML accelerators.
The XLA compiler takes models from popular ML frameworks such as PyTorch, TensorFlow, and JAX, and optimizes them for high-performance execution across different hardware platforms including GPUs, CPUs, and ML accelerators.
Get started
If you want to use XLA to compile your ML project, refer to the XLA documentation for your ML framework of choice (PyTorch, TensorFlow, or JAX).
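As a quick, illustrative sketch (not part of this repository), the example below uses JAX, one of the frameworks above, to run a toy function through XLA. The function and array shapes are placeholders, and the optional lowering inspection assumes a recent JAX release.

```python
# Minimal sketch: JAX lowers jit-compiled functions to XLA, which compiles
# them for the available backend (CPU, GPU, or ML accelerator).
import jax
import jax.numpy as jnp

@jax.jit  # marks the function for compilation via XLA
def predict(params, x):
    # A toy linear layer followed by a nonlinearity (placeholder model).
    w, b = params
    return jnp.tanh(x @ w + b)

key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (4, 4))
b = jnp.zeros(4)
x = jnp.ones((8, 4))

y = predict((w, b), x)  # the first call triggers XLA compilation
print(y.shape)          # (8, 4)

# Optional: inspect the program JAX hands to XLA
# (the .lower()/.as_text() AOT API is available in recent JAX versions).
print(predict.lower((w, b), x).as_text()[:200])
```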
If you're not contributing code to the XLA compiler, you don't need to clone and build this repo. Everything here is intended for XLA contributors who want to develop the compiler and XLA integrators who want to debug or add support for ML frontends and hardware backends.
Contribute
If you'd like to contribute to XLA, review How to Contribute and then see the developer guide.
Contacts
- For questions, contact the maintainers: maintainers at openxla.org
Resources
Code of Conduct
While under TensorFlow governance, all community spaces for SIG OpenXLA are subject to the TensorFlow Code of Conduct.