Commit ca1c669

Merge pull request #26 from ZedongPeng/update-readme
docs: update readme about differentiation
2 parents: edfa813 + c608630

1 file changed (+2 -2 lines)

README.md

Lines changed: 2 additions & 2 deletions
@@ -12,11 +12,11 @@
 MPAX is a hardware-accelerated, differentiable, batchable, and distributable solver for mathematical programming in JAX, designed to integrate with modern computational and deep learning workflows:
 
 - **Hardware accelerated**: executes on multiple architectures including CPUs, GPUs and TPUs.
-- **Differentiable**: computes derivatives of solutions with respect to inputs through implicit or unrolled differentiation.
+- **Differentiable**: computes derivatives of solutions with respect to inputs through unrolled differentiation.
 - **Batchable**: solves multiple problem instances of the same shape simultaneously.
 - **Distributed**: executes in parallel across multiple devices, such as several GPUs.
 
-MPAX's primary motivation is to integrate mathematical programming with deep learning pipelines. To achieve this, MPAX aligns its algorithms and implementations with the requirements of deep learning hardware, ensuring compatibility with GPUs and TPUs. By being differentiable, MPAX can integrate directly into the backpropagation process of neural network training. Its batchability and distributability further enable scalable deployment in large-scale applications.
+MPAX's primary motivation is to integrate mathematical programming with deep learning pipelines. To achieve this, MPAX aligns its algorithms and implementations with the requirements of deep learning hardware, ensuring compatibility with GPUs and TPUs. By being differentiable, MPAX can integrate directly into the backpropagation process of neural network training. Its batchability and distributability further enable scalable deployment in large-scale applications.
 
 Currently, MPAX supports **linear programming (LP)** and **quadratic programming (QP)**, the foundational problems in mathematical programming. Future releases will expand support to include other problem classes of mathematical programming.
 
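For context on the changed bullet: unrolled differentiation means the solver's iterations are themselves ordinary JAX operations, so `jax.grad` can backpropagate through a fixed number of them and `jax.vmap` can batch them. The sketch below illustrates the technique with a hand-written projected-gradient loop on a tiny QP; it is an illustration only and does not use MPAX's actual API (no MPAX function names appear in this diff).

```python
# Illustration only: a hand-rolled solver loop, not MPAX's API.
import jax
import jax.numpy as jnp


def solve_qp(Q, c, num_iters=200, step=0.01):
    """Approximately solve min_x 0.5 x^T Q x + c^T x  s.t. x >= 0
    by a fixed number of projected-gradient steps."""
    x = jnp.zeros_like(c)
    for _ in range(num_iters):  # unrolled: every step stays on the autodiff tape
        x = jnp.maximum(x - step * (Q @ x + c), 0.0)  # gradient step + projection onto x >= 0
    return x


Q = jnp.array([[2.0, 0.5], [0.5, 1.0]])
c = jnp.array([-1.0, -1.0])

# Differentiable: gradient of a scalar function of the solution w.r.t. the problem data c.
loss = lambda ci: jnp.sum(solve_qp(Q, ci))
print(jax.grad(loss)(c))

# Batchable: solve three problems of the same shape simultaneously with vmap.
cs = jnp.stack([c, 2.0 * c, -c])
print(jax.vmap(lambda ci: solve_qp(Q, ci))(cs).shape)  # (3, 2): one solution per problem
```

Unrolling stores every iterate for the backward pass, so memory grows with the number of iterations; that is the usual trade-off of unrolled differentiation compared with implicit differentiation.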

0 commit comments