Improve type checking throughout the project.
No due date • 5/17 issues closed
This milestone is to align current Arkouda functions to NumPy behavior and semantics.
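A concrete illustration of the parity this milestone is after (the functions compared below are just examples, and the snippet assumes a running `arkouda_server` on the default host/port): an aligned function should agree with NumPy on both values and dtype.

```python
import numpy as np
import arkouda as ak

ak.connect()  # assumes arkouda_server is running on the default host/port

np_result = np.cumsum(np.arange(10))
ak_result = ak.cumsum(ak.arange(10))

# NumPy alignment means matching both the dtype and the values.
assert str(np_result.dtype) == str(ak_result.dtype)
assert (np_result == ak_result.to_ndarray()).all()

ak.disconnect()
```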
No due date • 4/29 issues closed
Implement the Pandas ExtensionArray API: https://pandas.pydata.org/docs/reference/api/pandas.api.extensions.ExtensionArray.html
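For reference, implementing that interface means subclassing `ExtensionDtype`/`ExtensionArray` and filling in the required methods. The sketch below is not Arkouda's implementation; it backs the array with NumPy purely so the skeleton is self-contained, whereas a real `ArkoudaArray` would hold a server-side pdarray.

```python
# Minimal ExtensionArray skeleton (demo only): a NumPy-backed stand-in for the
# pdarray-backed array this milestone would build.
import numpy as np
import pandas as pd
from pandas.api.extensions import (
    ExtensionArray,
    ExtensionDtype,
    register_extension_dtype,
    take,
)


@register_extension_dtype
class DemoDtype(ExtensionDtype):
    name = "demo_int64"      # usable as pd.Series(..., dtype="demo_int64")
    type = np.int64          # scalar type held by the array
    na_value = np.nan

    @classmethod
    def construct_array_type(cls):
        return DemoArray


class DemoArray(ExtensionArray):
    def __init__(self, values):
        self._data = np.asarray(values, dtype=np.int64)

    # --- construction hooks pandas calls internally ---
    @classmethod
    def _from_sequence(cls, scalars, *, dtype=None, copy=False):
        return cls(scalars)

    @classmethod
    def _concat_same_type(cls, to_concat):
        return cls(np.concatenate([arr._data for arr in to_concat]))

    # --- minimal required interface ---
    @property
    def dtype(self):
        return DemoDtype()

    @property
    def nbytes(self):
        return self._data.nbytes

    def __len__(self):
        return len(self._data)

    def __getitem__(self, item):
        result = self._data[item]
        return result if np.isscalar(result) else DemoArray(result)

    def isna(self):
        return np.zeros(len(self), dtype=bool)  # int64 backing has no missing values

    def take(self, indices, *, allow_fill=False, fill_value=None):
        data = take(self._data, indices, allow_fill=allow_fill, fill_value=fill_value)
        return DemoArray(data)

    def copy(self):
        return DemoArray(self._data.copy())


# The skeleton is enough for basic Series construction and indexing:
s = pd.Series(DemoArray([1, 2, 3]))
print(s.dtype)    # demo_int64
print(s.iloc[1])  # 2
```

The real work in this milestone is in methods like `take`, `_concat_same_type`, and `_from_sequence`, which would need to be expressed as server-side operations rather than local NumPy calls.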
No due date • 27/67 issues closed
This includes both client-initiated checkpointing (`ak.save_checkpoint()` and `ak.load_checkpoint()`) and automatic checkpointing (`--checkpointMemPct` and `--checkpointIdleTime`). The milestone includes both implementation tasks and design questions, e.g. "what should happen when ...".
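A usage sketch of the client-initiated path follows, with the caveat that the exact signatures are part of what this milestone is designing: the function and flag names come from the description above, everything else (arguments, flag values) is an assumption.

```python
import arkouda as ak

ak.connect()

# Build up some server-side state worth checkpointing.
a = ak.arange(10**8)
b = ak.randint(0, 100, 10**8)

# Client-initiated: persist the server's current state under a name (hypothetical signature).
ak.save_checkpoint("nightly_run")

# ... later, possibly after restarting the server ...
ak.load_checkpoint("nightly_run")

# Automatic checkpointing would instead be driven by server flags, e.g. (placeholder values):
#   ./arkouda_server --checkpointMemPct=50 --checkpointIdleTime=300
```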
No due date • 0/5 issues closed
Reorganize the directory structure to match numpy and pandas.
No due date • 31/31 issues closed
Each module needs to be examined to make sure that all functions have unit tests covering the multi-dimensional case, where applicable. Any function that does not support multi-dimensional arrays must raise an error and be properly noted in the documentation.
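The pattern this audit looks for typically resembles the sketch below: parametrize each test over 1-D and multi-dimensional shapes. The function under test and the shapes are illustrative, and constructing multi-dimensional pdarrays with `ak.array` assumes a server built with rank > 1 support enabled.

```python
import numpy as np
import pytest
import arkouda as ak


@pytest.mark.parametrize("shape", [(10,), (4, 5), (2, 3, 4)])
def test_sum_matches_numpy(shape):
    # Build the same data in NumPy and Arkouda and compare the reduction.
    nda = np.arange(np.prod(shape)).reshape(shape)
    pda = ak.array(nda)
    assert ak.sum(pda) == nda.sum()
```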
No due date • 21/39 issues closed
Update the Makefile to ensure all the commands run without error and are added to the CI if appropriate.
No due date • 4/5 issues closed
Add functionality to match scikit-learn (sklearn).
No due date • 0/1 issues closed
Look into JIT (just-in-time) compilation for Arkouda. If I'm not mistaken, this is similar to what numba does for numpy. I'm not an expert on this, but it seems pretty similar to the lazy evaluation that Spark does. This is more of a research project than a high priority at this point.
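For context on the numba comparison, this is roughly what numba's JIT gives NumPy code today; the open question for Arkouda would be whether sequences of server-side operations could be fused or compiled in a comparable way rather than executed one message at a time.

```python
import numpy as np
from numba import njit


@njit
def pairwise_sum(a, b):
    # Compiled to machine code on the first call; subsequent calls bypass the interpreter.
    out = np.empty_like(a)
    for i in range(a.size):
        out[i] = a[i] + b[i]
    return out


x = np.arange(1_000_000, dtype=np.float64)
print(pairwise_sum(x, x)[:5])
```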
No due date

No due date • 0/1 issues closed
Similar to #3050, we need to go through each file in benchmarks_v2 and verify it is benchmarking the same thing as the original. This is far more nuanced than updating the tests; sort-cases in particular I remember being concerned about, since it doesn't lend itself well to the new format (it makes use of Python generators, see #2276). One quick litmus test is to run the new and old benchmarks and make sure the results are fairly similar.

@Bears-R-Us/arkouda-core-dev, while going through these we should keep in mind whether it makes sense for something to be used as a benchmark. Ideally benchmarks capture the core functionality that underpins most workflows. Unlike tests, I don't think we should try to benchmark as many of our functions as possible, but rather focus on the ones where we really want to know if there's a performance drop-off. I think we should keep this targeted to the ones we really care about, but I'm open to other opinions! I know some of these benchmarks were added as one-offs to see how a newly added function performed and to track it as we optimized.

Once we are confident these work and cover what we are looking for, we should work with @hokiegeek2 to use the JSON(?) output to create a Grafana dashboard. Then we should work with @bmcdonald3 or @jeremiah-corrado to create a script that turns the output of these into something that is readable by the chpl nightly graphs. I chatted offline with @bmcdonald3, and it seems like converting from the JSON to be readable by the chpl graphs would be quite difficult. I think the better approach is to find a way to add an option for our current benchmarks to give JSON output.

Helpful links:
https://chapel-lang.org/docs/developer/bestPractices/TestSystem.html#a-performance-test
https://chapel-lang.org/perf/arkouda/16-node-xc/?graphs=all
https://github.com/Bears-R-Us/arkouda/blob/master/benchmarks/run_benchmarks.py
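One way the "JSON output option" could look (the flag name and output fields below are assumptions, not an agreed format): each benchmark keeps its current human-readable output and gains a machine-readable mode that a Grafana dashboard, or a converter for the chpl nightly graphs, could ingest.

```python
import argparse
import json
import time

import arkouda as ak


def time_argsort(n):
    # Measure one representative server-side operation; the choice is illustrative.
    a = ak.randint(0, 2**32, n)
    start = time.time()
    ak.argsort(a)
    return time.time() - start


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("-n", type=int, default=10**8)
    parser.add_argument("--json", action="store_true", help="emit machine-readable results")
    args = parser.parse_args()

    ak.connect()
    elapsed = time_argsort(args.n)
    if args.json:
        print(json.dumps({"benchmark": "argsort", "n": args.n, "seconds": elapsed}))
    else:
        print(f"argsort of {args.n} elements took {elapsed:.2f}s")
```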
No due date • 49/51 issues closed
Update and clean up the docstrings.
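For reference, the target layout is the usual numpydoc sections (Parameters / Returns / Raises / Examples); the function below is a stand-in, not an existing Arkouda signature.

```python
def clip(pda, lo, hi):
    """
    Clip the values of an array to lie within [lo, hi].

    Parameters
    ----------
    pda : pdarray
        The array of values to clip.
    lo : numeric
        Lower bound; values below this are set to ``lo``.
    hi : numeric
        Upper bound; values above this are set to ``hi``.

    Returns
    -------
    pdarray
        A new array with the same shape as ``pda``.

    Raises
    ------
    TypeError
        If ``pda`` is not a pdarray or the bounds are not numeric.

    Examples
    --------
    >>> ak.clip(ak.array([1, 5, 10]), 2, 8)
    array([2 5 8])
    """
    ...
```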
No due date • 29/73 issues closed
Create a statistical package in Arkouda to mirror SciPy.
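An illustration of what "mirror SciPy" means in practice: the Arkouda version should accept pdarrays and return the same statistic and p-value that `scipy.stats` produces for equivalent NumPy data. Whether `ak.chisquare` is the exact entry point is an assumption; it stands in for whichever `scipy.stats` functions get ported.

```python
import numpy as np
from scipy import stats

observed = np.array([18, 22, 19, 41])
expected = np.array([25, 25, 25, 25])

# SciPy reference result on local NumPy data.
statistic, pvalue = stats.chisquare(observed, expected)
print(statistic, pvalue)

# The Arkouda counterpart would be called the same way on server-side arrays, e.g.
#   ak.chisquare(ak.array(observed), ak.array(expected))
```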
No due date • 2/4 issues closed
Allow the work Jeremiah has been doing in the array API to be accessible via the standard API.
No due date • 15/30 issues closed
Misc. tickets related to the alignment of the arkouda API and the pandas API.
No due date • 7/39 issues closed

No due date • 79/104 issues closed
Deprecate all the tests in arkouda/tests. This involves: (1) checking each module to make sure all the tests have been copied over to PROTO_tests/tests and verifying that they run; (2) moving the test file to arkouda/tests/deprecated; (3) finally, updating `pytest.ini` to remove the relevant line.
No due date • 71/110 issues closed
Align our random module with numpy's. This involves adding methods to sample from non-uniform distributions.
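The target interface is `numpy.random.Generator`, so the same call pattern should work against either module. Which distributions `ak.random` already covers is exactly what this milestone tracks, so treat the Arkouda calls below as the intended shape of the API rather than a guarantee of what is implemented today.

```python
import numpy as np
import arkouda as ak

np_rng = np.random.default_rng(seed=17)
ak_rng = ak.random.default_rng(seed=17)

# Same call, same interface; Arkouda results come back as pdarrays.
print(np_rng.standard_normal(5))
print(ak_rng.standard_normal(5).to_ndarray())

# Non-uniform distributions added under this milestone would follow the same pattern, e.g.
#   ak_rng.exponential(scale=2.0, size=5)
#   ak_rng.poisson(lam=3.0, size=5)
```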
No due date • 14/16 issues closed