Skip to content
Merged
Show file tree
Hide file tree
Changes from all commits
Commits
Show all changes
38 commits
Select commit Hold shift + click to select a range
4ac5909
Upgrade to Python 3.12 (#2421)
akenmorris Sep 17, 2025
4d5bb13
Typo on eigen version
akenmorris Sep 17, 2025
0083761
Try downgrading spdlog back for:
akenmorris Sep 17, 2025
f2e53b5
Fix open3d install
akenmorris Sep 17, 2025
460001c
Try older jupyter versions
akenmorris Sep 17, 2025
f1b88ac
Fix requirements file.
akenmorris Sep 17, 2025
084f316
Update python_requirements
akenmorris Sep 17, 2025
a258276
Fix notebook version
akenmorris Sep 17, 2025
fa2bd17
Upgrading build image from Ubuntu focal 20.04 to jammy 22.04
akenmorris Sep 18, 2025
945422f
Downgrade itk-elastix to 0.23.0 for mac (intel):
akenmorris Sep 18, 2025
0374068
Downgrade itk-meshtopolydata==0.11.1 to itk-meshtopolydata==0.11.0 fo…
akenmorris Sep 18, 2025
eb09610
Restore conda stuff
akenmorris Sep 18, 2025
cb44f0f
Revert conda package updates
akenmorris Sep 19, 2025
4a6292b
Unpin boost and pybind.
akenmorris Sep 19, 2025
7cba02b
Unpin zlib
akenmorris Sep 19, 2025
da0edd5
Fix conda package pins
akenmorris Sep 19, 2025
164ce30
Pin pytorch
akenmorris Sep 20, 2025
c050a28
Try pytorch 2.2.2 for intel mac
akenmorris Sep 20, 2025
23d06d4
Add zlib for windows
akenmorris Sep 20, 2025
9d422c3
Attempt to fix windows build
akenmorris Sep 20, 2025
50502c4
We should not need gmock via conda if we are using cmake to fetch
akenmorris Sep 21, 2025
84009ee
Attempting to work around this error on windows:
akenmorris Sep 21, 2025
ee017f5
Adding debug statements
akenmorris Sep 21, 2025
e65b182
More debugging windows numpy issue
akenmorris Sep 22, 2025
b6efe54
Try small fix
akenmorris Sep 22, 2025
2d3cef3
Extra debugging
akenmorris Sep 22, 2025
e478638
Adding more debugging for windows GHA numpy problem
akenmorris Sep 22, 2025
fabf8fb
More debugging.
akenmorris Sep 22, 2025
c71f34d
More debugging
akenmorris Sep 22, 2025
c85a27c
More debugging
akenmorris Sep 23, 2025
1252082
Add action-tmate for windows
akenmorris Sep 23, 2025
4115ff5
More debugging
akenmorris Sep 23, 2025
02e3137
Revert Optimize.cpp
akenmorris Sep 23, 2025
cae74f1
Skipping OptimizeTests.embedded_python_test
akenmorris Sep 23, 2025
87f30cd
Add dep build caching to other workflows
akenmorris Sep 23, 2025
a06da3a
Full update Python API version to 6.7
akenmorris Sep 23, 2025
730aeba
Remove restore-keys because we don't want to match other keys.
akenmorris Oct 13, 2025
c32bb95
Always build dependencies in Release mode now. There is not enough d…
akenmorris Oct 22, 2025
File filter

Filter by extension

Filter by extension


Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
17 changes: 7 additions & 10 deletions .github/workflows/Dockerfile
Original file line number Diff line number Diff line change
@@ -1,27 +1,22 @@
# Based on old ubuntu to create more compatible binaries

# To build (e.g. for ShapeWorks 6.5):
# docker build --progress=plain -t akenmorris/ubuntu-build-box-focal-sw65 .
# To build (e.g. for ShapeWorks 6.7):
# docker build --progress=plain -t akenmorris/ubuntu-build-box-jammy-sw67 .
# To publish:
# docker push akenmorris/ubuntu-build-box-focal-sw65
# docker push akenmorris/ubuntu-build-box-jammy-sw67

FROM ubuntu:focal-20240123 AS env
FROM ubuntu:jammy-20250819 AS env
MAINTAINER akenmorris@gmail.com

# Set environment variables
ENV PATH=/opt/conda/bin:/opt/rh/devtoolset-9/root/usr/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
ENV LDFLAGS=-L/opt/conda/lib
ENV PATH=/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

ARG DEBIAN_FRONTEND=noninteractive
ENV TZ=Etc/UTC

# Update
RUN apt-get update -y && apt-get upgrade -y && apt-get dist-upgrade -y && apt-get install build-essential software-properties-common -y && add-apt-repository ppa:ubuntu-toolchain-r/test -y && apt-get update -y

# Install GCC 9
RUN apt-get install gcc-9 g++-9 -y
RUN update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-9 60 --slave /usr/bin/g++ g++ /usr/bin/g++-9 && update-alternatives --config gcc

# Install git and git-lfs
RUN add-apt-repository ppa:git-core/ppa
RUN apt-get update
Expand All @@ -36,6 +31,8 @@ RUN apt-get install rsync freeglut3-dev libgl1-mesa-dev libegl1-mesa zip libcups
RUN curl https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -o /tmp/Miniconda3-latest-Linux-x86_64.sh \
&& bash /tmp/Miniconda3-latest-Linux-x86_64.sh -b -p /opt/conda \
&& ln -s /opt/conda/etc/profile.d/conda.sh /etc/profile.d/conda.sh \
&& /opt/conda/bin/conda tos accept --override-channels --channel https://repo.anaconda.com/pkgs/main \
&& /opt/conda/bin/conda tos accept --override-channels --channel https://repo.anaconda.com/pkgs/r \
&& conda update -n base -c defaults conda \
&& conda install pip \
&& echo ". /opt/conda/etc/profile.d/conda.sh" >> ~/.bashrc \
Expand Down
5 changes: 1 addition & 4 deletions .github/workflows/build-linux-debug.yml
Original file line number Diff line number Diff line change
Expand Up @@ -24,7 +24,7 @@ jobs:
build:

runs-on: ubuntu-latest
container: akenmorris/ubuntu-build-box-focal-sw65
container: akenmorris/ubuntu-build-box-jammy-sw67

steps:

Expand Down Expand Up @@ -67,8 +67,6 @@ jobs:
with:
path: /github/home/install
key: ${{ runner.os }}-deps-debug-${{ hashFiles('.github/workflows/gha_deps.sh', 'install_shapeworks.sh', 'python_requirements.txt', 'build_dependencies.sh') }}
restore-keys: |
${{ runner.os }}-deps-

- name: Check space3.5
run: df -h
Expand Down Expand Up @@ -101,7 +99,6 @@ jobs:
run: df -h

- name: Build Dependencies
if: steps.cache-deps-restore.outputs.cache-hit != 'true'
shell: bash -l {0}
run: .github/workflows/gha_deps.sh

Expand Down
4 changes: 1 addition & 3 deletions .github/workflows/build-linux.yml
Original file line number Diff line number Diff line change
Expand Up @@ -25,7 +25,7 @@ jobs:
build:

runs-on: ubuntu-latest
container: akenmorris/ubuntu-build-box-focal-sw65
container: akenmorris/ubuntu-build-box-jammy-sw67

steps:

Expand Down Expand Up @@ -65,8 +65,6 @@ jobs:
with:
path: /github/home/install
key: ${{ runner.os }}-deps-${{ hashFiles('.github/workflows/gha_deps.sh', 'install_shapeworks.sh', 'python_requirements.txt', 'build_dependencies.sh') }}
restore-keys: |
${{ runner.os }}-deps-

- name: try import vtk
shell: bash -l {0}
Expand Down
2 changes: 0 additions & 2 deletions .github/workflows/build-mac-arm64.yml
Original file line number Diff line number Diff line change
Expand Up @@ -56,8 +56,6 @@ jobs:
with:
path: /Users/runner/install
key: ${{ runner.os }}-arm64-deps-${{ hashFiles('.github/workflows/gha_deps.sh', 'install_shapeworks.sh', 'python_requirements.txt', 'build_dependencies.sh') }}
restore-keys: |
${{ runner.os }}-deps-

- name: Build Dependencies
if: steps.cache-deps-restore.outputs.cache-hit != 'true'
Expand Down
3 changes: 0 additions & 3 deletions .github/workflows/build-mac.yml
Original file line number Diff line number Diff line change
Expand Up @@ -57,11 +57,8 @@ jobs:
with:
path: /Users/runner/install
key: ${{ runner.os }}-intel-deps-${{ hashFiles('.github/workflows/gha_deps.sh', 'install_shapeworks.sh', 'python_requirements.txt', 'build_dependencies.sh') }}
restore-keys: |
${{ runner.os }}-deps-

- name: Build Dependencies
if: steps.cache-deps-restore.outputs.cache-hit != 'true'
shell: bash -l {0}
run: .github/workflows/gha_deps.sh

Expand Down
9 changes: 6 additions & 3 deletions .github/workflows/build-windows.yml
Original file line number Diff line number Diff line change
Expand Up @@ -93,8 +93,6 @@ jobs:
with:
path: C:\deps
key: ${{ runner.os }}-deps-${{ hashFiles('.github/workflows/gha_deps.sh', 'install_shapeworks.sh', 'python_requirements.txt', 'build_dependencies.sh') }}
restore-keys: |
${{ runner.os }}-deps-

- name: Build Dependencies
if: steps.cache-deps-restore.outputs.cache-hit != 'true'
Expand Down Expand Up @@ -164,7 +162,12 @@ jobs:
# Execute tests defined by the CMake configuration.
# See https://cmake.org/cmake/help/latest/manual/ctest.1.html for more detail
run: conda activate shapeworks && source ${GITHUB_WORKSPACE}/devenv.sh ./bin/Release && ctest --output-on-failure -VV -C $BUILD_TYPE --debug


- name: Setup tmate session on failure
if: ${{ failure() }}
uses: mxschmitt/action-tmate@v3
timeout-minutes: 30

- uses: actions/upload-artifact@v4
with:
name: artifact-${{github.sha}}-windows
Expand Down
3 changes: 2 additions & 1 deletion .github/workflows/gha_deps.sh
Original file line number Diff line number Diff line change
Expand Up @@ -32,6 +32,7 @@ else
NPROCS=2
fi

./build_dependencies.sh --build-type=$BUILD_TYPE --num-procs=$NPROCS --clean-after
# Always build dependencies in Release mode now. There is not enough disk space for linux-debug on GHA anymore.
./build_dependencies.sh --build-type=Release --num-procs=$NPROCS --clean-after
rm -rf $BUILD_DIR
fi
2 changes: 1 addition & 1 deletion CMakeLists.txt
Original file line number Diff line number Diff line change
Expand Up @@ -18,7 +18,7 @@ endif()
set(CMAKE_CXX_STANDARD 17) # available options are [98, 11, 14, 17. 20]

list(APPEND CMAKE_MODULE_PATH "${CMAKE_CURRENT_LIST_DIR}/cmake")
find_package(Python3 3.9 EXACT REQUIRED COMPONENTS Interpreter Development)
find_package(Python3 3.12 EXACT REQUIRED COMPONENTS Interpreter Development)

if (NOT APPLE)
option(USE_OPENMP "Build using OpenMP" ON)
Expand Down
2 changes: 1 addition & 1 deletion Libs/Application/Job/PythonWorker.h
Original file line number Diff line number Diff line change
Expand Up @@ -18,7 +18,7 @@ class PythonWorker : public QObject {
Q_OBJECT

public:
constexpr static const char* python_api_version = "6.6";
constexpr static const char* python_api_version = "6.7";

PythonWorker();
~PythonWorker();
Expand Down
2 changes: 1 addition & 1 deletion Libs/Groom/Groom.cpp
Original file line number Diff line number Diff line change
Expand Up @@ -535,7 +535,7 @@ void Groom::increment_progress(int amount) {
std::scoped_lock lock(mutex);
progress_counter_ += amount;
progress_ = static_cast<float>(progress_counter_) / static_cast<float>(total_ops_) * 100.0;
SW_PROGRESS(progress_, fmt::format("Grooming ({}/{} ops)", progress_counter_, total_ops_));
SW_PROGRESS(progress_, fmt::format("Grooming ({}/{} ops)", progress_counter_.load(), total_ops_.load()));
}

//---------------------------------------------------------------------------
Expand Down
6 changes: 6 additions & 0 deletions Libs/Python/ShapeworksPython.cpp
Original file line number Diff line number Diff line change
@@ -1,4 +1,10 @@


#ifdef _MSC_VER
#include <BaseTsd.h>
typedef SSIZE_T ssize_t;
#endif

#include <Optimize/Optimize.h>

#include <Eigen/Eigen>
Expand Down
2 changes: 1 addition & 1 deletion Libs/Utils/StringUtils.cpp
Original file line number Diff line number Diff line change
Expand Up @@ -70,7 +70,7 @@ std::vector<std::string> StringUtils::getFileNamesFromPaths(const std::vector<st

//---------------------------------------------------------------------------
std::string StringUtils::getLowerExtension(const std::string& filename) {
return boost::algorithm::to_lower_copy(boost::filesystem::extension(filename));
return boost::algorithm::to_lower_copy(boost::filesystem::path(filename).extension().string());
}

//---------------------------------------------------------------------------
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -71,7 +71,7 @@ def fit(self, embedded_matrix):
nearest_neighbor_dists = []
cov = np.cov(embedded_matrix.T)
for i in embedded_matrix:
smallest = np.Inf
smallest = np.inf
for j in embedded_matrix:
dist = Mdist(i,j,cov)
if dist < smallest and dist != 0:
Expand Down
4 changes: 2 additions & 2 deletions Python/DeepSSMUtilsPackage/DeepSSMUtils/eval.py
Original file line number Diff line number Diff line change
Expand Up @@ -35,13 +35,13 @@ def test(config_file, loader="test"):

# load the loaders
sw_message("Loading " + loader + " data loader...")
test_loader = torch.load(loader_dir + loader)
test_loader = torch.load(loader_dir + loader, weights_only=False)

# initialization
sw_message("Loading trained model...")
if parameters['tl_net']['enabled']:
model_tl = model.DeepSSMNet_TLNet(config_file)
model_tl.load_state_dict(torch.load(model_path))
model_tl.load_state_dict(torch.load(model_path, weights_only=False))
device = model_tl.device
model_tl.to(device)
model_tl.eval()
Expand Down
4 changes: 2 additions & 2 deletions Python/DeepSSMUtilsPackage/DeepSSMUtils/model.py
Original file line number Diff line number Diff line change
Expand Up @@ -105,7 +105,7 @@ def __init__(self, config_file):
parameters = json.load(json_file)
self.num_latent = parameters['num_latent_dim']
self.loader_dir = parameters['paths']['loader_dir']
loader = torch.load(self.loader_dir + "validation")
loader = torch.load(self.loader_dir + "validation", weights_only=False)
self.num_corr = loader.dataset.mdl_target[0].shape[0]
img_dims = loader.dataset.img[0].shape
self.img_dims = img_dims[1:]
Expand Down Expand Up @@ -197,4 +197,4 @@ def forward(self, pt, x):
z = self.CorrespondenceEncoder(pt1)
pt_out = self.CorrespondenceDecoder(z)
zt, _ = self.ImageEncoder(x)
return [pt_out.view(-1, pt.shape[1], pt.shape[2]), z, zt]
return [pt_out.view(-1, pt.shape[1], pt.shape[2]), z, zt]
10 changes: 5 additions & 5 deletions Python/DeepSSMUtilsPackage/DeepSSMUtils/trainer.py
Original file line number Diff line number Diff line change
Expand Up @@ -104,8 +104,8 @@ def supervised_train(config_file):
train_loader_path = loader_dir + "train"
validation_loader_path = loader_dir + "validation"
print("Loading data loaders...")
train_loader = torch.load(train_loader_path)
val_loader = torch.load(validation_loader_path)
train_loader = torch.load(train_loader_path, weights_only=False)
val_loader = torch.load(validation_loader_path, weights_only=False)
print("Done.")
# initializations
num_pca = train_loader.dataset.pca_target[0].shape[0]
Expand Down Expand Up @@ -164,7 +164,7 @@ def supervised_train(config_file):
plot_train_losses = []
plot_val_losses = []
t0 = time.time()
best_val_rel_error = np.Inf
best_val_rel_error = np.inf
for e in range(1, num_epochs + 1):
if sw_check_abort():
sw_message("Aborted")
Expand Down Expand Up @@ -300,7 +300,7 @@ def supervised_train(config_file):
for param in net.decoder.fc_fine.parameters():
param.requires_grad = True
# train on the corr loss
best_ft_val_rel_error = np.Inf
best_ft_val_rel_error = np.inf
for e in range(1, ft_epochs + 1):
if sw_check_abort():
sw_message("Aborted")
Expand Down Expand Up @@ -453,7 +453,7 @@ def supervised_train_tl(config_file):
plot_train_losses = []
plot_val_losses = []
t0 = time.time()
best_val_rel_error = np.Inf
best_val_rel_error = np.inf

# train the AE first

Expand Down
4 changes: 3 additions & 1 deletion Testing/CMakeLists.txt
Original file line number Diff line number Diff line change
Expand Up @@ -9,13 +9,14 @@ include(FetchContent)
FetchContent_Declare(
googletest
GIT_REPOSITORY https://github.com/google/googletest.git
GIT_TAG release-1.11.0
GIT_TAG v1.14.0
)
FetchContent_GetProperties(googletest)
if(NOT googletest_POPULATED)
FetchContent_Populate(googletest)
add_subdirectory(${googletest_SOURCE_DIR} ${googletest_BINARY_DIR})
endif()
FetchContent_MakeAvailable(googletest)

set(gtest_force_shared_crt ON CACHE BOOL "" FORCE)

Expand Down Expand Up @@ -52,6 +53,7 @@ target_include_directories(Testing PUBLIC
$<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}>
$<INSTALL_INTERFACE:include>)
target_link_libraries(Testing
gtest
gtest_main
${Boost_LIBRARIES}
)
Expand Down
3 changes: 1 addition & 2 deletions Testing/GroomTests/CMakeLists.txt
Original file line number Diff line number Diff line change
Expand Up @@ -7,9 +7,8 @@ add_executable(GroomTests
)

target_link_libraries(GroomTests
${ITK_LIBRARIES} ${VTK_LIBRARIES}
Testing ${ITK_LIBRARIES} ${VTK_LIBRARIES}
Mesh Groom Project Image
Testing
)

add_test(NAME GroomTests COMMAND GroomTests)
5 changes: 4 additions & 1 deletion Testing/ImageTests/CMakeLists.txt
Original file line number Diff line number Diff line change
Expand Up @@ -6,11 +6,14 @@ add_executable(ImageTests
${TEST_SRCS}
)

add_dependencies(ImageTests gtest gtest_main)


target_link_libraries(ImageTests
Testing
Image
Common
${ITK_LIBRARIES}
Testing
Project
)

Expand Down
4 changes: 2 additions & 2 deletions Testing/MeshTests/CMakeLists.txt
Original file line number Diff line number Diff line change
Expand Up @@ -7,9 +7,9 @@ add_executable(MeshTests
)

target_link_libraries(MeshTests
tinyxml Optimize Mesh Utils Particles trimesh2 Eigen3::Eigen igl::core
Testing tinyxml Optimize Mesh Utils Particles trimesh2 Eigen3::Eigen igl::core
${ITK_LIBRARIES} ${VTK_LIBRARIES}
Testing Project
Project
)

add_test(NAME MeshTests COMMAND MeshTests)
4 changes: 2 additions & 2 deletions Testing/OptimizeTests/CMakeLists.txt
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@ add_executable(OptimizeTests
)

target_link_libraries(OptimizeTests
Mesh Optimize Utils trimesh2 Particles
Testing pybind11::embed Project Image
Testing Mesh Optimize Utils trimesh2 Particles
pybind11::embed Project Image
)

add_test(NAME OptimizeTests COMMAND OptimizeTests)
Expand Down
13 changes: 10 additions & 3 deletions Testing/OptimizeTests/OptimizeTests.cpp
Original file line number Diff line number Diff line change
Expand Up @@ -7,11 +7,11 @@

#include <cstdio>

#include "../Testing.h"
#include "Libs/Optimize/Domain/Surface.h"
#include "Optimize.h"
#include "OptimizeParameterFile.h"
#include "ParticleShapeStatistics.h"
#include "Testing.h"

using namespace shapeworks;

Expand Down Expand Up @@ -273,6 +273,13 @@ TEST(OptimizeTests, mesh_use_normals_test) {
TEST(OptimizeTests, embedded_python_test) {
prep_temp("/simple", "embedded_python");

// disable test on windows
// Note that this test works fine on native windows, but something about GitHub Actions Runner is causing
// An error when importing numpy. For now, we are just going to skip this test on windows
#ifdef _WIN32
GTEST_SKIP() << "Skipping embedded_python_test on Windows";
#endif

// run with parameter file
std::string paramfile = std::string("python_embedded.xml");
Optimize app;
Expand Down Expand Up @@ -523,7 +530,7 @@ TEST(OptimizeTests, cutting_plane_test) {

// make sure we clean out at least one output file
std::remove("optimize_particles/sphere10_DT_world.particles");

auto start = shapeworks::ShapeWorksUtils::now();

// run with parameter file
Expand All @@ -541,7 +548,7 @@ TEST(OptimizeTests, cutting_plane_test) {
stats.principal_component_projections();

bool good = check_constraint_violations(app, 1.5e-1);

auto end = shapeworks::ShapeWorksUtils::now();
std::cout << "Time taken to run cutting_plane optimize test: "
<< shapeworks::ShapeWorksUtils::elapsed(start, end, false) << "sec \n";
Expand Down
4 changes: 2 additions & 2 deletions Testing/ParticlesTests/CMakeLists.txt
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@ add_executable(ParticlesTests
)

target_link_libraries(ParticlesTests
Mesh Optimize Utils trimesh2 Particles
Testing pybind11::embed Project Image
Testing Mesh Optimize Utils trimesh2 Particles
pybind11::embed Project Image
)

add_test(NAME ParticlesTests COMMAND ParticlesTests)
Loading
Loading