Build FAISS from source

Build from source

The information required to build our software and its dependencies from source is below.

Dependency build notes

This page contains the information required to build our dependencies, as of June 2023, for Debian 11. Most people appear to like, or at least tolerate, Conda; because our filesystem is deployed read-only, however, using Conda is not viable for us. Below is the list of packages and their relevant build notes.

Build faiss

Dependencies:

  • CUDA and its compilers (binary distribution)
  • Intel OneAPI MKL (binary distribution)
  • numpy (python)
  • swig (python)

We build faiss from source because the PyPI versions of faiss don't appear to work for us. The faiss install documentation states that Intel oneAPI MKL offers the best performance. I have not measured whether this is accurate or whether it outperforms the NVIDIA HPC SDK.

  • Install python dependencies:
pip install numpy swig
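
A quick sanity check that the pip-provided tools are usable (this assumes $HOME/.local/bin, or your active virtualenv, is on PATH; PATH is extended explicitly in a later step):

swig -version
python3 -c "import numpy; print(numpy.__version__)"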
  • Install NVIDIA CUDA:
curl -LO https://developer.download.nvidia.com/compute/cuda/repos/debian11/x86_64/cuda-keyring_1.1-1_all.deb
sudo dpkg -i cuda-keyring_1.1-1_all.deb
sudo apt update
sudo apt install cuda
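
To confirm the toolkit and driver landed correctly, a quick check (the nvcc path assumes the default /usr/local/cuda install location; nvidia-smi requires the NVIDIA kernel driver to be loaded, typically after a reboot):

/usr/local/cuda/bin/nvcc --version
nvidia-smi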
  • Install Intel OneAPI:

Add Intel's oneAPI apt repository, as described in Intel's installation documentation, then install the base kit:

sudo apt install intel-basekit
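
The later steps assume the default install prefix of /opt/intel/oneapi; a quick check that the environment loader is where we expect it:

ls /opt/intel/oneapi/setvars.sh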
  • Clone the latest main of faiss:
git clone https://github.com/facebookresearch/faiss
  • Extend the environment's PATH, then load the Intel oneAPI environment on top of it with the oneAPI environment loader:
export PATH=$PATH:$HOME/.local/bin:/usr/local/cuda/bin
source /opt/intel/oneapi/setvars.sh
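
At this point both toolchains should resolve from the shell. A quick check (setvars.sh exports MKLROOT, which should point at the oneAPI MKL installation; nvcc should resolve from /usr/local/cuda/bin):

which nvcc
echo $MKLROOT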
  • Configure faiss using cmake:
cmake -B build -DBUILD_SHARED_LIBS=ON -DFAISS_ENABLE_GPU=ON -DFAISS_ENABLE_PYTHON=ON -DFAISS_ENABLE_RAFT=ON -DBUILD_TESTING=OFF -DFAISS_ENABLE_C_API=ON -DCMAKE_BUILD_TYPE=Release -DFAISS_OPT_LEVEL=avx2 -Wno-dev -DCMAKE_INSTALL_PREFIX=$HOME/target/faiss -DBLA_VENDOR=Intel10_64lp -DCMAKE_INSTALL_LIBDIR=lib .

If you have multiple BLAS libraries installed, you may find it wise to pin the BLAS library type explicitly by adjusting the CMake override, for example -DBLA_VENDOR=Intel10_64_dyn. Please note that RAFT has its own chain-of-pain of dependency requirements; alter -DFAISS_ENABLE_RAFT as needed. A CPU-only example follows.
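
As an illustration of altering the configuration, a CPU-only configure that sidesteps both CUDA and the RAFT dependency chain might look like the sketch below. This is not the configuration used on this page, just the same invocation with the GPU and RAFT options switched off:

cmake -B build -DBUILD_SHARED_LIBS=ON -DFAISS_ENABLE_GPU=OFF -DFAISS_ENABLE_PYTHON=ON -DFAISS_ENABLE_RAFT=OFF -DBUILD_TESTING=OFF -DFAISS_ENABLE_C_API=ON -DCMAKE_BUILD_TYPE=Release -DFAISS_OPT_LEVEL=avx2 -Wno-dev -DCMAKE_INSTALL_PREFIX=$HOME/target/faiss -DBLA_VENDOR=Intel10_64lp -DCMAKE_INSTALL_LIBDIR=lib .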

  • Build faiss:
make -C build -j$(nproc) faiss
make -C build -j$(nproc) swigfaiss
pushd build/faiss/python;python3 setup.py bdist_wheel;popd
  • Install faiss:
make -C build -j$(nproc) install
pip install build/faiss/python/dist/faiss-1.7.4-py3-none-any.whl
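
A minimal smoke test of the installed wheel, run in the same shell where setvars.sh was sourced so the MKL libraries still resolve (or after the ldconfig step below). On a machine with a visible CUDA device, get_num_gpus() should report at least 1:

python3 -c "import faiss; print(faiss.__version__); print(faiss.get_num_gpus())"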
  • Add the file /etc/ld.so.conf.d/artificial_wisdom-oneapi.conf with the contents:
/opt/intel/oneapi/mkl/latest/lib/intel64
/opt/intel/oneapi/compiler/2023.2.0/linux/compiler/lib/intel64_lin
  • Finally update the ld cache:
sudo ldconfig

You may find it valuable to verify the linking of the final shared object:

sdake@beast-05 ~> ldd /usr/local/lib/libfaiss.so
	linux-vdso.so.1 (0x00007ffc0c5f1000)
	libmkl_intel_lp64.so.2 => /opt/intel/oneapi/mkl/2023.1.0/lib/intel64/libmkl_intel_lp64.so.2 (0x00007fcbbcf7b000)
	libmkl_sequential.so.2 => /opt/intel/oneapi/mkl/2023.1.0/lib/intel64/libmkl_sequential.so.2 (0x00007fcbbb56f000)
	libmkl_core.so.2 => /opt/intel/oneapi/mkl/2023.1.0/lib/intel64/libmkl_core.so.2 (0x00007fcbb718e000)
	libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007fcbb716c000)
	libcudart.so.12 => /usr/local/cuda/targets/x86_64-linux/lib/libcudart.so.12 (0x00007fcbb6ec5000)
	libcublas.so.12 => /usr/local/cuda/targets/x86_64-linux/lib/libcublas.so.12 (0x00007fcbb063e000)
	libgomp.so.1 => /lib/x86_64-linux-gnu/libgomp.so.1 (0x00007fcbb05fc000)
	libstdc++.so.6 => /lib/x86_64-linux-gnu/libstdc++.so.6 (0x00007fcbb042f000)
	libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007fcbb02eb000)
	libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007fcbb02d1000)
	libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007fcbb00fd000)
	/lib64/ld-linux-x86-64.so.2 (0x00007fcbc2527000)
	libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007fcbb00f7000)
	librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007fcbb00eb000)
	libcublasLt.so.12 => /usr/local/cuda/targets/x86_64-linux/lib/libcublasLt.so.12 (0x00007fcb8e121000)
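
Optionally, a short end-to-end check that the GPU path works. This is a hedged sketch using the standard faiss Python API; the dimensions and vector counts are arbitrary:

python3 - <<'EOF'
import numpy as np
import faiss

d = 64                                              # vector dimensionality
xb = np.random.random((1000, d)).astype('float32')  # database vectors
xq = np.random.random((5, d)).astype('float32')     # query vectors

res = faiss.StandardGpuResources()                  # allocate GPU resources
index = faiss.GpuIndexFlatL2(res, d)                # exact L2 index on the GPU
index.add(xb)                                       # add the database vectors
distances, ids = index.search(xq, 4)                # 4 nearest neighbours per query
print(ids)
EOF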
