From 333e9fb973b0daf9e40cb1fbc7b57615bd2f5937 Mon Sep 17 00:00:00 2001
From: jaimergp
Date: Sat, 14 Sep 2024 15:37:11 +0200
Subject: [PATCH 01/33] Start article about the history of conda-forge

---
 community/_sidebar.json |  1 +
 community/history.md    | 61 +++++++++++++++++++++++++++++++++++++++++
 community/index.md      |  2 +-
 3 files changed, 63 insertions(+), 1 deletion(-)
 create mode 100644 community/history.md

diff --git a/community/_sidebar.json b/community/_sidebar.json
index 6ad28bbd33..0eae5f6229 100644
--- a/community/_sidebar.json
+++ b/community/_sidebar.json
@@ -1,5 +1,6 @@
 [
   "index",
+  "history",
   "getting-in-touch",
   {
     "type": "category",
diff --git a/community/history.md b/community/history.md
new file mode 100644
index 0000000000..6b1159b5b0
--- /dev/null
+++ b/community/history.md
@@ -0,0 +1,61 @@
+---
+title: History
+---
+
+# History of conda-forge
+
conda-forge's origins cannot be explained without understanding the context of Python packaging back in the early 2010s. Back then, the installation of Python packages across operating systems was very challenging, especially on Windows, as it often meant compiling dependencies from source.

Python 2.x was the norm, the community was transitioning from `easy_install` to `pip`, and there wouldn't be an alternative for Python eggs [^eggs] until 2012, when wheels are introduced [^wheels]. To get Python, you'd get the official installers from Python.org, stick to the system-provided one on Linux, or resort to ActiveState's or Enthought's distributions on macOS and Windows [^legacy-python-downloads].

## The origins of `conda`

In 2012, Continuum Analytics announces Anaconda 0.8 in the SciPy conference [^anaconda-history]. Later that year, in September, Continuum would release `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0].
The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and numpy stacks needed [^packaging-and-deployment-with-conda] [^lex-fridman-podcast].

In contrast with Python eggs and wheels, conda packages were agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them under each Python package. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code".

By June 2013, conda is using a SAT solver and includes the `conda build` subcommand [^new-advances-in-conda], along with the concept of recipes [^conda-recipes-repo] [^early-conda-build-docs]. This is also when the first Miniconda release is announced. By the end of the year, Continuum Analytics announces Binstar.org, the predecessor of the Anaconda.org channels. This meant that now any user could build their software stack as conda packages and redistribute them online at no cost.

By 2015, several institutes and groups were using Binstar to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], UK's [Scitools](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. The channel for conda-forge was not created until April 2015 [^binstar-conda-forge], and [Bioconda](https://anaconda.org/bioconda) waited until September of the same year.

In 2015, Continuum Analytics rebranded as Anaconda Inc, and Binstar.org became Anaconda.org.

## How conda-forge came to be

In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) get in touch [^chatting-ocefpaf].
They are maintaining the Binstar channels for IOOS and Scitools, respectively. + + + +## References + +[^cgohlke-shutdown]: [Christoph Gohlke's Windows Wheels site is shutting down by the end of the month](https://www.reddit.com/r/Python/comments/vcaibq/christoph_gohlkes_windows_wheels_site_is_shutting/), 2022. + +[^anaconda-history]: [The Early History of the Anaconda Distribution](http://ilan.schnell-web.net/prog/anaconda-history/), Ilan Schnell, 2018. + +[^lex-fridman-podcast]: [Travis Oliphant: NumPy, SciPy, Anaconda, Python & Scientific Programming](https://www.youtube.com/watch?v=gFEE3w7F0ww&t=7596s), Lex Fridman Podcast #224, 2022. + +[^conda-changelog-1.0]: [`conda` 1.0 release notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. + +[^early-conda-build-docs]: [Conda build framework documentation](https://web.archive.org/web/20141006141927/http://conda.pydata.org/docs/build.html), 2014. + +[^conda-recipes-repo]: [ContinuumIO/conda-recipes](https://github.com/conda-archive/conda-recipes) + +[^packaging-and-deployment-with-conda]: [Packaging and deployment with conda](https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda), Travis Oliphant, 2013. + +[^new-advances-in-conda]: [New Advances in Conda](https://web.archive.org/web/20140331190645/http://continuum.io/blog/new-advances-in-conda), Ilan Schnell, 2013. + +[^binstar-scitools]: https://anaconda.org/scitools, 2014. + +[^binstar-ioos]: https://anaconda.org/ioos, 2014. + +[^binstar-omnia]: https://anaconda.org/omnia, 2014. + +[^binstar-conda-forge]: https://anaconda.org/conda-forge, 2015. + +[^chatting-ocefpaf]: [Filipe Fernandes on the Evolution of conda-forge](https://www.youtube.com/watch?v=U2oa_RLbTVA), Chatting with the Conda Community #1, 2024. 
+ +[^wheels]: [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) + +[^eggs]: [The Internal Structure of Python Eggs](https://setuptools.pypa.io/en/latest/deprecated/python_eggs.html). + +[^legacy-python-downloads]: [Download Python for Windows (legacy docs)](https://legacy.python.org/download/windows/). diff --git a/community/index.md b/community/index.md index 9c3a3fbab8..dad8f6b50e 100644 --- a/community/index.md +++ b/community/index.md @@ -1,5 +1,5 @@ --- -title: 'conda-forge community' +title: 'The conda-forge community' --- # Community From bea0898d544bb3b960aa8fb198957dff70e632c6 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 10:18:40 +0200 Subject: [PATCH 02/33] pre-commit --- community/history.md | 15 --------------- 1 file changed, 15 deletions(-) diff --git a/community/history.md b/community/history.md index 6b1159b5b0..e2f09a8617 100644 --- a/community/history.md +++ b/community/history.md @@ -29,33 +29,18 @@ In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elso ## References [^cgohlke-shutdown]: [Christoph Gohlke's Windows Wheels site is shutting down by the end of the month](https://www.reddit.com/r/Python/comments/vcaibq/christoph_gohlkes_windows_wheels_site_is_shutting/), 2022. - [^anaconda-history]: [The Early History of the Anaconda Distribution](http://ilan.schnell-web.net/prog/anaconda-history/), Ilan Schnell, 2018. - [^lex-fridman-podcast]: [Travis Oliphant: NumPy, SciPy, Anaconda, Python & Scientific Programming](https://www.youtube.com/watch?v=gFEE3w7F0ww&t=7596s), Lex Fridman Podcast #224, 2022. - [^conda-changelog-1.0]: [`conda` 1.0 release notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. - [^early-conda-build-docs]: [Conda build framework documentation](https://web.archive.org/web/20141006141927/http://conda.pydata.org/docs/build.html), 2014. 
- [^conda-recipes-repo]: [ContinuumIO/conda-recipes](https://github.com/conda-archive/conda-recipes) - [^packaging-and-deployment-with-conda]: [Packaging and deployment with conda](https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda), Travis Oliphant, 2013. - [^new-advances-in-conda]: [New Advances in Conda](https://web.archive.org/web/20140331190645/http://continuum.io/blog/new-advances-in-conda), Ilan Schnell, 2013. - [^binstar-scitools]: https://anaconda.org/scitools, 2014. - [^binstar-ioos]: https://anaconda.org/ioos, 2014. - [^binstar-omnia]: https://anaconda.org/omnia, 2014. - [^binstar-conda-forge]: https://anaconda.org/conda-forge, 2015. - [^chatting-ocefpaf]: [Filipe Fernandes on the Evolution of conda-forge](https://www.youtube.com/watch?v=U2oa_RLbTVA), Chatting with the Conda Community #1, 2024. - [^wheels]: [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) - [^eggs]: [The Internal Structure of Python Eggs](https://setuptools.pypa.io/en/latest/deprecated/python_eggs.html). - [^legacy-python-downloads]: [Download Python for Windows (legacy docs)](https://legacy.python.org/download/windows/). From 4e22a386dcc1641573fe13d07ca0d897cb1a87e9 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 11:02:43 +0200 Subject: [PATCH 03/33] Address some feedback about the early history Co-authored-by: h-vetinari Co-authored-by: pelson Co-authored-by: Jason K. Moore --- community/history.md | 15 ++++++++++++--- 1 file changed, 12 insertions(+), 3 deletions(-) diff --git a/community/history.md b/community/history.md index e2f09a8617..26640d469d 100644 --- a/community/history.md +++ b/community/history.md @@ -6,13 +6,18 @@ title: History conda-forge's origins cannot be explained without understanding the context of Python packaging back in the early 2010s. 
Back then, the installation of Python packages across operating systems was very challenging, especially on Windows, as it often meant compiling dependencies from source.

-Python 2.x was the norm, the community was transitioning from `easy_install` to `pip`, and there wouldn't be an alternative for Python eggs [^eggs] until 2012, when wheels are introduced [^wheels]. To get Python, you'd get the official installers from Python.org, stick to the system-provided one on Linux, or resort to ActiveState's or Enthought's distributions on macOS and Windows [^legacy-python-downloads].
+Python 2.x was the norm. To install it, you'd get the official installers from Python.org, stick to the system-provided one on Linux, or resort to options like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's distributions [^enthought] on macOS and Windows [^legacy-python-downloads].
+
+If you wanted to install additional packages, you would find that the community was transitioning from `easy_install` to `pip`, and there wouldn't be an alternative for Python eggs [^eggs] until 2013, when wheels are formalized [^wheels].
+Realistically, you would have to wait until 2016, when `manylinux` wheels were introduced.
+
+Before then, there was no easy way to ship pre-compiled Python packages: you would need to compile from source. If you were on Windows, Christoph Gohlke's wheels [^cgohlke],[^cgohlke-shutdown] were your only choice.

## The origins of `conda`

In 2012, Continuum Analytics announces Anaconda 0.8 in the SciPy conference [^anaconda-history]. Later that year, in September, Continuum would release `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0]. The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and numpy stacks needed [^packaging-and-deployment-with-conda] [^lex-fridman-podcast].
-In contrast with Python eggs and wheels, conda packages were agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them under each Python package. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code".
+Conda packages were not only able to ship pre-compiled Python packages across platforms. They were agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them under each Python package. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code".
@@ -28,7 +33,8 @@ In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elso

## References

-[^cgohlke-shutdown]: [Christoph Gohlke's Windows Wheels site is shutting down by the end of the month](https://www.reddit.com/r/Python/comments/vcaibq/christoph_gohlkes_windows_wheels_site_is_shutting/), 2022.
+[^cgohlke]: https://www.cgohlke.com/, 2025.
+[^cgohlke-shutdown]: [What to do when Gohlke's python wheel service shuts down?](https://stackoverflow.com/questions/72581592/what-to-do-when-gohlkes-python-wheel-service-shuts-down), 2022.
[^lex-fridman-podcast]: [Travis Oliphant: NumPy, SciPy, Anaconda, Python & Scientific Programming](https://www.youtube.com/watch?v=gFEE3w7F0ww&t=7596s), Lex Fridman Podcast #224, 2022. [^conda-changelog-1.0]: [`conda` 1.0 release notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. @@ -44,3 +50,6 @@ In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elso [^wheels]: [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) [^eggs]: [The Internal Structure of Python Eggs](https://setuptools.pypa.io/en/latest/deprecated/python_eggs.html). [^legacy-python-downloads]: [Download Python for Windows (legacy docs)](https://legacy.python.org/download/windows/). +[^pythonxy]: https://python-xy.github.io/, 2015. +[^activepython]: https://www.activestate.com/platform/supported-languages/python/ +[^enthought]: https://docs.enthought.com/canopy/ From 2af512e8acad59145ef91be1857e225cc605650c Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 11:05:04 +0200 Subject: [PATCH 04/33] Rebrand was in 2017 Co-authored-by: Jonathan Helmus --- community/history.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/community/history.md b/community/history.md index 26640d469d..d5d6c2090f 100644 --- a/community/history.md +++ b/community/history.md @@ -23,7 +23,7 @@ By June 2013, conda is using a SAT solver and includes the `conda build` subcomm By 2015, several institutes and groups were using Binstar to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], UK's [Scitools](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. 
The channel for conda-forge was not created until April 2015 [^binstar-conda-forge], and [Bioconda](https://anaconda.org/bioconda) waited until September of the same year. -In 2015, Continuum Analytics rebranded as Anaconda Inc, and Binstar.org became Anaconda.org. +In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebranded as Anaconda Inc [^anaconda-rebrand]. ## How conda-forge came to be @@ -53,3 +53,4 @@ In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elso [^pythonxy]: https://python-xy.github.io/, 2015. [^activepython]: https://www.activestate.com/platform/supported-languages/python/ [^enthought]: https://docs.enthought.com/canopy/ +[^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, 2017. \ No newline at end of file From d16498561225fd579193e2e17ac99ec6d4fce77f Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 11:05:59 +0200 Subject: [PATCH 05/33] Clarify Scitools Co-authored-by: pelson --- community/history.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/community/history.md b/community/history.md index d5d6c2090f..3652ef7332 100644 --- a/community/history.md +++ b/community/history.md @@ -21,7 +21,7 @@ Conda packages were not only able to ship pre-compiled Python packages across pl By June 2013, conda is using a SAT solver and includes the `conda build` subcommand [^new-advances-in-conda], along with the concept of recipes [^conda-recipes-repo] [^early-conda-build-docs]. This is also when the first Miniconda release is announced. By the end of the year, Continuum Analytics announces Binstar.org, the predecessor of the Anaconda.org channels. This meant that now any user could build their software stack as conda packages and redistribute them online at no cost. 
-By 2015, several institutes and groups were using Binstar to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], UK's [Scitools](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. The channel for conda-forge was not created until April 2015 [^binstar-conda-forge], and [Bioconda](https://anaconda.org/bioconda) waited until September of the same year. +By 2015, several institutes and groups were using Binstar to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. The channel for conda-forge was not created until April 2015 [^binstar-conda-forge], and [Bioconda](https://anaconda.org/bioconda) waited until September of the same year. In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebranded as Anaconda Inc [^anaconda-rebrand]. 
From a3c6f5f1f0b391c468ebe67c3c77699ce58ea025 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 11:14:52 +0200 Subject: [PATCH 06/33] add quote from Travis Co-authored-by: h-vetinari --- community/history.md | 11 ++++++++--- 1 file changed, 8 insertions(+), 3 deletions(-) diff --git a/community/history.md b/community/history.md index 3652ef7332..e6e7cacc30 100644 --- a/community/history.md +++ b/community/history.md @@ -15,9 +15,13 @@ Before then, there was no easy way to ship pre-compiled Python packages: you wou ## The origins of `conda` -In 2012, Continuum Analytics announces Anaconda 0.8 in the SciPy conference [^anaconda-history]. Later that year, in September, Continuum would release `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0]. The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and numpy stacks needed [^packaging-and-deployment-with-conda] [^lex-fridman-podcast]. +In 2012, Continuum Analytics announces Anaconda 0.8 in the SciPy conference [^anaconda-history]. Later that year, in September, Continuum would release `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0]. The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and numpy stacks neede [^packaging-and-deployment-with-conda],[^lex-fridman-podcast]: -Conda packages were not only able to ship pre-compiled Python packages across platforms. They were agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them under each Python package. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code". 
+Travis Oliphant, on [Why I promote conda](https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html) (2013): + +> [...] at the first PyData meetup at Google HQ, where several of us asked Guido what we can do to fix Python packaging for the NumPy stack. Guido's answer was to "solve the problem ourselves". We at Continuum took him at his word. We looked at dpkg, rpm, pip/virtualenv, brew, nixos, and 0installer, and used our past experience with EPD. We thought hard about the fundamental issues, and created the conda package manager and conda environments. + +Conda packages were not only able to ship pre-compiled Python packages across platforms. They were agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code". By June 2013, conda is using a SAT solver and includes the `conda build` subcommand [^new-advances-in-conda], along with the concept of recipes [^conda-recipes-repo] [^early-conda-build-docs]. This is also when the first Miniconda release is announced. By the end of the year, Continuum Analytics announces Binstar.org, the predecessor of the Anaconda.org channels. This meant that now any user could build their software stack as conda packages and redistribute them online at no cost. @@ -53,4 +57,5 @@ In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elso [^pythonxy]: https://python-xy.github.io/, 2015. [^activepython]: https://www.activestate.com/platform/supported-languages/python/ [^enthought]: https://docs.enthought.com/canopy/ -[^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, 2017. \ No newline at end of file +[^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, 2017. 
+[^technical-discovery]: https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html, 2013. From a58f10011659a8ea4cb6ce10f0cbc1b0cea45f9d Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 11:22:39 +0200 Subject: [PATCH 07/33] Add link to binstar announcement in scipy Co-authored-by: Jonathan J. Helmus --- community/history.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/community/history.md b/community/history.md index e6e7cacc30..e2c57145e4 100644 --- a/community/history.md +++ b/community/history.md @@ -23,7 +23,7 @@ Travis Oliphant, on [Why I promote conda](https://technicaldiscovery.blogspot.co Conda packages were not only able to ship pre-compiled Python packages across platforms. They were agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code". -By June 2013, conda is using a SAT solver and includes the `conda build` subcommand [^new-advances-in-conda], along with the concept of recipes [^conda-recipes-repo] [^early-conda-build-docs]. This is also when the first Miniconda release is announced. By the end of the year, Continuum Analytics announces Binstar.org, the predecessor of the Anaconda.org channels. This meant that now any user could build their software stack as conda packages and redistribute them online at no cost. +By June 2013, conda is using a SAT solver and includes the `conda build` subcommand [^new-advances-in-conda], along with the concept of recipes [^conda-recipes-repo] [^early-conda-build-docs]. This is also when the first Miniconda release and Binstar.org [^binstar], the predecessor of the Anaconda.org channels, are announced. This meant that now any user could build their software stack as conda packages and redistribute them online at no cost. 
By 2015, several institutes and groups were using Binstar to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. The channel for conda-forge was not created until April 2015 [^binstar-conda-forge], and [Bioconda](https://anaconda.org/bioconda) waited until September of the same year. @@ -59,3 +59,4 @@ In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elso [^enthought]: https://docs.enthought.com/canopy/ [^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, 2017. [^technical-discovery]: https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html, 2013. +[^binstar]: [SciPy 2013 Lightning Talks, Thu June 27](https://youtu.be/ywHqIEv3xXg?list=PLYx7XA2nY5GeTWcUQTbXVdllyp-Ie3r-y&t=850). From 352d669404053349c71b069058eb5d2867974a94 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 11:50:44 +0200 Subject: [PATCH 08/33] More content in "how conda-forge came to be" Co-authored-by: pelson --- community/history.md | 43 +++++++++++++++++++++++++------------------ 1 file changed, 25 insertions(+), 18 deletions(-) diff --git a/community/history.md b/community/history.md index e2c57145e4..c44c97828f 100644 --- a/community/history.md +++ b/community/history.md @@ -23,40 +23,47 @@ Travis Oliphant, on [Why I promote conda](https://technicaldiscovery.blogspot.co Conda packages were not only able to ship pre-compiled Python packages across platforms. They were agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them. 
This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code". -By June 2013, conda is using a SAT solver and includes the `conda build` subcommand [^new-advances-in-conda], along with the concept of recipes [^conda-recipes-repo] [^early-conda-build-docs]. This is also when the first Miniconda release and Binstar.org [^binstar], the predecessor of the Anaconda.org channels, are announced. This meant that now any user could build their software stack as conda packages and redistribute them online at no cost. +By June 2013, conda is using a SAT solver and includes the `conda build` subcommand [^new-advances-in-conda]. This is also when the first Miniconda release and Binstar.org [^binstar], the predecessor of the Anaconda.org channels, are announced. This meant that now any user could build their software stack as conda packages and redistribute them online at no cost. -By 2015, several institutes and groups were using Binstar to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. The channel for conda-forge was not created until April 2015 [^binstar-conda-forge], and [Bioconda](https://anaconda.org/bioconda) waited until September of the same year. +With `conda build` came along the concept of recipes [^early-conda-build-docs]. The [`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes) repository was _the_ +place where people would contribute their conda recipes. It was really successful, but the recipes were of various quality, and typically only worked on one or two platforms. 
There was a high chance that a recipe you found there would no longer build, and you had to tweak it to get it to work. In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebranded as Anaconda Inc [^anaconda-rebrand]. ## How conda-forge came to be -In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) get in touch [^chatting-ocefpaf]. They are maintaining the Binstar channels for IOOS and Scitools, respectively. +By 2015, several institutes and groups were using Binstar/Anaconda.org to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. + +In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) are maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had implemented CI pipelines and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a healthy exchange of recipes between the two groups. It was a successful collaborative effort, but it was inefficient since they were working in separate repos, duplicated recipes, etc. + +Given the success of the `ContinuumIO/conda-recipes` repository, it was obvious there was a demand for high quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` is registered as a Github organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. + ## References -[^cgohlke]: https://www.cgohlke.com/, 2025. 
-[^cgohlke-shutdown]: [What to do when Gohlke's python wheel service shuts down?](https://stackoverflow.com/questions/72581592/what-to-do-when-gohlkes-python-wheel-service-shuts-down), 2022. +[^activepython]: https://www.activestate.com/platform/supported-languages/python/ [^anaconda-history]: [The Early History of the Anaconda Distribution](http://ilan.schnell-web.net/prog/anaconda-history/), Ilan Schnell, 2018. -[^lex-fridman-podcast]: [Travis Oliphant: NumPy, SciPy, Anaconda, Python & Scientific Programming](https://www.youtube.com/watch?v=gFEE3w7F0ww&t=7596s), Lex Fridman Podcast #224, 2022. -[^conda-changelog-1.0]: [`conda` 1.0 release notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. -[^early-conda-build-docs]: [Conda build framework documentation](https://web.archive.org/web/20141006141927/http://conda.pydata.org/docs/build.html), 2014. -[^conda-recipes-repo]: [ContinuumIO/conda-recipes](https://github.com/conda-archive/conda-recipes) -[^packaging-and-deployment-with-conda]: [Packaging and deployment with conda](https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda), Travis Oliphant, 2013. -[^new-advances-in-conda]: [New Advances in Conda](https://web.archive.org/web/20140331190645/http://continuum.io/blog/new-advances-in-conda), Ilan Schnell, 2013. -[^binstar-scitools]: https://anaconda.org/scitools, 2014. +[^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, 2017. +[^binstar-conda-forge]: https://anaconda.org/conda-forge, 2015. [^binstar-ioos]: https://anaconda.org/ioos, 2014. [^binstar-omnia]: https://anaconda.org/omnia, 2014. -[^binstar-conda-forge]: https://anaconda.org/conda-forge, 2015. +[^binstar-scitools]: https://anaconda.org/scitools, 2014. +[^binstar]: [SciPy 2013 Lightning Talks, Thu June 27](https://youtu.be/ywHqIEv3xXg?list=PLYx7XA2nY5GeTWcUQTbXVdllyp-Ie3r-y&t=850). 
+[^cgohlke-shutdown]: [What to do when Gohlke's python wheel service shuts down?](https://stackoverflow.com/questions/72581592/what-to-do-when-gohlkes-python-wheel-service-shuts-down), 2022. +[^cgohlke]: https://www.cgohlke.com/, 2025. [^chatting-ocefpaf]: [Filipe Fernandes on the Evolution of conda-forge](https://www.youtube.com/watch?v=U2oa_RLbTVA), Chatting with the Conda Community #1, 2024. -[^wheels]: [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) +[^conda-changelog-1.0]: [`conda` 1.0 release notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. +[^conda-recipes-repo]: [ContinuumIO/conda-recipes](https://github.com/conda-archive/conda-recipes) +[^early-conda-build-docs]: [Conda build framework documentation](https://web.archive.org/web/20141006141927/http://conda.pydata.org/docs/build.html), 2014. [^eggs]: [The Internal Structure of Python Eggs](https://setuptools.pypa.io/en/latest/deprecated/python_eggs.html). +[^enthought]: https://docs.enthought.com/canopy/ +[^github-api-conda-forge]: https://api.github.com/orgs/conda-forge [^legacy-python-downloads]: [Download Python for Windows (legacy docs)](https://legacy.python.org/download/windows/). +[^lex-fridman-podcast]: [Travis Oliphant: NumPy, SciPy, Anaconda, Python & Scientific Programming](https://www.youtube.com/watch?v=gFEE3w7F0ww&t=7596s), Lex Fridman Podcast #224, 2022. +[^new-advances-in-conda]: [New Advances in Conda](https://web.archive.org/web/20140331190645/http://continuum.io/blog/new-advances-in-conda), Ilan Schnell, 2013. +[^packaging-and-deployment-with-conda]: [Packaging and deployment with conda](https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda), Travis Oliphant, 2013. [^pythonxy]: https://python-xy.github.io/, 2015. 
-[^activepython]: https://www.activestate.com/platform/supported-languages/python/ -[^enthought]: https://docs.enthought.com/canopy/ -[^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, 2017. [^technical-discovery]: https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html, 2013. -[^binstar]: [SciPy 2013 Lightning Talks, Thu June 27](https://youtu.be/ywHqIEv3xXg?list=PLYx7XA2nY5GeTWcUQTbXVdllyp-Ie3r-y&t=850). +[^wheels]: [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) From 96d4d5c15a82d3507f5d4b96a084f434cf1c8614 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 12:02:45 +0200 Subject: [PATCH 09/33] More details about the state of wheels in 2013 --- community/history.md | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/community/history.md b/community/history.md index c44c97828f..fbd99d56fe 100644 --- a/community/history.md +++ b/community/history.md @@ -8,10 +8,11 @@ conda-forge's origins cannot be explained without understanding the context of P Python 2.x was the norm. To install it, you'd get the official installers from Python.org, stick to the system provided one in Linux, or resort to options like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's distributions [^enthought] in macOS and Windows [^legacy-python-downloads]. -If you wanted to install additional packages, you would find that the community was transitioning from `easy_install` to `pip`, and there wouldn't be an alternative for Python eggs [^eggs] until 2013, when wheels are formalized [^wheels]. -Realistically, you would have to wait until 2016, when `manylinux` wheels were introduced. For Windows, +If you wanted to install additional packages, you would find that the community was transitioning from `easy_install` to `pip`, and there was no easy way to ship or install pre-compiled Python packages. 
There wouldn't be an alternative for Python eggs [^eggs] until 2013, when wheels are formalized [^wheels]. These were useful for Windows, where Christoph Gohlke's wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. -Before then, there was no easy way to ship pre-compiled Python packages: you would need to compile from source. If you were on Windows, Christoph Gohlke's wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. +However, for Linux, you would have to wait until 2016, when `manylinux` wheels were introduced. Before then PyPI wouldn't even allow compiled Linux wheels and your only alternative was to compile every package from source. + +As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The build distributions section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. But if you pay attention, the `manylinux1` wheels were not uploaded until Apr 2016. No mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! ## The origins of `conda` @@ -38,7 +39,6 @@ In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elso Given the success of the `ContinuumIO/conda-recipes` repository, it was obvious there was a demand for high quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` is registered as a Github organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. 
- ## References From 17e59820ae84f5e5c3cc4b2560e87943acd6f5b8 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 12:09:56 +0200 Subject: [PATCH 10/33] Mention Gohlke and Cournapeau Co-authored-by: pelson --- community/history.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/community/history.md b/community/history.md index fbd99d56fe..e900f7af19 100644 --- a/community/history.md +++ b/community/history.md @@ -35,9 +35,9 @@ In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebran By 2015, several institutes and groups were using Binstar/Anaconda.org to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. -In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) are maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had implemented CI pipelines and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a healthy exchange of recipes between the two groups. It was a successful collaborative effort, but it was inefficient since they were working in separate repos, duplicated recipes, etc. +In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) are maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had implemented CI pipelines and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. 
There was also a healthy exchange of recipes between the two groups, very often with the assistance of folks in other communities. For example, Christophe Gohlke and David Cournapeau were instrumental to get Windows builds of the whole SciPy stack working on AppVeyor. -Given the success of the `ContinuumIO/conda-recipes` repository, it was obvious there was a demand for high quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` is registered as a Github organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. +It was a successful collaborative effort, but it was inefficient since they were working in separate repos, duplicated recipes, etc. Given the success of the `ContinuumIO/conda-recipes` repository, it was obvious there was a demand for high quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` is registered as a Github organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. 
From ca6a3b41684359e3ff49bfa0cf43bd98f4d02fb9 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 12:39:37 +0200 Subject: [PATCH 11/33] Add link to SciTool's .travis.yml --- community/history.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/community/history.md b/community/history.md index e900f7af19..af5e954c50 100644 --- a/community/history.md +++ b/community/history.md @@ -35,7 +35,7 @@ In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebran By 2015, several institutes and groups were using Binstar/Anaconda.org to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. -In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) are maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had implemented CI pipelines and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a healthy exchange of recipes between the two groups, very often with the assistance of folks in other communities. For example, Christophe Gohlke and David Cournapeau were instrumental to get Windows builds of the whole SciPy stack working on AppVeyor. +In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) are maintaining the Binstar channels for IOOS and SciTools, respectively. 
Phil had [implemented CI pipelines](https://github.com/SciTools/conda-recipes-scitools/blob/995fc231967719db0dd6321ba8a502390a2f192c/.travis.yml) and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a healthy exchange of recipes between the two groups, very often with the assistance of folks in other communities. For example, Christophe Gohlke and David Cournapeau were instrumental to get Windows builds of the whole SciPy stack working on AppVeyor. It was a successful collaborative effort, but it was inefficient since they were working in separate repos, duplicated recipes, etc. Given the success of the `ContinuumIO/conda-recipes` repository, it was obvious there was a demand for high quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` is registered as a Github organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. From 6e8b282848fae3d9c28bb8eac6816b823be99cf9 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 13:49:29 +0200 Subject: [PATCH 12/33] typo --- community/history.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/community/history.md b/community/history.md index af5e954c50..fe08405182 100644 --- a/community/history.md +++ b/community/history.md @@ -16,7 +16,7 @@ As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https:/ ## The origins of `conda` -In 2012, Continuum Analytics announces Anaconda 0.8 in the SciPy conference [^anaconda-history]. Later that year, in September, Continuum would release `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0]. 
The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and numpy stacks neede [^packaging-and-deployment-with-conda],[^lex-fridman-podcast]: +In 2012, Continuum Analytics announces Anaconda 0.8 in the SciPy conference [^anaconda-history]. Later that year, in September, Continuum would release `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0]. The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and numpy stacks needed [^packaging-and-deployment-with-conda],[^lex-fridman-podcast]. Travis Oliphant, on [Why I promote conda](https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html) (2013): From 1583e8ba0abbe4fc571d787c46e72e464add4856 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 14:08:03 +0200 Subject: [PATCH 13/33] Improve language and grammar --- community/history.md | 26 +++++++++++++------------- 1 file changed, 13 insertions(+), 13 deletions(-) diff --git a/community/history.md b/community/history.md index fe08405182..0e5f449e23 100644 --- a/community/history.md +++ b/community/history.md @@ -4,40 +4,40 @@ title: History # History of conda-forge -conda-forge's origins cannot be explained without understanding the context of Python packaging back in the early 2010s. Back then, the installation of Python packages across operating systems was very challenging, specially on Windows, as it often meant compiling dependencies from source. +conda-forge's origins are best understood in the context of Python packaging back in the early 2010s. Back then, the installation of Python packages across operating systems was very challenging, especially on Windows, as it often meant compiling dependencies from source. -Python 2.x was the norm. 
To install it, you'd get the official installers from Python.org, stick to the system provided one in Linux, or resort to options like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's distributions [^enthought] in macOS and Windows [^legacy-python-downloads]. +Python 2.x was the norm. To install it, you'd get the official installers from Python.org, use the system-provided interpreter in Linux, or resort to options like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's distributions [^enthought] in macOS and Windows [^legacy-python-downloads]. -If you wanted to install additional packages, you would find that the community was transitioning from `easy_install` to `pip`, and there was no easy way to ship or install pre-compiled Python packages. There wouldn't be an alternative for Python eggs [^eggs] until 2013, when wheels are formalized [^wheels]. These were useful for Windows, where Christoph Gohlke's wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. +If you wanted to install additional packages, the community was transitioning from `easy_install` to `pip`, and there was no easy way to ship or install pre-compiled Python packages. An alternative to Python eggs [^eggs] wouldn't emerge until 2013 with the formalization of wheels [^wheels]. These were useful for Windows, where Christoph Gohlke's wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. -However, for Linux, you would have to wait until 2016, when `manylinux` wheels were introduced. Before then PyPI wouldn't even allow compiled Linux wheels and your only alternative was to compile every package from source. +However, for Linux, you would have to wait until 2016, when `manylinux` wheels were introduced. Before then, PyPI wouldn't even allow compiled Linux wheels and your only alternative was to compile every package from source. 
-As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The build distributions section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. But if you pay attention, the `manylinux1` wheels were not uploaded until Apr 2016. No mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! +As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The build distributions section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April 2016. There was no mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! ## The origins of `conda` -In 2012, Continuum Analytics announces Anaconda 0.8 in the SciPy conference [^anaconda-history]. Later that year, in September, Continuum would release `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0]. The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and numpy stacks needed [^packaging-and-deployment-with-conda],[^lex-fridman-podcast]. +In 2012, Continuum Analytics announced Anaconda 0.8 at the SciPy conference [^anaconda-history]. Later that year, in September, Continuum released `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0]. 
The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and NumPy stacks needed [^packaging-and-deployment-with-conda],[^lex-fridman-podcast]. Travis Oliphant, on [Why I promote conda](https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html) (2013): > [...] at the first PyData meetup at Google HQ, where several of us asked Guido what we can do to fix Python packaging for the NumPy stack. Guido's answer was to "solve the problem ourselves". We at Continuum took him at his word. We looked at dpkg, rpm, pip/virtualenv, brew, nixos, and 0installer, and used our past experience with EPD. We thought hard about the fundamental issues, and created the conda package manager and conda environments. -Conda packages were not only able to ship pre-compiled Python packages across platforms. They were agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code". +Conda packages could not only ship pre-compiled Python packages across platforms but were also agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code". -By June 2013, conda is using a SAT solver and includes the `conda build` subcommand [^new-advances-in-conda]. This is also when the first Miniconda release and Binstar.org [^binstar], the predecessor of the Anaconda.org channels, are announced. This meant that now any user could build their software stack as conda packages and redistribute them online at no cost. 
+By June 2013, conda was using a SAT solver and included the `conda build` subcommand [^new-advances-in-conda]. This is also when the first Miniconda release and Binstar.org [^binstar], the predecessor of the Anaconda.org channels, were announced. This meant that any user could build their software stack as conda packages and redistribute them online at no cost. -With `conda build` came along the concept of recipes [^early-conda-build-docs]. The [`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes) repository was _the_ -place where people would contribute their conda recipes. It was really successful, but the recipes were of various quality, and typically only worked on one or two platforms. There was a high chance that a recipe you found there would no longer build, and you had to tweak it to get it to work. +With `conda build` came along the concept of recipes [^early-conda-build-docs]. The [`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes) repository became _the_ central +place where people would contribute their conda recipes. While successful, the recipes varied in quality, and typically only worked on one or two platforms. It was common to find recipes that would no longer build, and you had to tweak it to get it to work. In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebranded as Anaconda Inc [^anaconda-rebrand]. ## How conda-forge came to be -By 2015, several institutes and groups were using Binstar/Anaconda.org to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. 
+By 2015, several institutes and groups were using Binstar/Anaconda.org to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], the UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. -In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) are maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had [implemented CI pipelines](https://github.com/SciTools/conda-recipes-scitools/blob/995fc231967719db0dd6321ba8a502390a2f192c/.travis.yml) and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a healthy exchange of recipes between the two groups, very often with the assistance of folks in other communities. For example, Christophe Gohlke and David Cournapeau were instrumental to get Windows builds of the whole SciPy stack working on AppVeyor. +In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) were maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had [implemented CI pipelines](https://github.com/SciTools/conda-recipes-scitools/blob/995fc231967719db0dd6321ba8a502390a2f192c/.travis.yml) and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a healthy exchange of recipes between the two groups, often assisted by members of other communities. For example, Christophe Gohlke and David Cournapeau were instrumental in getting Windows builds of the whole SciPy stack to work on AppVeyor. 
It was a successful collaborative effort, but it was inefficient since they were working in separate repos, duplicated recipes, etc. Given the success of the `ContinuumIO/conda-recipes` repository, it became clear there was a demand for high-quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` was registered as a GitHub organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. From d881fe5c494326f65eb5092ce6a2518cfe2091c6 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Fri, 11 Apr 2025 14:19:56 +0200 Subject: [PATCH 14/33] add manylinux link --- community/history.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/community/history.md b/community/history.md index 0e5f449e23..77b40b9a28 100644 --- a/community/history.md +++ b/community/history.md @@ -8,11 +8,11 @@ conda-forge's origins are best understood in the context of Python packaging bac Python 2.x was the norm. To install it, you'd get the official installers from Python.org, use the system-provided interpreter in Linux, or resort to options like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's distributions [^enthought] in macOS and Windows [^legacy-python-downloads]. -If you wanted to install additional packages, the community was transitioning from `easy_install` to `pip`, and there was no easy way to ship or install pre-compiled Python packages.
An alternative to Python eggs [^eggs] wouldn't emerge until 2013 with the formalization of wheels [^wheels]. These were useful for Windows, where Christoph Gohlke's wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. +If you wanted to install additional packages, the community was transitioning from `easy_install` to `pip`, and there was no easy way to ship or install pre-compiled Python packages. An alternative to Python eggs [^eggs] wouldn't emerge until 2013 with the formalization of wheels [^wheels]. These were useful for Windows, where Christoph Gohlke's exes and wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. -However, for Linux, you would have to wait until 2016, when `manylinux` wheels were introduced. Before then, PyPI wouldn't even allow compiled Linux wheels and your only alternative was to compile every package from source. +However, for Linux, you would have to wait until 2016, when [`manylinux` wheels were introduced](https://peps.python.org/pep-0513/). Before then, PyPI wouldn't even allow compiled Linux wheels and your only alternative was to compile every package from source. -As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The build distributions section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April 2016. There was no mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! +As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The "Built Distributions" section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April 2016. There was no mention whatsoever of macOS. 
Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! ## The origins of `conda` From 5cf119e1a47d893d651229b764dcf58f24c4e2fa Mon Sep 17 00:00:00 2001 From: Mike Sarahan Date: Mon, 14 Apr 2025 10:56:49 -0500 Subject: [PATCH 15/33] flesh out continuum history (#1) --- community/history.md | 32 ++++++++++++++++++++++++-------- 1 file changed, 24 insertions(+), 8 deletions(-) diff --git a/community/history.md b/community/history.md index 77b40b9a28..f32bb3212a 100644 --- a/community/history.md +++ b/community/history.md @@ -2,11 +2,11 @@ title: History --- -# History of conda-forge +# Context of binary packaging for Python conda-forge's origins are best understood in the context of Python packaging back in the early 2010s. Back then, the installation of Python packages across operating systems was very challenging, especially on Windows, as it often meant compiling dependencies from source. -Python 2.x was the norm. To install it, you'd get the official installers from Python.org, use the system-provided interpreter in Linux, or resort to options like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's distributions [^enthought] in macOS and Windows [^legacy-python-downloads]. +Python 2.x was the norm. To install it, you'd get the official installers from Python.org, use the system-provided interpreter in Linux, or resort to options like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's distributions (EPD, later Canopy) [^enthought] in macOS and Windows [^legacy-python-downloads]. If you wanted to install additional packages, the community was transitioning from `easy_install` to `pip`, and there was no easy way to ship or install pre-compiled Python packages. An alternative to Python eggs [^eggs] wouldn't emerge until 2013 with the formalization of wheels [^wheels]. 
These were useful for Windows, where Christoph Gohlke's exes and wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. @@ -14,30 +14,46 @@ However, for Linux, you would have to wait until 2016, when [`manylinux` wheels were introduced](https://peps.python.org/pep-0513/). Before then, PyPI wouldn't even allow compiled Linux wheels and your only alternative was to compile every package from source. As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The "Built Distributions" section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April 2016. There was no mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! +The reason why it was hard to find packages for a specific system, and why compilation was the preferred option for many, is binary compatibility. Binary compatibility is the window within which a given combination of compiler version, core libraries such as glibc, and dependency libraries present on the build machine remains compatible with destination systems. Linux distributions achieve this by freezing compiler versions and library versions for a particular release cycle. Windows achieves this relatively easily because Python standardized on particular Visual Studio compiler versions for each Python release. Whereas a Windows package executable was reliably redistributable across versions of Windows, so long as the Python version was the same, Linux presented a more difficult target because it was (and is) so much harder to account for all of the little details that must line up. + ## The origins of `conda` -In 2012, Continuum Analytics announced Anaconda 0.8 at the SciPy conference [^anaconda-history]. Later that year, in September, Continuum released `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0].
The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and NumPy stacks needed [^packaging-and-deployment-with-conda],[^lex-fridman-podcast]. +In 2012, Continuum Analytics announced Anaconda 0.8 at the SciPy conference [^anaconda-history]. Anaconda was a distribution of scientifically-oriented packages, but did not yet have tools for managing individual packages. Later that year, in September, Continuum released `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0]. The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and NumPy stacks needed [^packaging-and-deployment-with-conda],[^lex-fridman-podcast]. Travis Oliphant, on [Why I promote conda](https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html) (2013): -> [...] at the first PyData meetup at Google HQ, where several of us asked Guido what we can do to fix Python packaging for the NumPy stack. Guido's answer was to "solve the problem ourselves". We at Continuum took him at his word. We looked at dpkg, rpm, pip/virtualenv, brew, nixos, and 0installer, and used our past experience with EPD. We thought hard about the fundamental issues, and created the conda package manager and conda environments. +> [...] at the first PyData meetup at Google HQ, where several of us asked Guido what we can do to fix Python packaging for the NumPy stack. Guido's answer was to "solve the problem ourselves". We at Continuum took him at his word. We looked at dpkg, rpm, pip/virtualenv, brew, nixos, and 0installer, and used our past experience with EPD [Enthought Python Distribution]. We thought hard about the fundamental issues, and created the conda package manager and conda environments. 
Conda packages could not only ship pre-compiled Python packages across platforms but were also agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code". -By June 2013, conda was using a SAT solver and included the `conda build` subcommand [^new-advances-in-conda]. This is also when the first Miniconda release and Binstar.org [^binstar], the predecessor of the Anaconda.org channels, were announced. This meant that any user could build their software stack as conda packages and redistribute them online at no cost. +By June 2013, conda was using a SAT solver and included the `conda build` tool [^new-advances-in-conda] for community users outside of Continuum to build their own conda packages. This is also when the first Miniconda release and Binstar.org [^binstar], a site for hosting arbitrary user-built conda packages, were announced. Miniconda provided a minimal base environment that users could populate themselves, and Binstar.org gave any user an easy platform for redistributing their packages. All of the conda tools and Binstar/Anaconda.org have been free (as in beer), with some paid options on Binstar/Anaconda.org for more storage. With `conda build` came along the concept of recipes [^early-conda-build-docs]. The [`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes) repository became _the_ central -place where people would contribute their conda recipes. While successful, the recipes varied in quality, and typically only worked on one or two platforms. It was common to find recipes that would no longer build, and you had to tweak it to get it to work. +place where people would contribute their conda recipes. This was separate from Anaconda's package recipes, which were private at this point. 
While successful, the recipes varied in quality, and typically only worked on one or two platforms. There was no CI for any recipes to help keep them working. It was common to find recipes that would no longer build, and you had to tweak them to get them to work. In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebranded as Anaconda Inc [^anaconda-rebrand]. ## How conda-forge came to be -By 2015, several institutes and groups were using Binstar/Anaconda.org to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], the UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. +By 2015, several institutes and groups were using Binstar/Anaconda.org to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], the UK Met Office-supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. Although each channel was building conda packages, the binary compatibility between channels was unpredictable. In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) were maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had [implemented CI pipelines](https://github.com/SciTools/conda-recipes-scitools/blob/995fc231967719db0dd6321ba8a502390a2f192c/.travis.yml) and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS.
There was also a healthy exchange of recipes between the two groups, often assisted by members of other communities. For example, Christoph Gohlke and David Cournapeau were instrumental in getting Windows builds of the whole SciPy stack to work on AppVeyor. -It was a successful collaborative effort, but it was inefficient since they were working in separate repos, duplicated recipes, etc. Given the success of the `ContinuumIO/conda-recipes` repository, it became clear there was a demand for high quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` was registered as a Github organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. +There was a lot of cross-pollination between projects/channels, but working in separate repos, duplicated recipes, and differing build toolchains. Given the success of the `ContinuumIO/conda-recipes` repository, it became clear there was a demand for high quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` was registered as a Github organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. + +## Meanwhile at Continuum + +It's a little strange to describe Continuum/Anaconda's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. During this time, Continuum (especially Ilan Schnell) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. The packages made from these internal recipes were available on the "free" channel, which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers.
Concurrently, Aaron Meurer led the conda and conda-build projects, contributed many recipes to the conda-recipes repository and built many packages on his "asmeurer" binstar.org channel. Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz to fill this role. Kale had huge ambitions for conda, but conda-build was not as much of a priority for him. Michael Sarahan stepped in to maintain Conda-build. + +In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in Conda-Forge. Ray Donnelly joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. This reliance was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating. + +Around this point in time, GCC 5 arrived with a breaking change in libstdc++. These changes, among other compiler updates, began to make the CentOS 5 toolchain troublesome. Cutting edge packages, such as the nascent TensorFlow project, required cumbersome patching to work with the older toolchain, if they worked at all. There was strong pressure from the community to update the ecosystem (i.e. the toolchain, and implicitly everything built with it). There were two prevailing options. One was Red Hat's devtoolset. 
This used an older GCC version which statically linked the newer libstdc++ parts into binaries, so that libstdc++ updates were not necessary on end user systems. The other was to build GCC ourselves, and to ship the newer libstdc++ library as a conda package. This was a community decision, and it was split roughly down the middle. In the end, the community decided to take the latter route, for the sake of greater control over updating to the latest toolchains, instead of having to rely on Red Hat. One major advantage of providing our own toolchain was that we could provide the toolchain as a conda package instead of a system dependency, so we could now express toolchain requirements in our recipes and have better control over compiler flags and behavior. + +As more and more conflicts with `free` channel packages occurred, conda-forge gradually added more and more of their own core dependency packages to avoid those breakages. At the same time, Continuum was working on two contracts that would prove revolutionary. Samsung wanted to use Conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development to conda-build that greatly expanded its ability to support explicit variants of recipes. Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Continuum would move all of their internal recipes into public-facing GitHub repositories. Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as conda-forge, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. 
Continuum would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. The combination of these conda-forge based recipes and the new toolchain are what made up the `main` channel, which was also part of `defaults`. The `main` channel represented a major step forward in keeping conda-forge and Continuum aligned, which equates to smooth operation and happy users. + + + + From 271b95ec26860ac3de42913db268aefc4a587015 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Mon, 14 Apr 2025 18:13:31 +0200 Subject: [PATCH 16/33] cosmetic adjustments --- community/history.md | 22 ++++++++++++++++------ 1 file changed, 16 insertions(+), 6 deletions(-) diff --git a/community/history.md b/community/history.md index f32bb3212a..d084fc1b5e 100644 --- a/community/history.md +++ b/community/history.md @@ -2,7 +2,11 @@ title: History --- -# Context of binary packaging for Python +# History of conda-forge + + + +## Context of binary packaging for Python conda-forge's origins are best understood in the context of Python packaging back in the early 2010s. Back then, the installation of Python packages across operating systems was very challenging, especially on Windows, as it often meant compiling dependencies from source. @@ -14,7 +18,7 @@ However, for Linux, you would have to wait until 2016, when [`manylinux` wheels As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The "Built Distributions" section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April 2016. There was no mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! 
-The reason why it is hard to find packages for a specific system, and why compilation was the preferred option for many, is binary compatibility. Binary compatibility is a window of compatibility where each combination of compiler version, core libraries such as `glibc`, and dependency libraries present on the build machine is compatible on destination systems. Linux distributions achieve this by freezing compiler versions and library versions for a particular release cycle. Windows achieves this relatively easily because Python standardized on particular Visual Studio compiler versions for each Python release. Whereas a Windows package executable was reliably redistributable across versions of Windows, so long as the Python version was the same, Linux presented a more difficult target because it was (and is) so much harder to account for all of the little details that must line up.
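Python's wheel filename tags encode exactly this compatibility window. As a minimal sketch (not a full PEP 427 parser: wheel names may also carry an optional build tag, giving six fields instead of five), using an illustrative `numpy` filename:

```python
def parse_wheel_name(filename):
    """Split a five-field wheel filename into its PEP 427 compatibility tags."""
    stem = filename.removesuffix(".whl")
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {"name": name, "version": version, "python": python_tag,
            "abi": abi_tag, "platform": platform_tag}

# The platform tag is where manylinux (and the Windows/macOS equivalents)
# declare which binary-compatibility window the wheel was built for.
tags = parse_wheel_name("numpy-1.11.0-cp35-cp35m-manylinux1_x86_64.whl")
print(tags["platform"])  # manylinux1_x86_64
```

A `manylinux1` platform tag promises the wheel needs nothing newer than a suitably old `glibc` and friends, which is the same freezing strategy Linux distributions use for a release cycle.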
## The origins of `conda` @@ -43,13 +47,15 @@ There was a lot of cross-pollination between projects/channels, but working in s ## Meanwhile at Continuum -It's a little strange to describe Continuum/Anaconda's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. During this time, Continuum (especially Ilan Schnell) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. The packages made from these internal recipes were available on the "free" channel, which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers. Concurrently, Aaron Meurer led the conda and conda-build projects, contributed many recipes to the conda-recipes repository and built many packages on his "asmeurer" binstar.org channel. Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz to fill this role. Kale had huge ambitions for conda, but conda-build was not as much of a priority for him. Michael Sarahan stepped in to maintain Conda-build. +It's a little strange to describe Continuum Analytics/Anaconda's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. During this time, Continuum (especially Ilan Schnell) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. 
The packages made from these internal recipes were available on the [`free` channel][free-channel], which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers. -In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in Conda-Forge. Ray Donnelly joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. This reliance was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating. +Concurrently, Aaron Meurer led the `conda` and `conda-build` projects, contributed many recipes to the `conda-recipes` repository and built many packages on his `asmeurer` binstar.org channel. Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz to fill this role. Kale had huge ambitions for `conda`, but `conda-build` was not as much of a priority for him. Michael Sarahan stepped in to maintain `conda-build`. -Around this point in time, GCC 5 arrived with a breaking change in libstdc++. These changes, among other compiler updates, began to make the CentOS 5 toolchain troublesome. Cutting edge packages, such as the nascent TensorFlow project, required cumbersome patching to work with the older toolchain, if they worked at all. 
There was strong pressure from the community to update the ecosystem (i.e. the toolchain, and implicitly everything built with it). There were two prevailing options. One was Red Hat's devtoolset. This used an older GCC version which statically linked the newer libstdc++ parts into binaries, so that libstdc++ updates were not necessary on end user systems. The other was to build GCC ourselves, and to ship the newer libstdc++ library as a conda package. This was a community decision, and it was split roughly down the middle. In the end, the community decided to take the latter route, for the sake of greater control over updating to the latest toolchains, instead of having to rely on Red Hat. One major advantage of providing our own toolchain was that we could provide the toolchain as a conda package instead of a system dependency, so we could now express toolchain requirements in our recipes and have better control over compiler flags and behavior. +In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in conda-forge. Ray Donnelly joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. This reliance was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of `conda-forge` and `defaults` channel packages working by reducing possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating. 
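In today's conda-build (version 3 and later, so syntax that postdates the period described here), that design surfaces directly in recipes: the compiler is requested like any other package. A minimal, hypothetical `meta.yaml` excerpt sketching the idea:

```yaml
# Hypothetical recipe excerpt: the compiler comes from a conda package
# (e.g. gcc_linux-64 on linux-64), not from the build machine.
requirements:
  build:
    - {{ compiler('c') }}
  host:
    - zlib  # illustrative dependency, built against the same toolchain
```

Because the toolchain is a package, pinning and compiler flags can be controlled centrally instead of depending on whatever the build machine happens to provide.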
-As more and more conflicts with `free` channel packages occurred, conda-forge gradually added more and more of their own core dependency packages to avoid those breakages. At the same time, Continuum was working on two contracts that would prove revolutionary. Samsung wanted to use Conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development to conda-build that greatly expanded its ability to support explicit variants of recipes. Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Continuum would move all of their internal recipes into public-facing GitHub repositories. Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as conda-forge, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. Continuum would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. The combination of these conda-forge based recipes and the new toolchain are what made up the `main` channel, which was also part of `defaults`. The `main` channel represented a major step forward in keeping conda-forge and Continuum aligned, which equates to smooth operation and happy users. +Around this point in time, [GCC 5 arrived][gcc-5] with a breaking change in `libstdc++`. These changes, among other compiler updates, began to make the CentOS 5 toolchain troublesome. Cutting edge packages, such as the nascent TensorFlow project, required cumbersome patching to work with the older toolchain, if they worked at all. There was strong pressure from the community to update the ecosystem (i.e. 
the toolchain, and implicitly everything built with it). There were two prevailing options. One was Red Hat's `devtoolset`. This used an older GCC version which statically linked the newer `libstdc++` parts into binaries, so that `libstdc++` updates were not necessary on end user systems. The other was to build GCC ourselves, and to ship the newer `libstdc++` library as a conda package. This was a community decision, and it was split roughly down the middle. In the end, the community decided to take the latter route, for the sake of greater control over updating to the latest toolchains, instead of having to rely on Red Hat. One major advantage of providing our own toolchain was that we could provide the toolchain as a conda package instead of a system dependency, so we could now express toolchain requirements in our recipes and have better control over compiler flags and behavior. +As more and more conflicts with `free` channel packages occurred, conda-forge gradually added more and more of their own core dependency packages to avoid those breakages. At the same time, Continuum was working on two contracts that would prove revolutionary. Samsung wanted to use conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development of `conda-build` that greatly expanded its ability to support explicit variants of recipes. Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Continuum would move all of their internal recipes into public-facing GitHub repositories. Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as conda-forge, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes.
Continuum would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. The combination of these `conda-forge`-based recipes and the new toolchain is what made up the `main` channel, which was also part of `defaults`. The `main` channel represented a major step forward in keeping conda-forge and Continuum aligned, which equated to smooth operation and happy users. @@ -83,3 +89,7 @@ As more and more conflicts with `free` channel packages occurred, conda-forge gr [^pythonxy]: https://python-xy.github.io/, 2015. [^technical-discovery]: https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html, 2013. [^wheels]: [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) + + +[free-channel]: https://anaconda.org/free +[gcc-5]: https://gcc.gnu.org/gcc-5/changes.html From 8a9e5b9724ebb2c390a2322e98201f29d31a92a5 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Mon, 14 Apr 2025 18:14:30 +0200 Subject: [PATCH 17/33] pre-commit --- community/history.md | 1 + 1 file changed, 1 insertion(+) diff --git a/community/history.md b/community/history.md index d084fc1b5e..2b4735e7f3 100644 --- a/community/history.md +++ b/community/history.md @@ -91,5 +91,6 @@ As more and more conflicts with `free` channel packages occurred, conda-forge gr [^wheels]: [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) + [free-channel]: https://anaconda.org/free [gcc-5]: https://gcc.gnu.org/gcc-5/changes.html From 9930e5b5175c5143c28e62163499624e29c88521 Mon Sep 17 00:00:00 2001 From: Michael Sarahan Date: Mon, 14 Apr 2025 11:18:13 -0500 Subject: [PATCH 18/33] fix lost save --- community/history.md | 14 ++++++++------ 1 file changed, 8 insertions(+), 6 deletions(-) diff --git a/community/history.md b/community/history.md index d084fc1b5e..5888445cd9 100644 --- a/community/history.md +++ b/community/history.md @@ -30,7 +30,7 @@ Travis Oliphant, on [Why I 
promote conda](https://technicaldiscovery.blogspot.co Conda packages could not only ship pre-compiled Python packages across platforms but were also agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code". -By June 2013, conda was using a SAT solver and included the `conda build` tool [^new-advances-in-conda] for community users outside of Continuum to build their own conda packages. This is also when the first Miniconda release and Binstar.org [^binstar], a site for hosting arbitrary user-built conda packages, were announced. Miniconda provided a minimal base environment that users could populate themselves, and Binstar.org gave any user an easy platform for redistributing their packages. All of the conda tools and Binstar/Anaconda.org have been free (as in beer), with some paid options on Binstar/Anaconda.org for more storage. +By June 2013, conda was using a SAT solver and included the `conda build` tool [^new-advances-in-conda] for community users outside of Continuum to build their own conda packages. This is also when the first Miniconda release and Binstar.org [^binstar], a site for hosting arbitrary user-built conda packages, were announced. Miniconda provided a minimal base environment that users could populate themselves, and Binstar.org gave any user an easy platform for redistributing their packages. All of the conda tools have been free (BSD 3-clause) and Binstar/Anaconda.org was also free (as in beer, not open source), with some paid options on Binstar/Anaconda.org for more storage. With `conda build` came along the concept of recipes [^early-conda-build-docs]. The [`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes) repository became _the_ central place where people would contribute their conda recipes. 
This was separate from Anaconda's package recipes, which were private at this point. While successful, the recipes varied in quality, and typically only worked on one or two platforms. There was no CI for any recipes to help keep them working. It was common to find recipes that would no longer build, and you had to tweak them to get them to work. @@ -43,19 +43,21 @@ By 2015, several institutes and groups were using Binstar/Anaconda.org to distri In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) were maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had [implemented CI pipelines](https://github.com/SciTools/conda-recipes-scitools/blob/995fc231967719db0dd6321ba8a502390a2f192c/.travis.yml) and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a healthy exchange of recipes between the two groups, often assisted by members of other communities. For example, Christoph Gohlke and David Cournapeau were instrumental in getting Windows builds of the whole SciPy stack to work on AppVeyor. -There was a lot of cross-pollination between projects/channels, but working in separate repos, duplicated recipes, and differing build toolchains. +There was a lot of cross-pollination between projects/channels, but projects were still working in separate repos, duplicating recipes, and with differing build toolchains (so mixing channels was unpredictable).
Given the success of the `ContinuumIO/conda-recipes` repository, it became clear there was a demand for high quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` was registered as a Github organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. ## Meanwhile at Continuum -It's a little strange to describe Continuum Analytics/Anaconda's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. During this time, Continuum (especially Ilan Schnell) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. The packages made from these internal recipes were available on the [`free` channel][free-channel], which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers. +Reminder: Continuum is now Anaconda, but this article tries to keep the company name contemporaneous with the state of the world. -Concurrently, Aaron Meurer led the `conda` and `conda-build` projects, contributed many recipes to the `conda-recipes` repository and built many packages on his `asmeurer` binstar.org channel. Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz to fill this role. Kale had huge ambitions for `conda`, but `conda-build` was not as much of a priority for him. Michael Sarahan stepped in to maintain `conda-build`. +It's a little strange to describe Continuum's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. 
During this time, Continuum (especially Ilan Schnell) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. The packages made from these internal recipes were available on the `free` channel, which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers. Concurrently, Aaron Meurer led the `conda` and `conda-build` projects, contributed many recipes to the `conda-recipes` repository and built many packages on his `asmeurer` binstar.org channel. Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz to fill this role. Kale had huge ambitions for `conda`, but `conda-build` was not as much of a priority for him. Michael Sarahan stepped in to maintain `conda-build`.
+In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in Conda-Forge. Ray Donnelly joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. In its infancy, the conda-forge channel had far fewer packages than the `defaults` channel. Conda-forge's reliance on `defaults` was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating. Around this point in time, [GCC 5 arrived][gcc-5] with a breaking change in `libstdc++`. These changes, among other compiler updates, began to make the CentOS 5 toolchain troublesome. Cutting edge packages, such as the nascent TensorFlow project, required cumbersome patching to work with the older toolchain, if they worked at all. There was strong pressure from the community to update the ecosystem (i.e. the toolchain, and implicitly everything built with it). There were two prevailing options. One was Red Hat's `devtoolset`. This used an older GCC version which statically linked the newer `libstdc++` parts into binaries, so that `libstdc++` updates were not necessary on end user systems. The other was to build GCC ourselves, and to ship the newer `libstdc++` library as a conda package. This was a community decision, and it was split roughly down the middle. 
In the end, the community decided to take the latter route, for the sake of greater control over updating to the latest toolchains, instead of having to rely on Red Hat. One major advantage of providing our own toolchain was that we could provide the toolchain as a conda package instead of a system dependency, so we could now express toolchain requirements in our recipes and have better control over compiler flags and behavior. -As more and more conflicts with `free` channel packages occurred, conda-forge gradually added more and more of their own core dependency packages to avoid those breakages. At the same time, Continuum was working on two contracts that would prove revolutionary. Samsung wanted to use conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development to `conda-build` that greatly expanded its ability to support explicit variants of recipes. Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Continuum would move all of their internal recipes into public-facing GitHub repositories. Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as conda-forge, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. Continuum would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. The combination of these `conda-forge`-based recipes and the new toolchain is what made up the `main` channel, which was also part of `defaults`. The `main` channel represented a major step forward in keeping conda-forge and Continuum aligned, which equated to smooth operation and happy users. 
+Here around 2017, Continuum renamed itself to Anaconda, so let's switch those names from here out. + +As more and more conflicts with `free` channel packages occurred, conda-forge gradually added more and more of their own core dependency packages to avoid those breakages. At the same time, Anaconda was working on two contracts that would prove revolutionary. Samsung wanted to use Conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development to conda-build that greatly expanded its ability to support explicit variants of recipes. Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Anaconda would move all of their internal recipes into public-facing GitHub repositories. Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as conda-forge, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. Anaconda would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. The combination of these conda-forge based recipes and the new toolchain are what made up the `main` channel, which was also part of `defaults`. The `main` channel represented a major step forward in keeping conda-forge and Anaconda aligned, which equates to smooth operation and happy users. The joined recipe base and toolchain has sometimes been contentious, with conda-forge wanting to move faster than Anaconda or vice-versa. The end result has been a compromise between cutting-edge development and slower enterprise-focused development. 
From d0a4839626a866caa901f44ab00d6b51f07beff5 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Mon, 14 Apr 2025 18:38:12 +0200 Subject: [PATCH 19/33] add link to talk python episode #94 --- community/history.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/community/history.md b/community/history.md index 091c4c511b..b652631f79 100644 --- a/community/history.md +++ b/community/history.md @@ -43,7 +43,7 @@ By 2015, several institutes and groups were using Binstar/Anaconda.org to distri In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) were maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had [implemented CI pipelines](https://github.com/SciTools/conda-recipes-scitools/blob/995fc231967719db0dd6321ba8a502390a2f192c/.travis.yml) and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a healthy exchange of recipes between the two groups, often assisted by members of other communities. For example, Christophe Gohlke and David Cournapeau were instrumental in getting Windows builds of the whole SciPy stack to work on AppVeyor. -There was a lot of cross-pollination between projects/channels, but projects were still working in separate repos, duplicating recipes, and with differing build toolchains (so mixing channels was unpredictable). Given the success of the `ContinuumIO/conda-recipes` repository, it became clear there was a demand for high quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` was registered as a Github organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. 
+There was a lot of cross-pollination between projects/channels, but projects were still working in separate repos, duplicating recipes, and with differing build toolchains (so mixing channels was unpredictable). Given the success of the `ContinuumIO/conda-recipes` repository, it became clear there was a demand for high-quality conda recipes and more efficient community-driven collaboration under a single umbrella [^talkpython-conda]. On April 11th, 2015, `conda-forge` was registered as a GitHub organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. ## Meanwhile at Continuum @@ -89,6 +89,7 @@ As more and more conflicts with `free` channel packages occurred, conda-forge gr [^new-advances-in-conda]: [New Advances in Conda](https://web.archive.org/web/20140331190645/http://continuum.io/blog/new-advances-in-conda), Ilan Schnell, 2013. [^packaging-and-deployment-with-conda]: [Packaging and deployment with conda](https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda), Travis Oliphant, 2013. [^pythonxy]: https://python-xy.github.io/, 2015. +[^talkpython-conda]: [Guaranteed packages via Conda and Conda-Forge](https://talkpython.fm/episodes/show/94/guarenteed-packages-via-conda-and-conda-forge), 2016. [^technical-discovery]: https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html, 2013. 
[^wheels]: [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) From b0135c7cb6afdd6e84963828048a8c3f64860781 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Mon, 14 Apr 2025 18:42:18 +0200 Subject: [PATCH 20/33] reapply cosmetic changes --- community/history.md | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/community/history.md b/community/history.md index b652631f79..ab234efda6 100644 --- a/community/history.md +++ b/community/history.md @@ -47,17 +47,17 @@ There was a lot of cross-pollination between projects/channels, but projects wer ## Meanwhile at Continuum -Reminder: Continuum is now Anaconda, but this article tries to keep the company name contemporaneous with the state of the world. +> Reminder: Continuum Analytics is now Anaconda, but this article tries to keep the company name contemporaneous with the state of the world. -It's a little strange to describe Continuum's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. During this time, Continuum (especially Ilan Schnell) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. The packages made from these internal recipes were available on the `free` channel, which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers. Concurrently, Aaron Meurer led the conda and conda-build projects, contributed many recipes to the conda-recipes repository and built many packages on his `asmeurer` binstar.org channel. Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz to fill this role. 
Kale had huge ambitions for conda, but conda-build was not as much of a priority for him. Michael Sarahan stepped in to maintain Conda-build. +It's a little strange to describe Continuum's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. During this time, Continuum (especially Ilan Schnell) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. The packages made from these internal recipes were available on the `free` channel, which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers. Concurrently, Aaron Meurer led the conda and conda-build projects, contributed many recipes to the conda-recipes repository and built many packages on his `asmeurer` binstar.org channel. Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz to fill this role. Kale had huge ambitions for conda, but `conda-build` was not as much of a priority for him. Michael Sarahan stepped in to maintain `conda-build`. -In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in Conda-Forge. Ray Donnelly joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. In its infancy, the conda-forge channel had far fewer packages than the `defaults` channel. 
Conda-forge's reliance on `defaults` was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating. +In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in `conda-forge`. Ray Donnelly joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. In its infancy, the `conda-forge` channel had far fewer packages than the `defaults` channel. conda-forge's reliance on `defaults` was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating. Around this point in time, [GCC 5 arrived][gcc-5] with a breaking change in `libstdc++`. These changes, among other compiler updates, began to make the CentOS 5 toolchain troublesome. Cutting edge packages, such as the nascent TensorFlow project, required cumbersome patching to work with the older toolchain, if they worked at all. There was strong pressure from the community to update the ecosystem (i.e. the toolchain, and implicitly everything built with it). There were two prevailing options. One was Red Hat's `devtoolset`. 
This used an older GCC version which statically linked the newer `libstdc++` parts into binaries, so that `libstdc++` updates were not necessary on end user systems. The other was to build GCC ourselves, and to ship the newer `libstdc++` library as a conda package. This was a community decision, and it was split roughly down the middle. In the end, the community decided to take the latter route, for the sake of greater control over updating to the latest toolchains, instead of having to rely on Red Hat. One major advantage of providing our own toolchain was that we could provide the toolchain as a conda package instead of a system dependency, so we could now express toolchain requirements in our recipes and have better control over compiler flags and behavior. Here around 2017, Continuum renamed itself to Anaconda, so let's switch those names from here out. -As more and more conflicts with `free` channel packages occurred, conda-forge gradually added more and more of their own core dependency packages to avoid those breakages. At the same time, Anaconda was working on two contracts that would prove revolutionary. Samsung wanted to use Conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development to conda-build that greatly expanded its ability to support explicit variants of recipes. Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Anaconda would move all of their internal recipes into public-facing GitHub repositories. Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as conda-forge, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. 
Anaconda would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. The combination of these conda-forge based recipes and the new toolchain are what made up the `main` channel, which was also part of `defaults`. The `main` channel represented a major step forward in keeping conda-forge and Anaconda aligned, which equates to smooth operation and happy users. The joined recipe base and toolchain has sometimes been contentious, with conda-forge wanting to move faster than Anaconda or vice-versa. The end result has been a compromise between cutting-edge development and slower enterprise-focused development. +As more and more conflicts with `free` channel packages occurred, conda-forge gradually added more and more of their own core dependency packages to avoid those breakages. At the same time, Anaconda was working on two contracts that would prove revolutionary. Samsung wanted to use conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development to conda-build that greatly expanded its ability to support explicit variants of recipes. Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Anaconda would move all of their internal recipes into public-facing GitHub repositories. Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as conda-forge, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. Anaconda would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. 
The combination of these conda-forge-based recipes and the new toolchain is what made up the `main` channel, which was also part of `defaults`. The `main` channel represented a major step forward in keeping conda-forge and Anaconda aligned, which equated to smooth operation and happy users. The joint recipe base and toolchain have sometimes been contentious, with conda-forge wanting to move faster than Anaconda or vice versa. The end result has been a compromise between cutting-edge development and slower enterprise-focused development. From e43258bbf63be34eb9bcc8391a387756f11de68b Mon Sep 17 00:00:00 2001 From: jaimergp Date: Mon, 14 Apr 2025 18:46:59 +0200 Subject: [PATCH 21/33] pre-commit --- community/history.md | 24 ++++++++++++++++++++++++ 1 file changed, 24 insertions(+) diff --git a/community/history.md b/community/history.md index ab234efda6..3c6c42e656 100644 --- a/community/history.md +++ b/community/history.md @@ -68,29 +68,53 @@ As more and more conflicts with `free` channel packages occurred, conda-forge gr ## References [^activepython]: https://www.activestate.com/platform/supported-languages/python/ + [^anaconda-history]: [The Early History of the Anaconda Distribution](http://ilan.schnell-web.net/prog/anaconda-history/), Ilan Schnell, 2018. + [^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, 2017. + [^binstar-conda-forge]: https://anaconda.org/conda-forge, 2015. + [^binstar-ioos]: https://anaconda.org/ioos, 2014. + [^binstar-omnia]: https://anaconda.org/omnia, 2014. + [^binstar-scitools]: https://anaconda.org/scitools, 2014. + [^binstar]: [SciPy 2013 Lightning Talks, Thu June 27](https://youtu.be/ywHqIEv3xXg?list=PLYx7XA2nY5GeTWcUQTbXVdllyp-Ie3r-y&t=850). + [^cgohlke-shutdown]: [What to do when Gohlke's python wheel service shuts down?](https://stackoverflow.com/questions/72581592/what-to-do-when-gohlkes-python-wheel-service-shuts-down), 2022. + [^cgohlke]: https://www.cgohlke.com/, 2025. 
+ [^chatting-ocefpaf]: [Filipe Fernandes on the Evolution of conda-forge](https://www.youtube.com/watch?v=U2oa_RLbTVA), Chatting with the Conda Community #1, 2024. + [^conda-changelog-1.0]: [`conda` 1.0 release notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. + [^conda-recipes-repo]: [ContinuumIO/conda-recipes](https://github.com/conda-archive/conda-recipes) + [^early-conda-build-docs]: [Conda build framework documentation](https://web.archive.org/web/20141006141927/http://conda.pydata.org/docs/build.html), 2014. + [^eggs]: [The Internal Structure of Python Eggs](https://setuptools.pypa.io/en/latest/deprecated/python_eggs.html). + [^enthought]: https://docs.enthought.com/canopy/ + [^github-api-conda-forge]: https://api.github.com/orgs/conda-forge + [^legacy-python-downloads]: [Download Python for Windows (legacy docs)](https://legacy.python.org/download/windows/). + [^lex-fridman-podcast]: [Travis Oliphant: NumPy, SciPy, Anaconda, Python & Scientific Programming](https://www.youtube.com/watch?v=gFEE3w7F0ww&t=7596s), Lex Fridman Podcast #224, 2022. + [^new-advances-in-conda]: [New Advances in Conda](https://web.archive.org/web/20140331190645/http://continuum.io/blog/new-advances-in-conda), Ilan Schnell, 2013. + [^packaging-and-deployment-with-conda]: [Packaging and deployment with conda](https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda), Travis Oliphant, 2013. + [^pythonxy]: https://python-xy.github.io/, 2015. + [^talkpython-conda]: [Guaranteed packages via Conda and Conda-Forge](https://talkpython.fm/episodes/show/94/guarenteed-packages-via-conda-and-conda-forge), 2016. + [^technical-discovery]: https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html, 2013. 
+ [^wheels]: [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) From 2f8aaa3fd4257b0078f318c4208ca4b1520176d4 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Mon, 14 Apr 2025 18:53:56 +0200 Subject: [PATCH 22/33] add github users --- community/history.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/community/history.md b/community/history.md index 3c6c42e656..7af265419c 100644 --- a/community/history.md +++ b/community/history.md @@ -49,9 +49,9 @@ There was a lot of cross-pollination between projects/channels, but projects wer > Reminder: Continuum Analytics is now Anaconda, but this article tries to keep the company name contemporaneous with the state of the world. -It's a little strange to describe Continuum's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. During this time, Continuum (especially Ilan Schnell) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. The packages made from these internal recipes were available on the `free` channel, which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers. Concurrently, Aaron Meurer led the conda and conda-build projects, contributed many recipes to the conda-recipes repository and built many packages on his `asmeurer` binstar.org channel. Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz to fill this role. Kale had huge ambitions for conda, but `conda-build` was not as much of a priority for him. Michael Sarahan stepped in to maintain `conda-build`. 
+It's a little strange to describe Continuum's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. During this time, Continuum (especially Ilan Schnell ([@ilanschnell](https://github.com/ilanschnell))) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. The packages made from these internal recipes were available on the `free` channel, which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers. Concurrently, Aaron Meurer ([@asmeurer](https://github.com/asmeurer)) led the `conda` and `conda-build` projects, contributed many recipes to the `conda-recipes` repository and built many packages on his `asmeurer` binstar.org channel. Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz ([@kalefranz](https://github.com/kalefranz)) to fill this role. Kale had huge ambitions for conda, but `conda-build` was not as much of a priority for him. Michael Sarahan ([@msarahan](https://github.com/msarahan)) stepped in to maintain `conda-build`. -In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in `conda-forge`. Ray Donnelly joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. In its infancy, the `conda-forge` channel had far fewer packages than the `defaults` channel. 
conda-forge's reliance on `defaults` was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating. +In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in `conda-forge`. Ray Donnelly ([@mingwandroid](https://github.com/mingwandroid)) joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. There was a period of time when conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. In its infancy, the `conda-forge` channel had far fewer packages than the `defaults` channel. conda-forge's reliance on `defaults` was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing the possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating. Around this point in time, [GCC 5 arrived][gcc-5] with a breaking change in `libstdc++`. These changes, among other compiler updates, began to make the CentOS 5 toolchain troublesome. Cutting edge packages, such as the nascent TensorFlow project, required cumbersome patching to work with the older toolchain, if they worked at all. There was strong pressure from the community to update the ecosystem (i.e. the toolchain, and implicitly everything built with it). 
There were two prevailing options. One was Red Hat's `devtoolset`. This provided a newer GCC on the old base system, statically linking the newer `libstdc++` parts into binaries, so that `libstdc++` updates were not necessary on end user systems. The other was to build GCC ourselves, and to ship the newer `libstdc++` library as a conda package. This was a community decision, and it was split roughly down the middle. In the end, the community decided to take the latter route, for the sake of greater control over updating to the latest toolchains, instead of having to rely on Red Hat. One major advantage of providing our own toolchain was that we could provide the toolchain as a conda package instead of a system dependency, so we could now express toolchain requirements in our recipes and have better control over compiler flags and behavior. From a0c439c484f6c9215b72bdd171fcd3298efcd1e8 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Tue, 15 Apr 2025 02:37:06 +0200 Subject: [PATCH 23/33] Update community/history.md Co-authored-by: James A. Bednar --- community/history.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/community/history.md b/community/history.md index 7af265419c..0a68c0be80 100644 --- a/community/history.md +++ b/community/history.md @@ -12,7 +12,7 @@ conda-forge's origins are best understood in the context of Python packaging bac Python 2.x was the norm. To install it, you'd get the official installers from Python.org, use the system-provided interpreter in Linux, or resort to options like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's distributions (EPD, later Canopy) [^enthought] in macOS and Windows [^legacy-python-downloads]. -If you wanted to install additional packages, the community was transitioning from `easy_install` to `pip`, and there was no easy way to ship or install pre-compiled Python packages. 
An alternative to Python eggs [^eggs] wouldn't emerge until 2013 with the formalization of wheels [^wheels]. These were useful for Windows, where Christoph Gohlke's exes and wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. +If you wanted to install additional packages, the community was transitioning from `easy_install` to `pip`, and there was no easy way to ship or install pre-compiled Python packages. An alternative to the Python eggs [^eggs] used by `easy_install` wouldn't emerge until 2013 with the formalization of wheels [^wheels]. These were useful for Windows, where Christoph Gohlke's exes and wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. However, for Linux, you would have to wait until 2016, when [`manylinux` wheels were introduced](https://peps.python.org/pep-0513/). Before then, PyPI wouldn't even allow compiled Linux wheels and your only alternative was to compile every package from source. From 734d2b6d6265e00c15e61986144f692797d12684 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Tue, 15 Apr 2025 02:39:46 +0200 Subject: [PATCH 24/33] link to pypackaging-native for abi explanation Co-authored-by: h-vetinari --- community/history.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/community/history.md b/community/history.md index 0a68c0be80..b163a1d8f8 100644 --- a/community/history.md +++ b/community/history.md @@ -18,7 +18,7 @@ However, for Linux, you would have to wait until 2016, when [`manylinux` wheels As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The "Built Distributions" section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April 2016. There was no mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! 
-The reason why it is hard to find packages for a specific system, and why compilation was the preferred option for many, is binary compatibility. Binary compatibility is a window of compatibility where each combination of compiler version, core libraries such as `glibc`, and dependency libraries present on the build machine are compatible on destination systems. Linux distributions achieve this by freezing compiler versions and library versions for a particular release cycle. Windows achieves this relatively easily because Python standardized on particular Visual Studio compiler versions for each Python release. Where a Windows package executable was reliably redistributable across versions of Windows, so long as Python version was the same, Linux presented a more difficult target because it was (and is) so much harder to account for all of the little details that must line up. +The reason why it is hard to find packages for a specific system, and why compilation was the preferred option for many, is [binary compatibility][abi]. Binary compatibility is a window of compatibility where each combination of compiler version, core libraries such as `glibc`, and dependency libraries present on the build machine are compatible on destination systems. Linux distributions achieve this by freezing compiler versions and library versions for a particular release cycle. Windows achieves this relatively easily because Python standardized on particular Visual Studio compiler versions for each Python release. Where a Windows package executable was reliably redistributable across versions of Windows, so long as Python version was the same, Linux presented a more difficult target because it was (and is) so much harder to account for all of the little details that must line up. 
## The origins of `conda` @@ -119,5 +119,6 @@ As more and more conflicts with `free` channel packages occurred, conda-forge gr +[abi]: https://pypackaging-native.github.io/background/binary_interface/ [free-channel]: https://anaconda.org/free [gcc-5]: https://gcc.gnu.org/gcc-5/changes.html From 65f25b7bb33b75fd4938c28bab092b0cb005d8f8 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Tue, 15 Apr 2025 03:07:16 +0200 Subject: [PATCH 25/33] mention CB3 and gcc 7 --- community/history.md | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/community/history.md b/community/history.md index b163a1d8f8..3dc6ce98a5 100644 --- a/community/history.md +++ b/community/history.md @@ -53,7 +53,9 @@ It's a little strange to describe Continuum's history here, but the company hist In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in `conda-forge`. Ray Donnelly ([@mingwandroid](https://github.com/mingwandroid)) joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. In its infancy, the `conda-forge` channel had far fewer packages than the `defaults` channel. conda-forge's reliance on `defaults` was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating. -Around this point in time, [GCC 5 arrived][gcc-5] with a breaking change in `libstdc++`. 
These changes, among other compiler updates, began to make the CentOS 5 toolchain troublesome. Cutting edge packages, such as the nascent TensorFlow project, required cumbersome patching to work with the older toolchain, if they worked at all. There was strong pressure from the community to update the ecosystem (i.e. the toolchain, and implicitly everything built with it). There were two prevailing options. One was Red Hat's `devtoolset`. This used an older GCC version which statically linked the newer `libstdc++` parts into binaries, so that `libstdc++` updates were not necessary on end user systems. The other was to build GCC ourselves, and to ship the newer `libstdc++` library as a conda package. This was a community decision, and it was split roughly down the middle. In the end, the community decided to take the latter route, for the sake of greater control over updating to the latest toolchains, instead of having to rely on Red Hat. One major advantage of providing our own toolchain was that we could provide the toolchain as a conda package instead of a system dependency, so we could now express toolchain requirements in our recipes and have better control over compiler flags and behavior. + +The result of this overhaul crystallized in the `compiler(...)` Jinja function in `conda-build` 3.x and the publication of the GCC 7 toolchain built from source in `defaults` [^anaconda-compilers]. + Here around 2017, Continuum renamed itself to Anaconda, so let's switch those names from here out. 
From 9f1463b687a0a8c692e110bb44940d7d79658a34 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Tue, 15 Apr 2025 03:09:19 +0200 Subject: [PATCH 26/33] break up long paragraphs a bit and add more sections --- community/history.md | 43 +++++++++++++++++++++++++++++++++++++------ 1 file changed, 37 insertions(+), 6 deletions(-) diff --git a/community/history.md b/community/history.md index 3dc6ce98a5..0bd4182a8b 100644 --- a/community/history.md +++ b/community/history.md @@ -18,7 +18,9 @@ However, for Linux, you would have to wait until 2016, when [`manylinux` wheels As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The "Built Distributions" section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April 2016. There was no mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! -The reason why it is hard to find packages for a specific system, and why compilation was the preferred option for many, is [binary compatibility][abi]. Binary compatibility is a window of compatibility where each combination of compiler version, core libraries such as `glibc`, and dependency libraries present on the build machine are compatible on destination systems. Linux distributions achieve this by freezing compiler versions and library versions for a particular release cycle. Windows achieves this relatively easily because Python standardized on particular Visual Studio compiler versions for each Python release. Where a Windows package executable was reliably redistributable across versions of Windows, so long as Python version was the same, Linux presented a more difficult target because it was (and is) so much harder to account for all of the little details that must line up. 
+The reason why it is hard to find packages for a specific system, and why compilation was the preferred option for many, is [binary compatibility][abi]. Binary compatibility is a window of compatibility where each combination of compiler version, core libraries such as `glibc`, and dependency libraries present on the build machine are compatible on destination systems. + +Linux distributions achieve this by freezing compiler versions and library versions for a particular release cycle. Windows achieves this relatively easily because Python standardized on particular Visual Studio compiler versions for each Python release. Where a Windows package executable was reliably redistributable across versions of Windows, so long as Python version was the same, Linux presented a more difficult target because it was (and is) so much harder to account for all of the little details that must line up. ## The origins of `conda` @@ -30,7 +32,11 @@ Travis Oliphant, on [Why I promote conda](https://technicaldiscovery.blogspot.co Conda packages could not only ship pre-compiled Python packages across platforms but were also agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code". -By June 2013, conda was using a SAT solver and included the `conda build` tool [^new-advances-in-conda] for community users outside of Continuum to build their own conda packages. This is also when the first Miniconda release and Binstar.org [^binstar], a site for hosting arbitrary user-built conda packages, were announced. Miniconda provided a minimal base environment that users could populate themselves, and Binstar.org gave any user an easy platform for redistributing their packages. 
All of the conda tools have been free (BSD 3-clause) and Binstar/Anaconda.org was also free (as in beer, not open source), with some paid options on Binstar/Anaconda.org for more storage. +By June 2013, conda was using a SAT solver and included the `conda build` tool [^new-advances-in-conda] for community users outside of Continuum to build their own conda packages. This is also when the first Miniconda release and Binstar.org [^binstar], a site for hosting arbitrary user-built conda packages, were announced. Miniconda provided a minimal base environment that users could populate themselves, and Binstar.org gave any user an easy platform for redistributing their packages. + +:::note Did you know +All of the conda tools have been free (BSD 3-clause) and Binstar/Anaconda.org was also free (as in beer, not open source), with some paid options on Binstar/Anaconda.org for more storage. +::: With `conda build` came along the concept of recipes [^early-conda-build-docs]. The [`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes) repository became _the_ central place where people would contribute their conda recipes. This was separate from Anaconda's package recipes, which were private at this point. While successful, the recipes varied in quality, and typically only worked on one or two platforms. There was no CI for any recipes to help keep them working. It was common to find recipes that would no longer build, and you had to tweak it to get it to work. @@ -47,19 +53,42 @@ There was a lot of cross-pollination between projects/channels, but projects wer ## Meanwhile at Continuum -> Reminder: Continuum Analytics is now Anaconda, but this article tries to keep the company name contemporaneous with the state of the world. +:::note Reminder +Continuum Analytics is now Anaconda, but this article tries to keep the company name contemporaneous with the state of the world. 
+::: + +It's a little strange to describe Continuum's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. During this time, Continuum (especially Ilan Schnell ([@ilanschnell](https://github.com/ilanschnell))) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. + +The packages made from these internal recipes were available on the `free` channel, which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers. Concurrently, Aaron Meurer ([@asmeurer](https://github.com/asmeurer)) led the `conda` and `conda-build` projects, contributed many recipes to the `conda-recipes` repository and built many packages on his `asmeurer` binstar.org channel. + +Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz ([@kalefranz](https://github.com/kalefranz)) to fill this role. Kale had huge ambitions for conda, but `conda-build` was not as much of a priority for him. Michael Sarahan ([@msarahan](https://github.com/msarahan)) stepped in to maintain `conda-build`. + +In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in `conda-forge`. Ray Donnelly ([@mingwandroid](https://github.com/mingwandroid)) joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. -It's a little strange to describe Continuum's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. 
During this time, Continuum (especially Ilan Schnell ([@ilanschnell](https://github.com/ilanschnell))) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem. The packages made from these internal recipes were available on the `free` channel, which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers. Concurrently, Aaron Meurer ([@asmeurer](https://github.com/asmeurer)) led the `conda` and `conda-build` projects, contributed many recipes to the `conda-recipes` repository and built many packages on his `asmeurer` binstar.org channel. Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz ([@kalefranz](https://github.com/kalefranz)) to fill this role. Kale had huge ambitions for conda, but `conda-build` was not as much of a priority for him. Michael Sarahan ([@msarahan](https://github.com/msarahan)) stepped in to maintain `conda-build`. +## conda-build 3 and the new compiler toolchain -In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in `conda-forge`. Ray Donnelly ([@mingwandroid](https://github.com/mingwandroid)) joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. In its infancy, the `conda-forge` channel had far fewer packages than the `defaults` channel. 
conda-forge's reliance on `defaults` was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating.
+There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. In its infancy, the `conda-forge` channel had far fewer packages than the `defaults` channel. conda-forge's reliance on `defaults` was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing the possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating.

+Around this point in time, [GCC 5 arrived][gcc-5] with a breaking change in `libstdc++`. These changes, among other compiler updates, began to make the CentOS 5 toolchain troublesome. Cutting-edge packages, such as the nascent TensorFlow project, required cumbersome patching to work with the older toolchain, if they worked at all.
+
+There was strong pressure from the community to update the ecosystem (i.e. the toolchain, and implicitly everything built with it). There were two prevailing options. One was Red Hat's `devtoolset`. This used a newer GCC version which statically linked the newer `libstdc++` parts into binaries, so that `libstdc++` updates were not necessary on end user systems. The other was to build GCC ourselves, and to ship the newer `libstdc++` library as a conda package. This was a community decision, and it was split roughly down the middle. 
+ +In the end, the community decided to take the latter route, for the sake of greater control over updating to the latest toolchains, instead of having to rely on Red Hat. One major advantage of providing our own toolchain was that we could provide the toolchain as a conda package instead of a system dependency, so we could now express toolchain requirements in our recipes and have better control over compiler flags and behavior. The result of this overhaul crystallized in the `compiler(...)` Jinja function in `conda-build` 3.x and the publication of the GCC 7 toolchain built from source in `defaults` [^anaconda-compilers]. +## From `free` to `main` Here around 2017, Continuum renamed itself to Anaconda, so let's switch those names from here out. -As more and more conflicts with `free` channel packages occurred, conda-forge gradually added more and more of their own core dependency packages to avoid those breakages. At the same time, Anaconda was working on two contracts that would prove revolutionary. Samsung wanted to use conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development to conda-build that greatly expanded its ability to support explicit variants of recipes. Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Anaconda would move all of their internal recipes into public-facing GitHub repositories. Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as conda-forge, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. 
Anaconda would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. The combination of these conda-forge based recipes and the new toolchain are what made up the `main` channel, which was also part of `defaults`. The `main` channel represented a major step forward in keeping conda-forge and Anaconda aligned, which equates to smooth operation and happy users. The joined recipe base and toolchain has sometimes been contentious, with conda-forge wanting to move faster than Anaconda or vice-versa. The end result has been a compromise between cutting-edge development and slower enterprise-focused development. +As more and more conflicts with `free` channel packages occurred, conda-forge gradually added more and more of their own core dependency packages to avoid those breakages. At the same time, Anaconda was working on two contracts that would prove revolutionary. + +- Samsung wanted to use conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development to conda-build that greatly expanded its ability to support explicit variants of recipes. +- Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Anaconda would move all of their internal recipes into public-facing GitHub repositories. + +Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as `conda-forge`, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. Anaconda would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. 
The combination of these conda-forge based recipes and the new toolchain are what made up the `main` channel, which was also part of `defaults`. + +The `main` channel represented a major step forward in keeping conda-forge and Anaconda aligned, which equates to smooth operation and happy users. The joined recipe base and toolchain has sometimes been contentious, with conda-forge wanting to move faster than Anaconda or vice-versa. The end result has been a compromise between cutting-edge development and slower enterprise-focused development. @@ -71,6 +100,8 @@ As more and more conflicts with `free` channel packages occurred, conda-forge gr [^activepython]: https://www.activestate.com/platform/supported-languages/python/ +[^anaconda-compilers]: https://www.anaconda.com/blog/utilizing-the-new-compilers-in-anaconda-distribution-5 + [^anaconda-history]: [The Early History of the Anaconda Distribution](http://ilan.schnell-web.net/prog/anaconda-history/), Ilan Schnell, 2018. [^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, 2017. From cdb662a8e436c3aa2fbc227f0998c7ee745308df Mon Sep 17 00:00:00 2001 From: jaimergp Date: Tue, 15 Apr 2025 03:20:54 +0200 Subject: [PATCH 27/33] add reference to Anaconda 5.0 release --- community/history.md | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) diff --git a/community/history.md b/community/history.md index 0bd4182a8b..54f50b60e1 100644 --- a/community/history.md +++ b/community/history.md @@ -86,7 +86,7 @@ As more and more conflicts with `free` channel packages occurred, conda-forge gr - Samsung wanted to use conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development to conda-build that greatly expanded its ability to support explicit variants of recipes. 
- Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Anaconda would move all of their internal recipes into public-facing GitHub repositories.

-Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as `conda-forge`, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. Anaconda would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. The combination of these conda-forge based recipes and the new toolchain are what made up the `main` channel, which was also part of `defaults`.
+Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as `conda-forge`, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. Anaconda would only carry local changes if they could not be incorporated into the conda-forge recipe for social, licensing, or technical reasons. The combination of these conda-forge based recipes and the new toolchain is what made up the `main` channel [^anaconda-5], which was also part of `defaults`.
@@ -100,6 +100,8 @@ The `main` channel represented a major step forward in keeping conda-forge and A [^activepython]: https://www.activestate.com/platform/supported-languages/python/ +[^anaconda-5]: https://www.anaconda.com/blog/announcing-the-release-of-anaconda-distribution-5-0 + [^anaconda-compilers]: https://www.anaconda.com/blog/utilizing-the-new-compilers-in-anaconda-distribution-5 [^anaconda-history]: [The Early History of the Anaconda Distribution](http://ilan.schnell-web.net/prog/anaconda-history/), Ilan Schnell, 2018. From 063b395f2c2815c381e74b886fe7d284b04e5823 Mon Sep 17 00:00:00 2001 From: Mike Sarahan Date: Tue, 15 Apr 2025 05:53:52 -0500 Subject: [PATCH 28/33] Line breaks and a couple of new links (#2) --- community/history.md | 261 ++++++++++++++++++++++++++++++++++--------- 1 file changed, 211 insertions(+), 50 deletions(-) diff --git a/community/history.md b/community/history.md index 54f50b60e1..fc07d52afc 100644 --- a/community/history.md +++ b/community/history.md @@ -8,48 +8,133 @@ title: History ## Context of binary packaging for Python -conda-forge's origins are best understood in the context of Python packaging back in the early 2010s. Back then, the installation of Python packages across operating systems was very challenging, especially on Windows, as it often meant compiling dependencies from source. - -Python 2.x was the norm. To install it, you'd get the official installers from Python.org, use the system-provided interpreter in Linux, or resort to options like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's distributions (EPD, later Canopy) [^enthought] in macOS and Windows [^legacy-python-downloads]. - -If you wanted to install additional packages, the community was transitioning from `easy_install` to `pip`, and there was no easy way to ship or install pre-compiled Python packages. 
An alternative to the Python eggs [^eggs] used by `easy_install` wouldn't emerge until 2013 with the formalization of wheels [^wheels]. These were useful for Windows, where Christoph Gohlke's exes and wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. - -However, for Linux, you would have to wait until 2016, when [`manylinux` wheels were introduced](https://peps.python.org/pep-0513/). Before then, PyPI wouldn't even allow compiled Linux wheels and your only alternative was to compile every package from source. - -As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The "Built Distributions" section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April 2016. There was no mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! - -The reason why it is hard to find packages for a specific system, and why compilation was the preferred option for many, is [binary compatibility][abi]. Binary compatibility is a window of compatibility where each combination of compiler version, core libraries such as `glibc`, and dependency libraries present on the build machine are compatible on destination systems. - -Linux distributions achieve this by freezing compiler versions and library versions for a particular release cycle. Windows achieves this relatively easily because Python standardized on particular Visual Studio compiler versions for each Python release. Where a Windows package executable was reliably redistributable across versions of Windows, so long as Python version was the same, Linux presented a more difficult target because it was (and is) so much harder to account for all of the little details that must line up. 
+conda-forge's origins are best understood in the context of Python packaging +back in the early 2010s. Back then, the installation of Python packages across +operating systems was very challenging, especially on Windows, as it often meant +compiling dependencies from source. + +Python 2.x was the norm. To install it, you'd get the official installers from +Python.org, use the system-provided interpreter in Linux, or resort to options +like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or +Enthought's distributions (EPD, later Canopy) [^enthought] in macOS and Windows +[^legacy-python-downloads]. + +If you wanted to install additional packages, the community was transitioning +from `easy_install` to `pip`, and there was no easy way to ship or install +pre-compiled Python packages. An alternative to the Python eggs [^eggs] used by +`easy_install` wouldn't emerge until 2013 with the formalization of wheels +[^wheels]. These were useful for Windows, where Christoph Gohlke's exes and +wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. + +However, for Linux, you would have to wait until 2016, when [`manylinux` wheels +were introduced](https://peps.python.org/pep-0513/). Before then, PyPI wouldn't +even allow compiled Linux wheels and your only alternative was to compile every +package from source. + +As an example, take a look at the [PyPI download page for `numpy` +1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The +"Built Distributions" section only shows a few `.exe` files for Windows (!), and +some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded +until April 2016. There was no mention whatsoever of macOS. Now compare it to +[`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in +March 2016: wheels for all platforms! 
+
+The reason why it is hard to find packages for a specific system, and why
+compilation was the preferred option for many, is [binary compatibility][abi].
+Binary compatibility is a window of compatibility where each combination of
+compiler version, core libraries such as `glibc`, and dependency libraries
+present on the build machine is compatible on destination systems.
+
+Linux distributions achieve this by freezing compiler versions and library
+versions for a particular release cycle. Windows achieves this relatively easily
+because Python standardized on particular Visual Studio compiler versions for
+each Python release. Where a Windows package executable was reliably
+redistributable across versions of Windows, so long as the Python version was
+the same, Linux presented a more difficult target because it was (and is) so
+much harder to account for all of the little details that must line up.

 ## The origins of `conda`

+In 2012, Continuum Analytics announced Anaconda 0.8 at the SciPy conference
+[^anaconda-history]. Anaconda was a distribution of scientifically-oriented
+packages, but did not yet have tools for managing individual packages. Later
+that year, in September, Continuum released `conda` 1.0, the cross-platform,
+language-agnostic package manager for pre-compiled artifacts
+[^conda-changelog-1.0]. 
The motivation behind these efforts was to provide an +easy way to ship all the compiled libraries and Python packages that users of +the SciPy and NumPy stacks needed +[^packaging-and-deployment-with-conda],[^lex-fridman-podcast]. Travis Oliphant, on [Why I promote conda](https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html) (2013): -> [...] at the first PyData meetup at Google HQ, where several of us asked Guido what we can do to fix Python packaging for the NumPy stack. Guido's answer was to "solve the problem ourselves". We at Continuum took him at his word. We looked at dpkg, rpm, pip/virtualenv, brew, nixos, and 0installer, and used our past experience with EPD [Enthought Python Distribution]. We thought hard about the fundamental issues, and created the conda package manager and conda environments. - -Conda packages could not only ship pre-compiled Python packages across platforms but were also agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code". - -By June 2013, conda was using a SAT solver and included the `conda build` tool [^new-advances-in-conda] for community users outside of Continuum to build their own conda packages. This is also when the first Miniconda release and Binstar.org [^binstar], a site for hosting arbitrary user-built conda packages, were announced. Miniconda provided a minimal base environment that users could populate themselves, and Binstar.org gave any user an easy platform for redistributing their packages. +> [...] at the first PyData meetup at Google HQ, where several of us asked Guido +what we can do to fix Python packaging for the NumPy stack. Guido's answer was +to "solve the problem ourselves". We at Continuum took him at his word. 
We +looked at dpkg, rpm, pip/virtualenv, brew, nixos, and 0installer, and used our +past experience with EPD [Enthought Python Distribution]. We thought hard about +the fundamental issues, and created the conda package manager and conda +environments. + +Conda packages could not only ship pre-compiled Python packages across platforms +but were also agnostic enough to ship Python itself, as well as the underlying +shared libraries without having to statically vendor them. This was particularly +convenient for projects that relied on both compiled dependencies (e.g. C++ or +Fortran libraries) and Python "glue code". + +By June 2013, conda was using a SAT solver and included the `conda build` tool +[^new-advances-in-conda] for community users outside of Continuum to build their +own conda packages. This is also when the first Miniconda release and +Binstar.org [^binstar], a site for hosting arbitrary user-built conda packages, +were announced. Miniconda provided a minimal base environment that users could +populate themselves, and Binstar.org gave any user an easy platform for +redistributing their packages. :::note Did you know All of the conda tools have been free (BSD 3-clause) and Binstar/Anaconda.org was also free (as in beer, not open source), with some paid options on Binstar/Anaconda.org for more storage. ::: -With `conda build` came along the concept of recipes [^early-conda-build-docs]. The [`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes) repository became _the_ central -place where people would contribute their conda recipes. This was separate from Anaconda's package recipes, which were private at this point. While successful, the recipes varied in quality, and typically only worked on one or two platforms. There was no CI for any recipes to help keep them working. It was common to find recipes that would no longer build, and you had to tweak it to get it to work. 
+Along with `conda build` came the concept of recipes [^early-conda-build-docs].
+The
+[`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes)
+repository became _the_ central place where people would contribute their conda
+recipes. This was separate from Anaconda's package recipes, which were private
+at this point. While successful, the recipes varied in quality, and typically
+only worked on one or two platforms. There was no CI for any recipes to help
+keep them working. It was common to find recipes that would no longer build, and
+you had to tweak them to get them to work.

In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebranded as Anaconda Inc [^anaconda-rebrand].

## How conda-forge came to be

-By 2015, several institutes and groups were using Binstar/Anaconda.org to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], the UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos]. Although each channel was building conda packages, the binary compatibility between channels was unpredictable.
-
-In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) were maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had [implemented CI pipelines](https://github.com/SciTools/conda-recipes-scitools/blob/995fc231967719db0dd6321ba8a502390a2f192c/.travis.yml) and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a healthy exchange of recipes between the two groups, often assisted by members of other communities.
For example, Christophe Gohlke and David Cournapeau were instrumental in getting Windows builds of the whole SciPy stack to work on AppVeyor. - -There was a lot of cross-pollination between projects/channels, but projects were still working in separate repos, duplicating recipes, and with differing build toolchains (so mixing channels was unpredictable). Given the success of the `ContinuumIO/conda-recipes` repository, it became clear there was a demand for high quality conda recipes and more efficient community-driven collaboration under a single umbrella [^talkpython-conda]. On April 11th, 2015, `conda-forge` was registered as a Github organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge]. +By 2015, several institutes and groups were using Binstar/Anaconda.org to +distribute software packages they used daily: the [Omnia Molecular +Dynamics](https://github.com/omnia-md) project started as early as March 2014 +[^binstar-omnia], the UK Met Office supported [SciTools +project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the +[US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started +using it in July 2014 [^binstar-ioos]. Although each channel was building conda +packages, the binary compatibility between channels was unpredictable. + +In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil +Elson ([@pelson](https://github.com/pelson)) were maintaining the Binstar +channels for IOOS and SciTools, respectively. Phil had [implemented CI +pipelines](https://github.com/SciTools/conda-recipes-scitools/blob/995fc231967719db0dd6321ba8a502390a2f192c/.travis.yml) +and [special tooling](https://github.com/conda-tools/conda-build-all) to build +conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There +was also a healthy exchange of recipes between the two groups, often assisted by +members of other communities. 
For example, Christoph Gohlke and David
+Cournapeau were instrumental in getting Windows builds of the whole SciPy stack
+to work on AppVeyor.
+
+There was a lot of cross-pollination between projects/channels, but projects
+were still working in separate repos, duplicating recipes, and with differing
+build toolchains (so mixing channels was unpredictable). Given the success of
+the `ContinuumIO/conda-recipes` repository, it became clear there was a demand
+for high-quality conda recipes and more efficient community-driven collaboration
+under a single umbrella [^talkpython-conda]. On April 11th, 2015, `conda-forge`
+was registered as a GitHub organization [^github-api-conda-forge] and an
+Anaconda.org channel [^binstar-conda-forge].

## Meanwhile at Continuum

@@ -57,38 +142,110 @@ There was a lot of cross-pollination between projects/channels, but projects wer

:::note
Continuum Analytics is now Anaconda, but this article tries to keep the company name contemporaneous with the state of the world.
:::

-It's a little strange to describe Continuum's history here, but the company history is so deeply intertwined with conda-forge that it is essential for a complete story. During this time, Continuum (especially Ilan Schnell ([@ilanschnell](https://github.com/ilanschnell))) was developing its own internal recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and GCC 4.8. These details matter, because they effectively set the compatibility bounds of the entire conda package ecosystem.
-
-The packages made from these internal recipes were available on the `free` channel, which in turn was part of a metachannel named `defaults`. The `defaults` channel made up the initial channel configuration for the Miniconda and Anaconda installers.
Concurrently, Aaron Meurer ([@asmeurer](https://github.com/asmeurer)) led the `conda` and `conda-build` projects, contributed many recipes to the `conda-recipes` repository and built many packages on his `asmeurer` binstar.org channel. - -Aaron left Continuum in late 2015, leaving the community side of the projects in need of new leadership. Continuum hired Kale Franz ([@kalefranz](https://github.com/kalefranz)) to fill this role. Kale had huge ambitions for conda, but `conda-build` was not as much of a priority for him. Michael Sarahan ([@msarahan](https://github.com/msarahan)) stepped in to maintain `conda-build`. - -In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in `conda-forge`. Ray Donnelly ([@mingwandroid](https://github.com/mingwandroid)) joined the team at Continuum soon afterwards, bringing extensive experience in package managers and toolchains from his involvement in the MSYS2 project. +It's a little strange to describe Continuum's history here, but the company +history is so deeply intertwined with conda-forge that it is essential for a +complete story. During this time, Continuum (especially Ilan Schnell +([@ilanschnell](https://github.com/ilanschnell))) was developing its own +internal recipes for packages. Continuum's Linux toolchain at the time was based +on CentOS 5 and GCC 4.8. These details matter, because they effectively set the +compatibility bounds of the entire conda package ecosystem. + +The packages made from these internal recipes were available on the `free` +channel, which in turn was part of a metachannel named `defaults`. The +`defaults` channel made up the initial channel configuration for the Miniconda +and Anaconda installers. 
Concurrently, Aaron Meurer +([@asmeurer](https://github.com/asmeurer)) led the `conda` and `conda-build` +projects, contributed many recipes to the `conda-recipes` repository and built +many packages on his `asmeurer` binstar.org channel. + +Aaron left Continuum in late 2015, leaving the community side of the projects in +need of new leadership. Continuum hired Kale Franz +([@kalefranz](https://github.com/kalefranz)) to fill this role. Kale had huge +ambitions for conda, but `conda-build` was not as much of a priority for him. +Michael Sarahan ([@msarahan](https://github.com/msarahan)) stepped in to +maintain `conda-build`. + +In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at +Continuum, who assigned Michael Sarahan to be Continuum's representative in +`conda-forge`. Ray Donnelly ([@mingwandroid](https://github.com/mingwandroid)) +joined the team at Continuum soon afterwards, bringing extensive experience in +package managers and toolchains from his involvement in the MSYS2 project. ## conda-build 3 and the new compiler toolchain -There was a period of time where conda-forge and Continuum worked together closely, with conda-forge relying on Continuum to supply several core libraries. In its infancy, the `conda-forge` channel had far fewer packages than the `defaults` channel. conda-forge's reliance on `defaults` was partly to lower conda-forge's maintenance burden and reduce duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel packages working by reducing possibility of divergence. Just as there were binary compatibility issues with mixing packages from among the many Binstar channels, mixing packages from `defaults` with `conda-forge` could be fragile and frustrating. - -Around this point in time, [GCC 5 arrived][gcc-5] with a breaking change in `libstdc++`. These changes, among other compiler updates, began to make the CentOS 5 toolchain troublesome. 
Cutting edge packages, such as the nascent TensorFlow project, required cumbersome patching to work with the older toolchain, if they worked at all. - -There was strong pressure from the community to update the ecosystem (i.e. the toolchain, and implicitly everything built with it). There were two prevailing options. One was Red Hat's `devtoolset`. This used an older GCC version which statically linked the newer `libstdc++` parts into binaries, so that `libstdc++` updates were not necessary on end user systems. The other was to build GCC ourselves, and to ship the newer `libstdc++` library as a conda package. This was a community decision, and it was split roughly down the middle. - -In the end, the community decided to take the latter route, for the sake of greater control over updating to the latest toolchains, instead of having to rely on Red Hat. One major advantage of providing our own toolchain was that we could provide the toolchain as a conda package instead of a system dependency, so we could now express toolchain requirements in our recipes and have better control over compiler flags and behavior. - -The result of this overhaul crystallized in the `compiler(...)` Jinja function in `conda-build` 3.x and the publication of the GCC 7 toolchain built from source in `defaults` [^anaconda-compilers]. +There was a period of time where conda-forge and Continuum worked together +closely, with conda-forge relying on Continuum to supply several core libraries. +In its infancy, the `conda-forge` channel had far fewer packages than the +`defaults` channel. conda-forge's reliance on `defaults` was partly to lower +conda-forge's maintenance burden and reduce duplicate work, but it also helped +keep mixtures of conda-forge and `defaults` channel packages working by reducing +possibility of divergence. 
Just as there were binary compatibility issues with
+mixing packages from among the many Binstar channels, mixing packages from
+`defaults` with `conda-forge` could be fragile and frustrating.
+
+Around this point in time, [GCC 5 arrived][gcc-5] with a breaking change in
+`libstdc++`. These changes, among other compiler updates, began to make the
+CentOS 5 toolchain troublesome. Cutting-edge packages, such as the nascent
+TensorFlow project, required cumbersome patching to work with the older
+toolchain, if they worked at all.
+
+There was strong pressure from the community to update the ecosystem (i.e. the
+toolchain, and implicitly everything built with it). There were two prevailing
+options. One was Red Hat's `devtoolset`. This used an older GCC version which
+statically linked the newer `libstdc++` parts into binaries, so that `libstdc++`
+updates were not necessary on end user systems. The other was to build GCC
+ourselves, and to ship the newer `libstdc++` library as a conda package. This
+was a community decision, and it was split roughly down the middle.
+
+In the end, the community decided to take the latter route, for the sake of
+greater control over updating to the latest toolchains, instead of having to
+rely on Red Hat. One major advantage of providing our own toolchain was that we
+could provide the toolchain as a conda package instead of a system dependency,
+so we could now express toolchain requirements in our recipes and have better
+control over compiler flags and behavior.
+
+The result of this overhaul crystallized in the `compiler(...)` Jinja function
+in `conda-build` 3.x [^conda-build-3] and the publication of the GCC 7 toolchain
+built from source in `defaults` [^anaconda-compilers]. `conda-build` 3.x also
+introduced dynamic pinning expressions that made it easier to maintain
+compatibility boundaries. ABI documentation from the ABI Laboratory [^abilab] helped
+establish whether a given package should be pinned to major, minor, or bugfix
+versions.
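To make that mechanism concrete, here is a minimal sketch of what a recipe can look like in a `conda-build` 3.x-style `meta.yaml`; the package name and version are illustrative, not taken from any real feedstock, while `compiler()` is the Jinja function named above and `pin_compatible()` is one of the dynamic pinning expressions:

```yaml
# Illustrative conda-build 3.x meta.yaml fragment.
# "mylib" is a hypothetical package used only for this sketch.
package:
  name: mylib
  version: "1.0.0"

requirements:
  build:
    # Resolves to the channel's compiler package for the target platform,
    # instead of relying on whatever compiler the build machine provides.
    - {{ compiler('c') }}
  host:
    - numpy
  run:
    # Dynamic pinning: constrain the run requirement to an ABI-compatible
    # range derived from the numpy version present at build time.
    - {{ pin_compatible('numpy') }}
```

Because the compiler is itself delivered as a conda package, a channel can roll out a new toolchain by updating its configuration rather than editing every recipe.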
## From `free` to `main` Here around 2017, Continuum renamed itself to Anaconda, so let's switch those names from here out. -As more and more conflicts with `free` channel packages occurred, conda-forge gradually added more and more of their own core dependency packages to avoid those breakages. At the same time, Anaconda was working on two contracts that would prove revolutionary. - -- Samsung wanted to use conda packages to manage their internal toolchains, and Ray suggested that this was complementary to our own internal needs to update our toolchain. Samsung's contract supported development to conda-build that greatly expanded its ability to support explicit variants of recipes. -- Intel was working on developing their own Python distribution at the time, which they based on Anaconda and added their accelerated math libraries and patches to. Part of the Intel contract was that Anaconda would move all of their internal recipes into public-facing GitHub repositories. - -Rather than putting another set of repositories (another set of changes to merge) in between internal and external sources, such as `conda-forge`, Michael and Ray pushed for a design where conda-forge would be the reference source of recipes. Anaconda would only carry local changes if they were not able to be incorporated into the conda-forge recipe for social, licensing, or technical reasons. The combination of these conda-forge based recipes and the new toolchain are what made up the `main` channel [^anaconda-5], which was also part of `defaults`. - -The `main` channel represented a major step forward in keeping conda-forge and Anaconda aligned, which equates to smooth operation and happy users. The joined recipe base and toolchain has sometimes been contentious, with conda-forge wanting to move faster than Anaconda or vice-versa. The end result has been a compromise between cutting-edge development and slower enterprise-focused development. 
+As more and more conflicts with `free` channel packages occurred, conda-forge
+gradually added more and more of their own core dependency packages to avoid
+those breakages. At the same time, Anaconda was working on two contracts that
+would prove revolutionary.
+
+- Samsung wanted to use conda packages to manage their internal toolchains, and
+  Ray suggested that this was complementary to our own internal needs to update
+  our toolchain. Samsung's contract supported development of conda-build that
+  greatly expanded its ability to support explicit variants of recipes. This
+  became the major new feature set released in conda-build 3.x.
+- Intel was working on developing their own Python distribution at the time,
+  which they based on Anaconda and added their accelerated math libraries and
+  patches to. Part of the Intel contract was that Anaconda would move all of their
+  internal recipes into public-facing GitHub repositories.
+
+Rather than putting another set of repositories (another set of changes to
+merge) in between internal and external sources, such as `conda-forge`, Michael
+and Ray pushed for a design where conda-forge would be the reference source of
+recipes. Anaconda would only carry local changes if they were not able to be
+incorporated into the conda-forge recipe for social, licensing, or technical
+reasons. The combination of these conda-forge-based recipes and the new
+toolchain is what made up the `main` channel [^anaconda-5], which was also part
+of `defaults`.
+
+The `main` channel represented a major step forward in keeping conda-forge and
+Anaconda aligned, which equates to smooth operation and happy users. The joint
+recipe base and toolchain have sometimes been contentious, with conda-forge
+wanting to move faster than Anaconda or vice versa. The end result has been a
+compromise between cutting-edge development and slower enterprise-focused
+development.
@@ -98,6 +255,8 @@ The `main` channel represented a major step forward in keeping conda-forge and A ## References +[^abilab]: [ABI Laboratory](https://abi-laboratory.pro/) + [^activepython]: https://www.activestate.com/platform/supported-languages/python/ [^anaconda-5]: https://www.anaconda.com/blog/announcing-the-release-of-anaconda-distribution-5-0 @@ -124,6 +283,8 @@ The `main` channel represented a major step forward in keeping conda-forge and A [^chatting-ocefpaf]: [Filipe Fernandes on the Evolution of conda-forge](https://www.youtube.com/watch?v=U2oa_RLbTVA), Chatting with the Conda Community #1, 2024. +[^conda-build-3]: [Conda-build 3](https://github.com/conda/conda-build/tree/3.0.0) + [^conda-changelog-1.0]: [`conda` 1.0 release notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. [^conda-recipes-repo]: [ContinuumIO/conda-recipes](https://github.com/conda-archive/conda-recipes) From 16cc1303cfbd7b336524f40a0b5457167dd2ce04 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Tue, 15 Apr 2025 12:56:02 +0200 Subject: [PATCH 29/33] fix links --- community/history.md | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/community/history.md b/community/history.md index fc07d52afc..2a904456e3 100644 --- a/community/history.md +++ b/community/history.md @@ -205,8 +205,8 @@ so we could now express toolchain requirements in our recipes and have better control over compiler flags and behavior. The result of this overhaul crystallized in the `compiler(...)` Jinja function -in [^conda-build-3] and the publication of the GCC 7 toolchain -built from source in `defaults` [^anaconda-compilers]. Conda-build 3 also +in `conda-build` 3.x [^conda-build-3] and the publication of the GCC 7 toolchain +built from source in `defaults` [^anaconda-compilers]. `conda-build` 3.x also introduced dynamic pinning expressions that made it easier to maintain compatibility boundaries. 
ABI documentation from [^abilab] helped establish whether a given package should be pinned to major, minor, or bugfix @@ -283,7 +283,7 @@ development. [^chatting-ocefpaf]: [Filipe Fernandes on the Evolution of conda-forge](https://www.youtube.com/watch?v=U2oa_RLbTVA), Chatting with the Conda Community #1, 2024. -[^conda-build-3]: [Conda-build 3](https://github.com/conda/conda-build/tree/3.0.0) +[^conda-build-3]: [`conda-build` 3](https://github.com/conda/conda-build/tree/3.0.0) [^conda-changelog-1.0]: [`conda` 1.0 release notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. From 6eaed629fbc3b8abec6eff096d2eb87be133df49 Mon Sep 17 00:00:00 2001 From: jaimergp Date: Tue, 15 Apr 2025 12:59:02 +0200 Subject: [PATCH 30/33] fold at 90 --- community/history.md | 441 ++++++++++++++++++++++--------------------- 1 file changed, 230 insertions(+), 211 deletions(-) diff --git a/community/history.md b/community/history.md index 2a904456e3..0ab99d4c87 100644 --- a/community/history.md +++ b/community/history.md @@ -8,244 +8,239 @@ title: History ## Context of binary packaging for Python -conda-forge's origins are best understood in the context of Python packaging -back in the early 2010s. Back then, the installation of Python packages across -operating systems was very challenging, especially on Windows, as it often meant -compiling dependencies from source. +conda-forge's origins are best understood in the context of Python packaging back in the +early 2010s. Back then, the installation of Python packages across operating systems was +very challenging, especially on Windows, as it often meant compiling dependencies from +source. Python 2.x was the norm. 
To install it, you'd get the official installers from -Python.org, use the system-provided interpreter in Linux, or resort to options -like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or -Enthought's distributions (EPD, later Canopy) [^enthought] in macOS and Windows +Python.org, use the system-provided interpreter in Linux, or resort to options like +Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's +distributions (EPD, later Canopy) [^enthought] in macOS and Windows [^legacy-python-downloads]. -If you wanted to install additional packages, the community was transitioning -from `easy_install` to `pip`, and there was no easy way to ship or install -pre-compiled Python packages. An alternative to the Python eggs [^eggs] used by -`easy_install` wouldn't emerge until 2013 with the formalization of wheels -[^wheels]. These were useful for Windows, where Christoph Gohlke's exes and -wheels [^cgohlke],[^cgohlke-shutdown] were your only choice. +If you wanted to install additional packages, the community was transitioning from +`easy_install` to `pip`, and there was no easy way to ship or install pre-compiled +Python packages. An alternative to the Python eggs [^eggs] used by `easy_install` +wouldn't emerge until 2013 with the formalization of wheels [^wheels]. These were useful +for Windows, where Christoph Gohlke's exes and wheels +[^cgohlke],[^cgohlke-shutdown] were your only choice. -However, for Linux, you would have to wait until 2016, when [`manylinux` wheels -were introduced](https://peps.python.org/pep-0513/). Before then, PyPI wouldn't -even allow compiled Linux wheels and your only alternative was to compile every -package from source. +However, for Linux, you would have to wait until 2016, when [`manylinux` wheels were +introduced](https://peps.python.org/pep-0513/). Before then, PyPI wouldn't even allow +compiled Linux wheels and your only alternative was to compile every package from +source. 
As an example, take a look at the [PyPI download page for `numpy` -1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The -"Built Distributions" section only shows a few `.exe` files for Windows (!), and -some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded -until April 2016. There was no mention whatsoever of macOS. Now compare it to -[`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in -March 2016: wheels for all platforms! - -The reason why it is hard to find packages for a specific system, and why -compilation was the preferred option for many, is [binary compatibility][abi]. -Binary compatibility is a window of compatibility where each combination of -compiler version, core libraries such as `glibc`, and dependency libraries -present on the build machine are compatible on destination systems. - -Linux distributions achieve this by freezing compiler versions and library -versions for a particular release cycle. Windows achieves this relatively easily -because Python standardized on particular Visual Studio compiler versions for -each Python release. Where a Windows package executable was reliably -redistributable across versions of Windows, so long as Python version was the -same, Linux presented a more difficult target because it was (and is) so much -harder to account for all of the little details that must line up. +1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The "Built +Distributions" section only shows a few `.exe` files for Windows (!), and some +`manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April +2016. There was no mention whatsoever of macOS. Now compare it to [`numpy` +1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels +for all platforms! 
+

The reason it was hard to find packages for a specific system, and why compilation
was the preferred option for many, is [binary compatibility][abi]. Binary compatibility
is the window within which a given combination of compiler version, core libraries
such as `glibc`, and dependency libraries present on the build machine remains
compatible with destination systems.

Linux distributions achieve this by freezing compiler versions and library versions for
a particular release cycle. Windows achieves this relatively easily because Python
standardized on particular Visual Studio compiler versions for each Python release.
Whereas a Windows package executable was reliably redistributable across versions of
Windows, so long as the Python version was the same, Linux presented a more difficult
target because it was (and is) so much harder to account for all of the little details
that must line up.

## The origins of `conda`

In 2012, Continuum Analytics announced Anaconda 0.8 at the SciPy conference
[^anaconda-history]. Anaconda was a distribution of scientifically-oriented packages,
but did not yet have tools for managing individual packages. Later that year, in
September, Continuum released `conda` 1.0, the cross-platform, language-agnostic package
manager for pre-compiled artifacts [^conda-changelog-1.0].
The motivation behind these +efforts was to provide an easy way to ship all the compiled libraries and Python +packages that users of the SciPy and NumPy stacks needed [^packaging-and-deployment-with-conda],[^lex-fridman-podcast]. -Travis Oliphant, on [Why I promote conda](https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html) (2013): +Travis Oliphant, on [Why I promote +conda](https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html) (2013): -> [...] at the first PyData meetup at Google HQ, where several of us asked Guido -what we can do to fix Python packaging for the NumPy stack. Guido's answer was -to "solve the problem ourselves". We at Continuum took him at his word. We -looked at dpkg, rpm, pip/virtualenv, brew, nixos, and 0installer, and used our -past experience with EPD [Enthought Python Distribution]. We thought hard about -the fundamental issues, and created the conda package manager and conda -environments. +> [...] at the first PyData meetup at Google HQ, where several of us asked Guido what we +> can do to fix Python packaging for the NumPy stack. Guido's answer was to "solve the +> problem ourselves". We at Continuum took him at his word. We looked at dpkg, rpm, +> pip/virtualenv, brew, nixos, and 0installer, and used our past experience with EPD +> [Enthought Python Distribution]. We thought hard about the fundamental issues, and +> created the conda package manager and conda environments. -Conda packages could not only ship pre-compiled Python packages across platforms -but were also agnostic enough to ship Python itself, as well as the underlying -shared libraries without having to statically vendor them. This was particularly -convenient for projects that relied on both compiled dependencies (e.g. C++ or -Fortran libraries) and Python "glue code". 
+Conda packages could not only ship pre-compiled Python packages across platforms but
+were also agnostic enough to ship Python itself, as well as the underlying shared
+libraries without having to statically vendor them. This was particularly convenient for
+projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and
+Python "glue code".

By June 2013, conda was using a SAT solver and included the `conda build` tool
-[^new-advances-in-conda] for community users outside of Continuum to build their
-own conda packages. This is also when the first Miniconda release and
-Binstar.org [^binstar], a site for hosting arbitrary user-built conda packages,
-were announced. Miniconda provided a minimal base environment that users could
-populate themselves, and Binstar.org gave any user an easy platform for
-redistributing their packages.
-
-:::note Did you know
-All of the conda tools have been free (BSD 3-clause) and Binstar/Anaconda.org was also free (as in beer, not open source), with some paid options on Binstar/Anaconda.org for more storage.
+[^new-advances-in-conda] for community users outside of Continuum to build their own
+conda packages. This is also when the first Miniconda release and Binstar.org
+[^binstar], a site for hosting arbitrary user-built conda packages, were announced.
+Miniconda provided a minimal base environment that users could populate themselves, and
+Binstar.org gave any user an easy platform for redistributing their packages.
+
+:::note
+
+Did you know? All of the conda tools have been free (BSD 3-clause) and
+Binstar/Anaconda.org was also free (as in beer, not open source), with some paid options
+on Binstar/Anaconda.org for more storage.
+
:::

-With `conda build` came along the concept of recipes [^early-conda-build-docs].
-The
-[`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes)
-repository became _the_ central place where people would contribute their conda
-recipes. 
This was separate from Anaconda's package recipes, which were private
-at this point. While successful, the recipes varied in quality, and typically
-only worked on one or two platforms. There was no CI for any recipes to help
-keep them working. It was common to find recipes that would no longer build, and
-you had to tweak it to get it to work.
+Along with `conda build` came the concept of recipes [^early-conda-build-docs]. The
+[`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes) repository
+became _the_ central place where people would contribute their conda recipes. This was
+separate from Anaconda's package recipes, which were private at this point. While
+successful, the recipes varied in quality, and typically only worked on one or two
+platforms. There was no CI for any recipes to help keep them working. It was common to
+find recipes that would no longer build, and you had to tweak them to get them to work.

-In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebranded as Anaconda Inc [^anaconda-rebrand].
+In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebranded as
+Anaconda Inc [^anaconda-rebrand].

## How conda-forge came to be

-By 2015, several institutes and groups were using Binstar/Anaconda.org to
-distribute software packages they used daily: the [Omnia Molecular
+By 2015, several institutes and groups were using Binstar/Anaconda.org to distribute
+software packages they used daily: the [Omnia Molecular
Dynamics](https://github.com/omnia-md) project started as early as March 2014
[^binstar-omnia], the UK Met Office-supported [SciTools
-project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], the
-[US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started
-using it in July 2014 [^binstar-ioos]. Although each channel was building conda
-packages, the binary compatibility between channels was unpredictable. 
-
-In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil
-Elson ([@pelson](https://github.com/pelson)) were maintaining the Binstar
-channels for IOOS and SciTools, respectively. Phil had [implemented CI
+project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], and the [US
+Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in
+July 2014 [^binstar-ioos]. Although each channel was building conda packages, the binary
+compatibility between channels was unpredictable.
+
+In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson
+([@pelson](https://github.com/pelson)) were maintaining the Binstar channels for IOOS
+and SciTools, respectively. Phil had [implemented CI
pipelines](https://github.com/SciTools/conda-recipes-scitools/blob/995fc231967719db0dd6321ba8a502390a2f192c/.travis.yml)
+and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda
+packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a
+healthy exchange of recipes between the two groups, often assisted by members of other
+communities. For example, Christophe Gohlke and David Cournapeau were instrumental in
+getting Windows builds of the whole SciPy stack to work on AppVeyor.
+
+There was a lot of cross-pollination between projects/channels, but projects were still
+working in separate repos, duplicating recipes, and with differing build toolchains (so
+mixing channels was unpredictable). Given the success of the `ContinuumIO/conda-recipes`
+repository, it became clear there was a demand for high-quality conda recipes and more
+efficient community-driven collaboration under a single umbrella [^talkpython-conda]. On
+April 11th, 2015, `conda-forge` was registered as a GitHub organization
+[^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge].

## Meanwhile at Continuum

-:::note Reminder
-Continuum Analytics is now Anaconda, but this article tries to keep the company name contemporaneous with the state of the world.
+:::note
+
+Reminder: Continuum Analytics is now Anaconda, but this article tries to keep the
+company name contemporaneous with the state of the world.
+
:::

-It's a little strange to describe Continuum's history here, but the company
-history is so deeply intertwined with conda-forge that it is essential for a
-complete story. During this time, Continuum (especially Ilan Schnell
-([@ilanschnell](https://github.com/ilanschnell))) was developing its own
-internal recipes for packages. Continuum's Linux toolchain at the time was based
-on CentOS 5 and GCC 4.8. These details matter, because they effectively set the
-compatibility bounds of the entire conda package ecosystem. 
- -The packages made from these internal recipes were available on the `free` -channel, which in turn was part of a metachannel named `defaults`. The -`defaults` channel made up the initial channel configuration for the Miniconda -and Anaconda installers. Concurrently, Aaron Meurer -([@asmeurer](https://github.com/asmeurer)) led the `conda` and `conda-build` -projects, contributed many recipes to the `conda-recipes` repository and built -many packages on his `asmeurer` binstar.org channel. - -Aaron left Continuum in late 2015, leaving the community side of the projects in -need of new leadership. Continuum hired Kale Franz -([@kalefranz](https://github.com/kalefranz)) to fill this role. Kale had huge -ambitions for conda, but `conda-build` was not as much of a priority for him. -Michael Sarahan ([@msarahan](https://github.com/msarahan)) stepped in to -maintain `conda-build`. +It's a little strange to describe Continuum's history here, but the company history is +so deeply intertwined with conda-forge that it is essential for a complete story. During +this time, Continuum (especially Ilan Schnell +([@ilanschnell](https://github.com/ilanschnell))) was developing its own internal +recipes for packages. Continuum's Linux toolchain at the time was based on CentOS 5 and +GCC 4.8. These details matter, because they effectively set the compatibility bounds of +the entire conda package ecosystem. + +The packages made from these internal recipes were available on the `free` channel, +which in turn was part of a metachannel named `defaults`. The `defaults` channel made up +the initial channel configuration for the Miniconda and Anaconda installers. +Concurrently, Aaron Meurer ([@asmeurer](https://github.com/asmeurer)) led the `conda` +and `conda-build` projects, contributed many recipes to the `conda-recipes` repository +and built many packages on his `asmeurer` binstar.org channel. 
+ +Aaron left Continuum in late 2015, leaving the community side of the projects in need of +new leadership. Continuum hired Kale Franz ([@kalefranz](https://github.com/kalefranz)) +to fill this role. Kale had huge ambitions for conda, but `conda-build` was not as much +of a priority for him. Michael Sarahan ([@msarahan](https://github.com/msarahan)) +stepped in to maintain `conda-build`. In 2016, Rich Signell at USGS connected Filipe and Phil with Travis Oliphant at Continuum, who assigned Michael Sarahan to be Continuum's representative in -`conda-forge`. Ray Donnelly ([@mingwandroid](https://github.com/mingwandroid)) -joined the team at Continuum soon afterwards, bringing extensive experience in -package managers and toolchains from his involvement in the MSYS2 project. +`conda-forge`. Ray Donnelly ([@mingwandroid](https://github.com/mingwandroid)) joined +the team at Continuum soon afterwards, bringing extensive experience in package managers +and toolchains from his involvement in the MSYS2 project. ## conda-build 3 and the new compiler toolchain -There was a period of time where conda-forge and Continuum worked together -closely, with conda-forge relying on Continuum to supply several core libraries. -In its infancy, the `conda-forge` channel had far fewer packages than the -`defaults` channel. conda-forge's reliance on `defaults` was partly to lower -conda-forge's maintenance burden and reduce duplicate work, but it also helped -keep mixtures of conda-forge and `defaults` channel packages working by reducing -possibility of divergence. Just as there were binary compatibility issues with -mixing packages from among the many Binstar channels, mixing packages from -`defaults` with `conda-forge` could be fragile and frustrating. - -Around this point in time, [GCC 5 arrived][gcc-5] with a breaking change in -`libstdc++`. These changes, among other compiler updates, began to make the -CentOS 5 toolchain troublesome. 
Cutting edge packages, such as the nascent
-TensorFlow project, required cumbersome patching to work with the older
-toolchain, if they worked at all.
+There was a period of time when conda-forge and Continuum worked together closely, with
+conda-forge relying on Continuum to supply several core libraries. In its infancy, the
+`conda-forge` channel had far fewer packages than the `defaults` channel. conda-forge's
+reliance on `defaults` was partly to lower conda-forge's maintenance burden and reduce
+duplicate work, but it also helped keep mixtures of conda-forge and `defaults` channel
+packages working by reducing the possibility of divergence. Just as there were binary
+compatibility issues with mixing packages from among the many Binstar channels, mixing
+packages from `defaults` with `conda-forge` could be fragile and frustrating.
+
+Around this point in time, [GCC 5 arrived][gcc-5] with a breaking change in `libstdc++`.
+These changes, among other compiler updates, began to make the CentOS 5 toolchain
+troublesome. Cutting-edge packages, such as the nascent TensorFlow project, required
+cumbersome patching to work with the older toolchain, if they worked at all.

There was strong pressure from the community to update the ecosystem (i.e. the
-toolchain, and implicitly everything built with it). There were two prevailing
-options. One was Red Hat's `devtoolset`. This used an older GCC version which
-statically linked the newer `libstdc++` parts into binaries, so that `libstdc++`
-updates were not necessary on end user systems. The other was to build GCC
-ourselves, and to ship the newer `libstdc++` library as a conda package. This
-was a community decision, and it was split roughly down the middle.
-
-In the end, the community decided to take the latter route, for the sake of
-greater control over updating to the latest toolchains, instead of having to
-rely on Red Hat. 
One major advantage of providing our own toolchain was that we -could provide the toolchain as a conda package instead of a system dependency, -so we could now express toolchain requirements in our recipes and have better -control over compiler flags and behavior. - -The result of this overhaul crystallized in the `compiler(...)` Jinja function -in `conda-build` 3.x [^conda-build-3] and the publication of the GCC 7 toolchain -built from source in `defaults` [^anaconda-compilers]. `conda-build` 3.x also -introduced dynamic pinning expressions that made it easier to maintain -compatibility boundaries. ABI documentation from [^abilab] helped -establish whether a given package should be pinned to major, minor, or bugfix -versions. +toolchain, and implicitly everything built with it). There were two prevailing options. +One was Red Hat's `devtoolset`. This used an older GCC version which statically linked +the newer `libstdc++` parts into binaries, so that `libstdc++` updates were not +necessary on end user systems. The other was to build GCC ourselves, and to ship the +newer `libstdc++` library as a conda package. This was a community decision, and it was +split roughly down the middle. + +In the end, the community decided to take the latter route, for the sake of greater +control over updating to the latest toolchains, instead of having to rely on Red Hat. +One major advantage of providing our own toolchain was that we could provide the +toolchain as a conda package instead of a system dependency, so we could now express +toolchain requirements in our recipes and have better control over compiler flags and +behavior. + +The result of this overhaul crystallized in the `compiler(...)` Jinja function in +`conda-build` 3.x [^conda-build-3] and the publication of the GCC 7 toolchain built from +source in `defaults` [^anaconda-compilers]. `conda-build` 3.x also introduced dynamic +pinning expressions that made it easier to maintain compatibility boundaries. 
ABI +documentation from [^abilab] helped establish whether a given package should be pinned +to major, minor, or bugfix versions. ## From `free` to `main` -Here around 2017, Continuum renamed itself to Anaconda, so let's switch those names from here out. - -As more and more conflicts with `free` channel packages occurred, conda-forge -gradually added more and more of their own core dependency packages to avoid -those breakages. At the same time, Anaconda was working on two contracts that -would prove revolutionary. - -- Samsung wanted to use conda packages to manage their internal toolchains, and - Ray suggested that this was complementary to our own internal needs to update - our toolchain. Samsung's contract supported development to conda-build that - greatly expanded its ability to support explicit variants of recipes. This - became the major new feature set released in conda-build 3.x. -- Intel was working on developing their own Python distribution at the time, - which they based on Anaconda and added their accelerated math libraries and - patches to. Part of the Intel contract was that Anaconda would move all of their - internal recipes into public-facing GitHub repositories. - -Rather than putting another set of repositories (another set of changes to -merge) in between internal and external sources, such as `conda-forge`, Michael -and Ray pushed for a design where conda-forge would be the reference source of -recipes. Anaconda would only carry local changes if they were not able to be -incorporated into the conda-forge recipe for social, licensing, or technical -reasons. The combination of these conda-forge based recipes and the new -toolchain are what made up the `main` channel [^anaconda-5], which was also part -of `defaults`. - -The `main` channel represented a major step forward in keeping conda-forge and -Anaconda aligned, which equates to smooth operation and happy users. 
The joined
-recipe base and toolchain has sometimes been contentious, with conda-forge
-wanting to move faster than Anaconda or vice-versa. The end result has been a
-compromise between cutting-edge development and slower enterprise-focused
-development.
+Here around 2017, Continuum renamed itself to Anaconda, so let's switch those names from
+here out.
+
+As more and more conflicts with `free` channel packages occurred, conda-forge gradually
+added more and more of their own core dependency packages to avoid those breakages. At
+the same time, Anaconda was working on two contracts that would prove revolutionary.
+
+- Samsung wanted to use conda packages to manage their internal toolchains, and Ray
+  suggested that this was complementary to our own internal needs to update our
+  toolchain. Samsung's contract supported development of conda-build that greatly
+  expanded its ability to support explicit variants of recipes. This became the major
+  new feature set released in conda-build 3.x.
+- Intel was working on developing their own Python distribution at the time, which they
+  based on Anaconda and added their accelerated math libraries and patches to. Part of
+  the Intel contract was that Anaconda would move all of their internal recipes into
+  public-facing GitHub repositories.
+
+Rather than putting another set of repositories (another set of changes to merge) in
+between internal and external sources, such as `conda-forge`, Michael and Ray pushed for
+a design where conda-forge would be the reference source of recipes. Anaconda would only
+carry local changes if they were not able to be incorporated into the conda-forge recipe
+for social, licensing, or technical reasons. The combination of these conda-forge based
+recipes and the new toolchain is what made up the `main` channel [^anaconda-5], which
+was also part of `defaults`. 
+ +The `main` channel represented a major step forward in keeping conda-forge and Anaconda +aligned, which equates to smooth operation and happy users. The joined recipe base and +toolchain has sometimes been contentious, with conda-forge wanting to move faster than +Anaconda or vice-versa. The end result has been a compromise between cutting-edge +development and slower enterprise-focused development. @@ -263,9 +258,12 @@ development. [^anaconda-compilers]: https://www.anaconda.com/blog/utilizing-the-new-compilers-in-anaconda-distribution-5 -[^anaconda-history]: [The Early History of the Anaconda Distribution](http://ilan.schnell-web.net/prog/anaconda-history/), Ilan Schnell, 2018. +[^anaconda-history]: [The Early History of the Anaconda + Distribution](http://ilan.schnell-web.net/prog/anaconda-history/), Ilan Schnell, + 2018. -[^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, 2017. +[^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, + 2017. [^binstar-conda-forge]: https://anaconda.org/conda-forge, 2015. @@ -275,43 +273,64 @@ development. [^binstar-scitools]: https://anaconda.org/scitools, 2014. -[^binstar]: [SciPy 2013 Lightning Talks, Thu June 27](https://youtu.be/ywHqIEv3xXg?list=PLYx7XA2nY5GeTWcUQTbXVdllyp-Ie3r-y&t=850). +[^binstar]: [SciPy 2013 Lightning Talks, Thu June + 27](https://youtu.be/ywHqIEv3xXg?list=PLYx7XA2nY5GeTWcUQTbXVdllyp-Ie3r-y&t=850). -[^cgohlke-shutdown]: [What to do when Gohlke's python wheel service shuts down?](https://stackoverflow.com/questions/72581592/what-to-do-when-gohlkes-python-wheel-service-shuts-down), 2022. +[^cgohlke-shutdown]: [What to do when Gohlke's python wheel service shuts + down?](https://stackoverflow.com/questions/72581592/what-to-do-when-gohlkes-python-wheel-service-shuts-down), + 2022. [^cgohlke]: https://www.cgohlke.com/, 2025. 
-[^chatting-ocefpaf]: [Filipe Fernandes on the Evolution of conda-forge](https://www.youtube.com/watch?v=U2oa_RLbTVA), Chatting with the Conda Community #1, 2024. +[^chatting-ocefpaf]: [Filipe Fernandes on the Evolution of + conda-forge](https://www.youtube.com/watch?v=U2oa_RLbTVA), Chatting with the Conda + Community #1, 2024. [^conda-build-3]: [`conda-build` 3](https://github.com/conda/conda-build/tree/3.0.0) -[^conda-changelog-1.0]: [`conda` 1.0 release notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. +[^conda-changelog-1.0]: [`conda` 1.0 release + notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), + 2012. [^conda-recipes-repo]: [ContinuumIO/conda-recipes](https://github.com/conda-archive/conda-recipes) -[^early-conda-build-docs]: [Conda build framework documentation](https://web.archive.org/web/20141006141927/http://conda.pydata.org/docs/build.html), 2014. +[^early-conda-build-docs]: [Conda build framework + documentation](https://web.archive.org/web/20141006141927/http://conda.pydata.org/docs/build.html), + 2014. -[^eggs]: [The Internal Structure of Python Eggs](https://setuptools.pypa.io/en/latest/deprecated/python_eggs.html). +[^eggs]: [The Internal Structure of Python + Eggs](https://setuptools.pypa.io/en/latest/deprecated/python_eggs.html). [^enthought]: https://docs.enthought.com/canopy/ [^github-api-conda-forge]: https://api.github.com/orgs/conda-forge -[^legacy-python-downloads]: [Download Python for Windows (legacy docs)](https://legacy.python.org/download/windows/). +[^legacy-python-downloads]: [Download Python for Windows (legacy + docs)](https://legacy.python.org/download/windows/). -[^lex-fridman-podcast]: [Travis Oliphant: NumPy, SciPy, Anaconda, Python & Scientific Programming](https://www.youtube.com/watch?v=gFEE3w7F0ww&t=7596s), Lex Fridman Podcast #224, 2022. 
+[^lex-fridman-podcast]: [Travis Oliphant: NumPy, SciPy, Anaconda, Python & Scientific + Programming](https://www.youtube.com/watch?v=gFEE3w7F0ww&t=7596s), Lex Fridman + Podcast #224, 2022. -[^new-advances-in-conda]: [New Advances in Conda](https://web.archive.org/web/20140331190645/http://continuum.io/blog/new-advances-in-conda), Ilan Schnell, 2013. +[^new-advances-in-conda]: [New Advances in + Conda](https://web.archive.org/web/20140331190645/http://continuum.io/blog/new-advances-in-conda), + Ilan Schnell, 2013. -[^packaging-and-deployment-with-conda]: [Packaging and deployment with conda](https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda), Travis Oliphant, 2013. +[^packaging-and-deployment-with-conda]: [Packaging and deployment with + conda](https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda), + Travis Oliphant, 2013. [^pythonxy]: https://python-xy.github.io/, 2015. -[^talkpython-conda]: [Guaranteed packages via Conda and Conda-Forge](https://talkpython.fm/episodes/show/94/guarenteed-packages-via-conda-and-conda-forge), 2016. +[^talkpython-conda]: [Guaranteed packages via Conda and + Conda-Forge](https://talkpython.fm/episodes/show/94/guarenteed-packages-via-conda-and-conda-forge), + 2016. -[^technical-discovery]: https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html, 2013. +[^technical-discovery]: https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html, + 2013. 
-[^wheels]: [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) +[^wheels]: [PEP 427 – The Wheel Binary Package Format + 1.0](https://peps.python.org/pep-0427/) From 2d96e5a9710446cecd65fe297e9b91a2a3c50eeb Mon Sep 17 00:00:00 2001 From: jaimergp Date: Tue, 15 Apr 2025 13:01:06 +0200 Subject: [PATCH 31/33] mention build/host/run split Co-authored-by: h-vetinari --- community/history.md | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/community/history.md b/community/history.md index 0ab99d4c87..df580e8667 100644 --- a/community/history.md +++ b/community/history.md @@ -202,12 +202,12 @@ toolchain as a conda package instead of a system dependency, so we could now exp toolchain requirements in our recipes and have better control over compiler flags and behavior. -The result of this overhaul crystallized in the `compiler(...)` Jinja function in -`conda-build` 3.x [^conda-build-3] and the publication of the GCC 7 toolchain built from -source in `defaults` [^anaconda-compilers]. `conda-build` 3.x also introduced dynamic -pinning expressions that made it easier to maintain compatibility boundaries. ABI -documentation from [^abilab] helped establish whether a given package should be pinned -to major, minor, or bugfix versions. +The result of this overhaul crystallized in the `compiler(...)` Jinja function and the +`build`/`host`/`run` dependencies split in `conda-build` 3.x [^conda-build-3], and the +publication of the GCC 7 toolchain built from source in `defaults` [^anaconda-compilers]. +`conda-build` 3.x also introduced dynamic pinning expressions that made it easier to +maintain compatibility boundaries. ABI documentation from [^abilab] helped establish +whether a given package should be pinned to major, minor, or bugfix versions. 
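+
+To make the `build`/`host`/`run` split concrete, a minimal recipe sketch might look as
+follows. This is an illustration only: the package name and dependencies are
+hypothetical, not taken from any real feedstock.

```yaml
# meta.yaml -- illustrative conda-build 3.x recipe (hypothetical package)
package:
  name: example-lib   # hypothetical name, for illustration only
  version: "1.0.0"

requirements:
  build:
    # tools that run on the build machine; the compiler(...) Jinja function
    # resolves to a platform-specific compiler package (e.g. the GCC 7
    # toolchain described above, shipped as an ordinary conda dependency)
    - {{ compiler('c') }}
    - make
  host:
    # libraries the package compiles and links against
    - zlib
  run:
    # runtime-only dependencies; pinned versions of host libraries are
    # typically injected automatically via run_exports
    - python
```

Because the toolchain is just another conda package here, recipes can express compiler
requirements explicitly instead of relying on whatever happens to be installed on the
build machine.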
## From `free` to `main` From a3430e3cd47518b79cf7036a50162467e703744b Mon Sep 17 00:00:00 2001 From: jaimergp Date: Tue, 15 Apr 2025 13:03:29 +0200 Subject: [PATCH 32/33] pre-commit --- community/history.md | 64 ++++++++++++++++++++++++-------------------- 1 file changed, 35 insertions(+), 29 deletions(-) diff --git a/community/history.md b/community/history.md index df580e8667..8ba5d3c860 100644 --- a/community/history.md +++ b/community/history.md @@ -34,8 +34,8 @@ source. As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in Feb 2013. The "Built Distributions" section only shows a few `.exe` files for Windows (!), and some -`manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April -2016. There was no mention whatsoever of macOS. Now compare it to [`numpy` +`manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April 2016. +There was no mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms! @@ -258,12 +258,11 @@ development and slower enterprise-focused development. [^anaconda-compilers]: https://www.anaconda.com/blog/utilizing-the-new-compilers-in-anaconda-distribution-5 -[^anaconda-history]: [The Early History of the Anaconda - Distribution](http://ilan.schnell-web.net/prog/anaconda-history/), Ilan Schnell, - 2018. +[^anaconda-history]: + [The Early History of the Anaconda + Distribution](http://ilan.schnell-web.net/prog/anaconda-history/), Ilan Schnell, 2018. -[^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, - 2017. +[^anaconda-rebrand]: https://www.anaconda.com/blog/continuum-analytics-officially-becomes-anaconda, 2017. [^binstar-conda-forge]: https://anaconda.org/conda-forge, 2015. @@ -273,63 +272,70 @@ development and slower enterprise-focused development. 
[^binstar-scitools]: https://anaconda.org/scitools, 2014. -[^binstar]: [SciPy 2013 Lightning Talks, Thu June +[^binstar]: + [SciPy 2013 Lightning Talks, Thu June 27](https://youtu.be/ywHqIEv3xXg?list=PLYx7XA2nY5GeTWcUQTbXVdllyp-Ie3r-y&t=850). -[^cgohlke-shutdown]: [What to do when Gohlke's python wheel service shuts - down?](https://stackoverflow.com/questions/72581592/what-to-do-when-gohlkes-python-wheel-service-shuts-down), - 2022. +[^cgohlke-shutdown]: + [What to do when Gohlke's python wheel service shuts + down?](https://stackoverflow.com/questions/72581592/what-to-do-when-gohlkes-python-wheel-service-shuts-down), 2022. [^cgohlke]: https://www.cgohlke.com/, 2025. -[^chatting-ocefpaf]: [Filipe Fernandes on the Evolution of +[^chatting-ocefpaf]: + [Filipe Fernandes on the Evolution of conda-forge](https://www.youtube.com/watch?v=U2oa_RLbTVA), Chatting with the Conda Community #1, 2024. [^conda-build-3]: [`conda-build` 3](https://github.com/conda/conda-build/tree/3.0.0) -[^conda-changelog-1.0]: [`conda` 1.0 release - notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), - 2012. +[^conda-changelog-1.0]: + [`conda` 1.0 release + notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. [^conda-recipes-repo]: [ContinuumIO/conda-recipes](https://github.com/conda-archive/conda-recipes) -[^early-conda-build-docs]: [Conda build framework - documentation](https://web.archive.org/web/20141006141927/http://conda.pydata.org/docs/build.html), - 2014. +[^early-conda-build-docs]: + [Conda build framework + documentation](https://web.archive.org/web/20141006141927/http://conda.pydata.org/docs/build.html), 2014. -[^eggs]: [The Internal Structure of Python +[^eggs]: + [The Internal Structure of Python Eggs](https://setuptools.pypa.io/en/latest/deprecated/python_eggs.html). 
[^enthought]: https://docs.enthought.com/canopy/ [^github-api-conda-forge]: https://api.github.com/orgs/conda-forge -[^legacy-python-downloads]: [Download Python for Windows (legacy +[^legacy-python-downloads]: + [Download Python for Windows (legacy docs)](https://legacy.python.org/download/windows/). -[^lex-fridman-podcast]: [Travis Oliphant: NumPy, SciPy, Anaconda, Python & Scientific +[^lex-fridman-podcast]: + [Travis Oliphant: NumPy, SciPy, Anaconda, Python & Scientific Programming](https://www.youtube.com/watch?v=gFEE3w7F0ww&t=7596s), Lex Fridman Podcast #224, 2022. -[^new-advances-in-conda]: [New Advances in +[^new-advances-in-conda]: + [New Advances in Conda](https://web.archive.org/web/20140331190645/http://continuum.io/blog/new-advances-in-conda), Ilan Schnell, 2013. -[^packaging-and-deployment-with-conda]: [Packaging and deployment with +[^packaging-and-deployment-with-conda]: + [Packaging and deployment with conda](https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda), Travis Oliphant, 2013. [^pythonxy]: https://python-xy.github.io/, 2015. -[^talkpython-conda]: [Guaranteed packages via Conda and - Conda-Forge](https://talkpython.fm/episodes/show/94/guarenteed-packages-via-conda-and-conda-forge), - 2016. +[^talkpython-conda]: + [Guaranteed packages via Conda and + Conda-Forge](https://talkpython.fm/episodes/show/94/guarenteed-packages-via-conda-and-conda-forge), 2016. -[^technical-discovery]: https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html, - 2013. +[^technical-discovery]: https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html, 2013. 
-[^wheels]: [PEP 427 – The Wheel Binary Package Format +[^wheels]: + [PEP 427 – The Wheel Binary Package Format 1.0](https://peps.python.org/pep-0427/) From 25cde72c9c0e672f094126554e1241e9e1e44fca Mon Sep 17 00:00:00 2001 From: jaimergp Date: Tue, 15 Apr 2025 13:08:40 +0200 Subject: [PATCH 33/33] one more link to conda-build docs --- community/history.md | 11 +++++++---- 1 file changed, 7 insertions(+), 4 deletions(-) diff --git a/community/history.md b/community/history.md index 8ba5d3c860..068f635c79 100644 --- a/community/history.md +++ b/community/history.md @@ -204,10 +204,11 @@ behavior. The result of this overhaul crystallized in the `compiler(...)` Jinja function and the `build`/`host`/`run` dependencies split in `conda-build` 3.x [^conda-build-3], and the -publication of the GCC 7 toolchain built from source in `defaults` [^anaconda-compilers]. -`conda-build` 3.x also introduced dynamic pinning expressions that made it easier to -maintain compatibility boundaries. ABI documentation from [^abilab] helped establish -whether a given package should be pinned to major, minor, or bugfix versions. +publication of the GCC 7 toolchain built from source in `defaults` +[^conda-build-compiler-docs],[^anaconda-compilers]. `conda-build` 3.x also +introduced dynamic pinning expressions that made it easier to maintain compatibility +boundaries. ABI documentation from [^abilab] helped establish whether a given package +should be pinned to major, minor, or bugfix versions. ## From `free` to `main` @@ -293,6 +294,8 @@ development and slower enterprise-focused development. [`conda` 1.0 release notes](https://github.com/conda/conda/blob/24.7.1/CHANGELOG.md#100-2012-09-06), 2012. +[^conda-build-compiler-docs]: https://docs.conda.io/projects/conda-build/en/latest/resources/compiler-tools.html#anaconda-compiler-tools + [^conda-recipes-repo]: [ContinuumIO/conda-recipes](https://github.com/conda-archive/conda-recipes) [^early-conda-build-docs]: