# History of conda-forge

conda-forge's origins are best understood in the context of Python packaging in the early 2010s. Back then, the installation of Python packages across operating systems was very challenging, especially on Windows, as it often meant compiling dependencies from source.

Python 2.x was the norm. To install it, you'd get the official installers from Python.org, use the system-provided interpreter in Linux, or resort to options like Python(x,y) [^pythonxy], ActiveState ActivePython [^activepython] or Enthought's distributions [^enthought] in macOS and Windows [^legacy-python-downloads].

If you wanted to install additional packages, you would find that the community was transitioning from `easy_install` to `pip`, and there was no easy way to ship or install pre-compiled Python packages. An alternative to Python eggs [^eggs] wouldn't emerge until 2013 with the formalization of wheels [^wheels]. These were useful for Windows, where Christoph Gohlke's wheels [^cgohlke]<sup>,</sup>[^cgohlke-shutdown] were your only choice.

However, for Linux, you would have to wait until 2016, when `manylinux` wheels were introduced. Before then, PyPI wouldn't even allow compiled Linux wheels and your only alternative was to compile every package from source.

As an example, take a look at the [PyPI download page for `numpy` 1.7.0](https://pypi.org/project/numpy/1.7.0/#files), released in February 2013. The built distributions section only shows a few `.exe` files for Windows (!), and some `manylinux1` wheels. However, the `manylinux1` wheels were not uploaded until April 2016. There was no mention whatsoever of macOS. Now compare it to [`numpy` 1.11.0](https://pypi.org/project/numpy/1.11.0/#files), released in March 2016: wheels for all platforms!

## The origins of `conda`

In 2012, Continuum Analytics announced Anaconda 0.8 at the SciPy conference [^anaconda-history]. Later that year, in September, Continuum released `conda` 1.0, the cross-platform, language-agnostic package manager for pre-compiled artifacts [^conda-changelog-1.0]. The motivation behind these efforts was to provide an easy way to ship all the compiled libraries and Python packages that users of the SciPy and NumPy stacks needed [^packaging-and-deployment-with-conda]<sup>,</sup>[^lex-fridman-podcast].

Travis Oliphant, on [Why I promote conda](https://technicaldiscovery.blogspot.com/2013/12/why-i-promote-conda.html) (2013):
> [...] at the first PyData meetup at Google HQ, where several of us asked Guido what we can do to fix Python packaging for the NumPy stack. Guido's answer was to "solve the problem ourselves". We at Continuum took him at his word. We looked at dpkg, rpm, pip/virtualenv, brew, nixos, and 0installer, and used our past experience with EPD. We thought hard about the fundamental issues, and created the conda package manager and conda environments.

Conda packages could not only ship pre-compiled Python packages across platforms but were also agnostic enough to ship Python itself, as well as the underlying shared libraries without having to statically vendor them. This was particularly convenient for projects that relied on both compiled dependencies (e.g. C++ or Fortran libraries) and Python "glue code".

By June 2013, conda was using a SAT solver and included the `conda build` subcommand [^new-advances-in-conda]. This is also when the first Miniconda release and Binstar.org [^binstar], the predecessor of the Anaconda.org channels, were announced. This meant that any user could build their software stack as conda packages and redistribute them online at no cost.

With `conda build` came the concept of recipes [^early-conda-build-docs]. The [`ContinuumIO/conda-recipes`](https://github.com/conda-archive/conda-recipes) repository became _the_ central place where people would contribute their conda recipes. While successful, the recipes varied in quality, and typically only worked on one or two platforms. It was common to find recipes that would no longer build, and you had to tweak them to get them to work.

In 2015, Binstar.org became Anaconda.org, and in 2017 Continuum Analytics rebranded as Anaconda Inc [^anaconda-rebrand].
## How conda-forge came to be

By 2015, several institutes and groups were using Binstar/Anaconda.org to distribute software packages they used daily: the [Omnia Molecular Dynamics](https://github.com/omnia-md) project started as early as March 2014 [^binstar-omnia], the UK Met Office supported [SciTools project](https://scitools.org.uk/) joined in June 2014 [^binstar-scitools], and the [US Integrated Ocean Observing System (IOOS)](http://www.ioos.noaa.gov/) started using it in July 2014 [^binstar-ioos].

In 2014, Filipe Fernandes ([@ocefpaf](https://github.com/ocefpaf)) and Phil Elson ([@pelson](https://github.com/pelson)) were maintaining the Binstar channels for IOOS and SciTools, respectively. Phil had [implemented CI pipelines](https://github.com/SciTools/conda-recipes-scitools/blob/995fc231967719db0dd6321ba8a502390a2f192c/.travis.yml) and [special tooling](https://github.com/conda-tools/conda-build-all) to build conda packages for SciTools efficiently, and Filipe borrowed it for IOOS. There was also a healthy exchange of recipes between the two groups, often assisted by members of other communities. For example, Christoph Gohlke and David Cournapeau were instrumental in getting Windows builds of the whole SciPy stack to work on AppVeyor.

It was a successful collaborative effort, but it was inefficient: the groups worked in separate repositories, duplicated recipes, and so on. Given the success of the `ContinuumIO/conda-recipes` repository, it became clear there was demand for high-quality conda recipes and more efficient collaboration under a single umbrella. On April 11th, 2015, `conda-forge` was registered as a GitHub organization [^github-api-conda-forge] and an Anaconda.org channel [^binstar-conda-forge].