|[boto](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/aws/aws.ipynb#Boto)| Official AWS SDK for Python. |
|[s3cmd](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/aws/aws.ipynb#s3cmd)| Interacts with S3 through the command line. |
|[s3distcp](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/aws/aws.ipynb#s3distcp)| Combines and aggregates smaller files by taking in a pattern and target file. S3DistCp can also be used to transfer large volumes of data from S3 to your Hadoop cluster. |
|[s3-parallel-put](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/aws/aws.ipynb#s3-parallel-put)| Uploads multiple files to S3 in parallel. |
|[redshift](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/aws/aws.ipynb#redshift)| Acts as a fast data warehouse built on massively parallel processing (MPP) technology. |
|[kinesis](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/aws/aws.ipynb#kinesis)| Streams data in real time with the ability to process thousands of data streams per second. |
|[lambda](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/aws/aws.ipynb#lambda)| Runs code in response to events, automatically managing compute resources. |
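As a quick illustration of the kind of S3 call the boto notebook covers, uploading a file can be sketched with boto3, the SDK's successor library. This is a hypothetical helper, not code from the notebook; the bucket and key names are placeholders.

```python
# Sketch only: assumes the boto3 SDK is installed and AWS credentials are
# configured. Bucket and key names below are hypothetical placeholders.
def upload_to_s3(path, bucket, key):
    """Upload a local file to S3 via the boto3 client API."""
    import boto3  # imported lazily so the sketch can be read without boto3

    s3 = boto3.client("s3")
    s3.upload_file(path, bucket, key)

# Usage (not run here):
# upload_to_s3("report.csv", "my-example-bucket", "reports/report.csv")
```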
|[titanic](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/kaggle/titanic.ipynb)| Predicts survival on the Titanic. Demonstrates data cleaning, exploratory data analysis, and machine learning. |
|[churn-analysis](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/analyses/churn.ipynb)| Predicts customer churn. Exercises logistic regression, gradient boosting classifiers, support vector machines, random forests, and k-nearest neighbors. Discusses confusion matrices, ROC plots, feature importances, prediction probabilities, and calibration/discrimination. |
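The churn notebook's core loop — fit a classifier, then inspect a confusion matrix — can be sketched with scikit-learn. This uses a synthetic stand-in for the notebook's real churn dataset.

```python
# Sketch only: synthetic data in place of the notebook's churn dataset.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

# Fake "churn" data: 500 customers, 10 features, binary churned/retained label.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
cm = confusion_matrix(y_test, clf.predict(X_test))
print(cm)  # 2x2 matrix: rows = true class, columns = predicted class
```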
|[validation](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/scikit-learn/scikit-learn-validation.ipynb)| Validation and model selection. |
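The central tool of the validation notebook, cross-validated scoring, looks roughly like this — a minimal sketch on toy data with hypothetical settings, not the notebook's own code.

```python
# Sketch only: cross-validated model scoring on a built-in toy dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# 5-fold cross-validation: each fold is held out once for scoring.
scores = cross_val_score(KNeighborsClassifier(n_neighbors=5), X, y, cv=5)
print(scores.mean())  # average held-out accuracy across the 5 folds
```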
<br/>
<p align="center">
  <img src="http://i.imgur.com/ZhKXrKZ.png">
</p>
## deep-learning
IPython Notebook(s) demonstrating deep learning functionality.
|[deep dream](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/deep-learning/deep-dream/dream.ipynb)| Caffe-based computer vision program which uses a convolutional neural network to find and enhance patterns in images. |
|[ts-not-mnist](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/deep-learning/tensor-flow-exercises/1_notmnist.ipynb)| Learn simple data curation by creating a pickle with formatted datasets for training, development, and testing in TensorFlow. |
|[ts-fully-connected](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/deep-learning/tensor-flow-exercises/2_fullyconnected.ipynb)| Progressively train deeper and more accurate models using logistic regression and neural networks in TensorFlow. |
|[ts-regularization](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/deep-learning/tensor-flow-exercises/3_regularization.ipynb)| Explore regularization techniques by training fully connected networks to classify notMNIST characters in TensorFlow. |
|[ts-convolutions](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/deep-learning/tensor-flow-exercises/4_convolutions.ipynb)| Create convolutional neural networks in TensorFlow. |
|[ts-word2vec](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/deep-learning/tensor-flow-exercises/5_word2vec.ipynb)| Train a skip-gram model over Text8 data in TensorFlow. |
|[ts-lstm](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/deep-learning/tensor-flow-exercises/6_lstm.ipynb)| Train an LSTM character model over Text8 data in TensorFlow. |
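To give a flavor of the word2vec exercise: generating the (center, context) training pairs that feed a skip-gram model — the step before any TensorFlow training — can be sketched in plain Python. This helper is illustrative only, not the notebook's implementation.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs for skip-gram training."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "quick", "brown", "fox"], window=1))
# -> [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#     ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```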
|[unit tests](http://nbviewer.ipython.org/github/donnemartin/data-science-ipython-notebooks/blob/master/python-data/unit_tests.ipynb)| Nose unit tests. |
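The unit tests notebook uses nose, which discovers and runs plain `test_`-prefixed functions. A minimal example of that style (the `add` function here is hypothetical):

```python
def add(a, b):
    return a + b

# nose (or pytest) will collect and run any function named test_*.
def test_add():
    assert add(2, 3) == 5
    assert add(-1, 1) == 0

if __name__ == "__main__":
    test_add()
    print("ok")
```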