@@ -133,7 +133,7 @@ within a DSX Jupyter notebook, you can obtain your account credentials in the fo
If your Object Storage was created with a Softlayer account, each part of the credentials will
be found as text that you can copy and paste into the example code below.

- ### IBM Cloud Object Storage / Data Science Experience
+ ### Softlayer Cloud Object Storage / Data Science Experience

``` scala
import com.ibm.ibmos2spark.CloudObjectStorage
@@ -161,6 +161,62 @@ var dfData1 = spark.
load(cos.url(bucketName, objectname))
```

+ ### Bluemix Cloud Object Storage / Data Science Experience
+ The class CloudObjectStorage also allows you to connect to Bluemix Cloud Object Storage (COS).
+ You can connect to Bluemix using API keys as follows:
+ ``` scala
+ import com.ibm.ibmos2spark.CloudObjectStorage
+
+ // The credentials HashMap may be created for you with the
+ // "insert to code" link in your DSX notebook.
+
+ var credentials = scala.collection.mutable.HashMap[String, String](
+     "endPoint" -> "xxx",
+     "apiKey" -> "xxx",
+     "serviceId" -> "xxx"
+ )
+ var bucketName = "myBucket"
+ var objectname = "mydata.csv"
+
+ var configurationName = "cos_config_name" // you can choose any string you want
+ var cos = new CloudObjectStorage(sc, credentials, configurationName, "bluemix_cos")
+ var spark = SparkSession.
+     builder().
+     getOrCreate()
+
+ var dfData1 = spark.
+     read.format("org.apache.spark.sql.execution.datasources.csv.CSVFileFormat").
+     option("header", "true").
+     option("inferSchema", "true").
+     load(cos.url(bucketName, objectname))
+ ```
+ Alternatively, you can connect to Bluemix COS using an IAM token. Example:
+ ``` scala
+ import com.ibm.ibmos2spark.CloudObjectStorage
+
+ // The credentials HashMap may be created for you with the
+ // "insert to code" link in your DSX notebook.
+
+ var credentials = scala.collection.mutable.HashMap[String, String](
+     "endPoint" -> "xxx",
+     "iamToken" -> "xxx",
+     "serviceId" -> "xxx"
+ )
+ var bucketName = "myBucket"
+ var objectname = "mydata.csv"
+
+ var configurationName = "cos_config_name" // you can choose any string you want
+ var cos = new CloudObjectStorage(sc, credentials, configurationName, "bluemix_cos", "iam_token")
+ var spark = SparkSession.
+     builder().
+     getOrCreate()
+
+ var dfData1 = spark.
+     read.format("org.apache.spark.sql.execution.datasources.csv.CSVFileFormat").
+     option("header", "true").
+     option("inferSchema", "true").
+     load(cos.url(bucketName, objectname))
+ ```
### Bluemix / Data Science Experience