@@ -190,10 +190,10 @@ def __init__(self, sparkcontext, credentials, configuration_name='', bucket_name
in DSX - Notebooks by clicking on the datasources palette, then
choosing the datasource you want to access, then hitting insert credentials.

- cos_id [optional]: this parameter is the cloud object storage unique id. It is useful
- to keep in the class instance for further checks after the initialization. However,
- it is not mandatory for the class instance to work. This value can be retrieved by
- calling the get_os_id function .
+ configuration_name [optional]: string that identifies this configuration. You can
+ use any string you like; this allows you to create
+ multiple configurations for different Object Storage accounts.
+ If a configuration name is not passed, the default name "service" is used.

bucket_name (projectId in DSX) [optional]: string that identifies the default
bucket name you want to access files from in the COS service instance.
@@ -202,10 +202,6 @@ def __init__(self, sparkcontext, credentials, configuration_name='', bucket_name
If this value is not specified, you need to pass it when
you use the url function.

- NOTE: Hadoop configuration will be set with a service name equals to
- the value of cos_id. If cos_id is not set, the default service name "service"
- will be used.
-
'''
self.bucket_name = bucket_name
self.conf_name = configuration_name
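
For context, here is a minimal usage sketch of the constructor described above. The ibmos2spark import path, the credential keys, and the url() helper are assumptions inferred from the docstring, and all credential values are placeholders.

# Hypothetical usage sketch: the import path, credential keys, and the url()
# helper are assumptions inferred from the docstring; values are placeholders.
from pyspark import SparkContext
from ibmos2spark import CloudObjectStorage  # assumed import path

sc = SparkContext.getOrCreate()

credentials = {
    'endpoint': 'https://s3.example-endpoint.objectstorage.net',  # placeholder
    'api_key': 'MY_API_KEY',                                      # placeholder
    'service_id': 'MY_SERVICE_ID',                                # placeholder
}

# configuration_name is optional; if it is not passed, the default
# name "service" is used, as the docstring above states.
cos = CloudObjectStorage(sc, credentials,
                         configuration_name='my_cos_config',
                         bucket_name='my-bucket')

# bucket_name set in the constructor serves as the default bucket when
# building file URLs; otherwise it must be passed to the url function.
rdd = sc.textFile(cos.url('my_data.csv'))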